THESIS STATEMENT FOR WHOLE ARTICLE

Issues of tradition vs. egalitarianism in US English are at root political issues and can be effectively addressed only in what this article hereby terms a “Democratic Spirit.” A Democratic Spirit is one that combines rigor and humility, i.e., passionate conviction plus a sedulous respect for the convictions of others. As any American knows, this is a difficult spirit to cultivate and maintain, particularly when it comes to issues you feel strongly about. Equally tough is a DS’s criterion of 100 percent intellectual integrity — you have to be willing to look honestly at yourself and at your motives for believing what you believe, and to do it more or less continually.

This kind of stuff is advanced US citizenship. A true Democratic Spirit is up there with religious faith and emotional maturity and all those other top-of-the-Maslow-Pyramid-type qualities that people spend their whole lives working on. A Democratic Spirit’s constituent rigor and humility and self-honesty are, in fact, so hard to maintain on certain issues that it’s almost irresistibly tempting to fall in with some established dogmatic camp and to follow that camp’s line on the issue and to let your position harden within the camp and become inflexible and to believe that the other camps 9 are either evil or insane and to spend all your time and energy trying to shout over them.

I submit, then, that it is indisputably easier to be Dogmatic than Democratic, especially about issues that are both vexed and highly charged. I submit further that the issues surrounding “correctness” in contemporary American usage are both vexed and highly charged, and that the fundamental questions they involve are ones whose answers have to be literally worked out instead of merely found.

A distinctive feature of ADMAU is that its author is willing to acknowledge that a usage dictionary is not a bible or even a textbook but rather just the record of one bright person’s attempts to work out answers to certain very difficult questions. This willingness appears to me to be informed by a Democratic Spirit. The big question is whether such a spirit compromises Bryan Garner’s ability to present himself as a genuine “authority” on issues of usage. Assessing Garner’s book, then, requires us to trace out the very weird and complicated relationship between Authority and Democracy in what we as a culture have decided is English. That relationship is, as many educated Americans would say, still in process at this time.


A Dictionary of Modern American Usage has no Editorial Staff or Distinguished Panel. It’s been conceived, researched, and written ab ovo usque ad mala by Mr. Bryan A. Garner. This Garner is an interesting guy. He’s both a lawyer and a usage expert (which seems a bit like being both a narcotics wholesaler and a DEA agent). His 1987 A Dictionary of Modern Legal Usage is already a minor classic; and now, instead of practicing law anymore, he goes around conducting writing seminars for JDs and doing prose-consulting for various judicial bodies. Garner’s also the founder of something called the H. W. Fowler Society, 10 a worldwide group of usage Trekkies who like to send one another linguistic boners clipped from different periodicals. You get the idea. This Garner is one serious and very hard-core SNOOT.

The lucid, engaging, and extremely sneaky preface to ADMAU serves to confirm Garner’s SNOOTitude in fact while undercutting it in tone. For one thing, whereas the traditional usage pundit cultivates a remote and imperial persona — the kind who uses one or we to refer to himself — Garner gives us an almost Waltonishly endearing sketch of his own background:


I realized early — at the age of 15 [11] — that my primary intellectual interest was the use of the English language…. It became an all-consuming passion…. I read everything I could find on the subject. Then, on a wintry evening while visiting New Mexico at the age of 16, I discovered Eric Partridge’s Usage and Abusage. I was enthralled. Never had I held a more exciting book…. Suffice it to say that by the time I was 18, I had committed to memory most of Fowler, Partridge, and their successors.


Although this reviewer regrets the bio-sketch’s failure to mention the rather significant social costs of being an adolescent whose overriding passion is English usage, 12 the critical hat is off to yet another personable preface-section, one that Garner entitles “First Principles”: “Before going any further, I should explain my approach. That’s an unusual thing for the author of a usage dictionary to do — unprecedented, as far as I know. But a guide to good writing is only as good as the principles on which it’s based. And users should be naturally interested in those principles. So, in the interests of full disclosure …” 13

The “unprecedented” and “full disclosure” here are actually good-natured digs at Garner’s Fowlerite predecessors, and a slight nod to one camp in the wars that have raged in both lexicography and education ever since the notoriously liberal Webster’s Third New International Dictionary came out in 1961 and included terms like heighth and irregardless without any monitory labels on them. You can think of Webster’s Third as sort of the Fort Sumter of the contemporary Usage Wars. These wars are both the context and the target of a very subtle rhetorical strategy in A Dictionary of Modern American Usage, and without talking about them it’s impossible to explain why Garner’s book is both so good and so sneaky.

We regular citizens tend to go to The Dictionary for authoritative guidance. 14 Rarely, however, do we ask ourselves who exactly decides what gets in The Dictionary or what words or spellings or pronunciations get deemed substandard or incorrect. Whence the authority of dictionary-makers to decide what’s OK and what isn’t? Nobody elected them, after all. And simply appealing to precedent or tradition won’t work, because what’s considered correct changes over time. In the 1600s, for instance, the second-singular took a singular conjugation—“You is.” Earlier still, the standard 2-S pronoun wasn’t you but thou. Huge numbers of now-acceptable words like clever, fun, banter, and prestigious entered English as what usage authorities considered errors or egregious slang. And not just usage conventions but English itself changes over time; if it didn’t, we’d all still be talking like Chaucer. Who’s to say which changes are natural and good and which are corruptions? And when Bryan Garner or E. Ward Gilman do in fact presume to say, why should we believe them?

These sorts of questions are not new, but they do now have a certain urgency. America is in the midst of a protracted Crisis of Authority in matters of language. In brief, the same sorts of political upheavals that produced everything from Kent State to Independent Counsels have produced an influential contra-SNOOT school for whom normative standards of English grammar and usage are functions of nothing but custom and the ovine docility of a populace that lets self-appointed language experts boss them around. See for example MIT’s Steven Pinker in a famous New Republic article—“Once introduced, a prescriptive rule is very hard to eradicate, no matter how ridiculous. Inside the writing establishment, the rules survive by the same dynamic that perpetuates ritual genital mutilations”—or, at a somewhat lower emotional pitch, Bill Bryson in Mother Tongue: English and How It Got That Way:


Who sets down all those rules that we know about from childhood — the idea that we must never end a sentence with a preposition or begin one with a conjunction, that we must use each other for two things and one another for more than two …? The answer, surprisingly often, is that no one does, that when you look into the background of these “rules” there is often little basis for them.


In ADMAU’s preface, Garner himself addresses the Authority question with a Trumanesque simplicity and candor that simultaneously disguise the author’s cunning and exemplify it:


As you might already suspect, I don’t shy away from making judgments. I can’t imagine that most readers would want me to. Linguists don’t like it, of course, because judgment involves subjectivity. [15] It isn’t scientific. But rhetoric and usage, in the view of most professional writers, [16] aren’t scientific endeavors. You [17] don’t want dispassionate descriptions; you want sound guidance. And that requires judgment.


Whole monographs could be written just on the masterful rhetoric of this passage. Besides the FN 16 stuff, note for example the ingenious equivocation of judgment, which in “I don’t shy away from making judgments” means actual rulings (and thus invites questions about Authority), but in “And that requires judgment” refers instead to perspicacity, discernment, reason. As the body of ADMAU makes clear, part of Garner’s overall strategy is to collapse these two different senses of judgment, or rather to use the second sense as a justification for the first. The big things to recognize here are (1) that Garner wouldn’t be doing any of this if he weren’t keenly aware of the Authority Crisis in modern usage, and (2) that his response to this crisis is — in the best Democratic Spirit — rhetorical.


So …

COROLLARY TO THESIS STATEMENT FOR WHOLE ARTICLE

The most salient and timely feature of Bryan A. Garner’s dictionary is that its project is both lexicographical and rhetorical. Its main strategy involves what is known in classical rhetoric as the Ethical Appeal. Here the adjective, derived from the Greek ēthos, doesn’t mean quite what we usually mean by ethical. But there are affinities. What the Ethical Appeal amounts to is a complex and sophisticated “Trust me.” It’s the boldest, most ambitious, and also most democratic of rhetorical Appeals because it requires the rhetor to convince us not just of his intellectual acuity or technical competence but of his basic decency and fairness and sensitivity to the audience’s own hopes and fears. 18

These latter are not qualities one associates with the traditional SNOOT usage-authority, a figure who for many Americans exemplifies snobbishness and anality, and one whose modern image is not helped by stuff like The American Heritage Dictionary’s Distinguished Usage Panelist Morris Bishop’s “The arrant solecisms of the ignoramus are here often omitted entirely, ‘irregardless’ of how he may feel about this neglect” or critic John Simon’s “The English language is being treated nowadays exactly as slave traders once handled their merchandise.” Compare those lines’ authorial personas with Garner’s in, e.g., “English usage is so challenging that even experienced writers need guidance now and then.”

The thrust here is going to be that A Dictionary of Modern American Usage earns Garner pretty much all the trust his Ethical Appeal asks us for. What’s interesting is that this trust derives not so much from the book’s lexicographical quality as from the authorial persona and spirit it cultivates. ADMAU is a feel-good usage dictionary in the very best sense of feel-good. The book’s spirit marries rigor and humility in such a way as to let Garner be extremely prescriptive without any appearance of evangelism or elitist put-down. This is an extraordinary accomplishment. Understanding why it’s basically a rhetorical accomplishment, and why this is both historically significant and (in this reviewer’s opinion) politically redemptive, requires a more detailed look at the Usage Wars.


You’d definitely know that lexicography had an underbelly if you read the different little introductory essays in modern dictionaries — pieces like Webster’s DEU’s “A Brief History of English Usage” or Webster’s Third’s “Linguistic Advances and Lexicography” or AHD-2’s “Good Usage, Bad Usage, and Usage” or AHD-3’s “Usage in the Dictionary: The Place of Criticism.” But almost nobody ever bothers with these little intros, and it’s not just their six-point type or the fact that dictionaries tend to be hard on the lap. It’s that these intros aren’t actually written for you or me or the average citizen who goes to The Dictionary just to see how to spell (for instance) meringue. They’re written for other lexicographers and critics; and in fact they’re not really introductory at all, but polemical. They’re salvos in the Usage Wars that have been under way ever since editor Philip Gove first sought to apply the value-neutral principles of structural linguistics to lexicography in Webster’s Third. Gove’s now-famous response to conservatives who howled 19 when W3 endorsed OK and described ain’t as “used colloquially by educated speakers in many regions of the United States” was this: “A dictionary should have no truck with artificial notions of correctness or superiority. It should be descriptive and not prescriptive.” Gove’s terms stuck and turned epithetic, and linguistic conservatives are now formally known as Prescriptivists and linguistic liberals as Descriptivists.

The former are better known, though not because of dictionaries’ prologues or scholarly Fowlerites. When you read the columns of William Safire or Morton Freeman or books like Edwin Newman’s Strictly Speaking or John Simon’s Paradigms Lost, you’re actually reading Popular Prescriptivism, a genre sideline of certain journalists (mostly older males, the majority of whom actually do wear bow ties 20) whose bemused irony often masks a Colonel Blimp’s rage at the way the beloved English of their youth is being trashed in the decadent present. Some Pop Prescriptivism is funny and smart, though much of it just sounds like old men grumbling about the vulgarity of modern mores. 21 And some PP is offensively small-minded and knuckle-dragging, such as Paradigms Lost’s simplistic dismissal of Standard Black English: “As for ‘I be,’ ‘you be,’ ‘he be,’ etc., which should give us all the heebie-jeebies, these may indeed be comprehensible, but they go against all accepted classical and modern grammars and are the product not of a language with its roots in history but of ignorance of how a language works.” But what’s really interesting is that the plutocratic tone and styptic wit of Newman and Safire and the best of the Pop Prescriptivists are modeled after the mandarin-Brit personas of Eric Partridge and H. W. Fowler, the same twin towers of scholarly Prescriptivism whom Garner talks about revering as a kid. 22

Descriptivists, on the other hand, don’t have weekly columns in the Times. These guys tend to be hard-core academics, mostly linguists or Comp theorists. Loosely organized under the banner of structural (or “descriptive”) linguistics, they are doctrinaire positivists who have their intellectual roots in Comte and Saussure and L. Bloomfield 23 and their ideological roots firmly in the US Sixties. The brief explicit mention Garner’s preface gives this crew —


Somewhere along the line, though, usage dictionaries got hijacked by the descriptive linguists, [24] who observe language scientifically. For the pure descriptivist, it’s impermissible to say that one form of language is any better than another: as long as a native speaker says it, it’s OK — and anyone who takes a contrary stand is a dunderhead…. Essentially, descriptivists and prescriptivists are approaching different problems. Descriptivists want to record language as it’s actually used, and they perform a useful function — although their audience is generally limited to those willing to pore through vast tomes of dry-as-dust research. [25]


— is disingenuous in the extreme, especially the “approaching different problems” part, because it vastly underplays the Descriptivists’ influence on US culture. For one thing, Descriptivism so quickly and thoroughly took over English education in this country that just about everybody who started junior high after c. 1970 has been taught to write Descriptively — via “freewriting,” “brainstorming,” “journaling”—a view of writing as self-exploratory and -expressive rather than as communicative, an abandonment of systematic grammar, usage, semantics, rhetoric, etymology. For another thing, the very language in which today’s socialist, feminist, minority, gay, and environmental movements frame their sides of political debates is informed by the Descriptivist belief that traditional English is conceived and perpetuated by Privileged WASP Males 26 and is thus inherently capitalist, sexist, racist, xenophobic, homophobic, elitist: unfair. Think Ebonics. Think Proposition 227. Think of the involved contortions people undergo to avoid using he as a generic pronoun, or of the tense, deliberate way white males now adjust their vocabularies around non-w.m.’s. Think of the modern ubiquity of spin or of today’s endless rows over just the names of things—“Affirmative Action” vs. “Reverse Discrimination,” “Pro-Life” vs. “Pro-Choice,” *“Undocumented Worker” vs. “Illegal Alien,” “Perjury” vs. “Peccadillo,” and so on.

*INTERPOLATION


EXAMPLE OF THE APPLICATION OF WHAT THIS ARTICLE’S THESIS STATEMENT CALLS A DEMOCRATIC SPIRIT TO A HIGHLY CHARGED POLITICAL ISSUE, WHICH EXAMPLE IS MORE RELEVANT TO GARNER’S ADMAU THAN IT MAY INITIALLY APPEAR

In this reviewer’s opinion, the only really coherent position on the abortion issue is one that is both Pro-Life and Pro-Choice.

Argument: As of 4 March 1999, the question of defining human life in utero is hopelessly vexed. That is, given our best present medical and philosophical understandings of what makes something not just a living organism but a person, there is no way to establish at just what point during gestation a fertilized ovum becomes a human being. This conundrum, together with the basically inarguable soundness of the principle “When in irresolvable doubt about whether something is a human being or not, it is better not to kill it,” appears to me to require any reasonable American to be Pro-Life. At the same time, however, the principle “When in irresolvable doubt about something, I have neither the legal nor the moral right to tell another person what to do about it, especially if that person feels that s/he is not in doubt” is an unassailable part of the Democratic pact we Americans all make with one another, a pact in which each adult citizen gets to be an autonomous moral agent; and this principle appears to me to require any reasonable American to be Pro-Choice.

This reviewer is thus, as a private citizen and an autonomous agent, both Pro-Life and Pro-Choice. It is not an easy or comfortable position to maintain. Every time someone I know decides to terminate a pregnancy, I am required to believe simultaneously that she is doing the wrong thing and that she has every right to do it. Plus, of course, I have both to believe that a Pro-Life + Pro-Choice stance is the only really coherent one and to restrain myself from trying to force that position on other people whose ideological or religious convictions seem (to me) to override reason and yield a (in my opinion) wacko dogmatic position. This restraint has to be maintained even when somebody’s (to me) wacko dogmatic position appears (to me) to reject the very Democratic tolerance that is keeping me from trying to force my position on him/her; it requires me not to press or argue or retaliate even when somebody calls me Satan’s Minion or Just Another Shithead Male, which forbearance represents the really outer and tooth-grinding limits of my own personal Democratic Spirit. Wacko name-calling notwithstanding, I have encountered only one serious kind of objection to this Pro-Life + Pro-Choice position. But it’s a powerful objection. It concerns not my position per se but certain facts about me, the person who’s developed and maintained it. If this sounds to you both murky and extremely remote from anything having to do with American usage, I promise that it becomes almost excruciatingly clear and relevant below.


The Descriptivist revolution takes a little time to unpack, but it’s worth it. The structural linguists’ rejection of conventional usage rules in English depends on two main kinds of argument. The first is academic and methodological. In this age of technology, some Descriptivists contend, it’s the scientific method — clinically objective, value-neutral, based on direct observation and demonstrable hypothesis — that should determine both the content of dictionaries and the standards of “correct” English. Because language is constantly evolving, such standards will always be fluid. Philip Gove’s now-classic introduction to Webster’s Third outlines this type of Descriptivism’s five basic edicts: “1—Language changes constantly; 2—Change is normal; 3—Spoken language is the language; 4—Correctness rests upon usage; 5—All usage is relative.”

These principles look prima facie OK — simple, commonsensical, and couched in the bland s.-v.-o. prose of dispassionate science — but in fact they’re vague and muddled and it takes about three seconds to think of reasonable replies to each one of them, viz.:

1—All right, but how much and how fast?

2—Same thing. Is Heraclitean flux as normal or desirable as gradual change? Do some changes serve the language’s overall pizzazz better than others? And how many people have to deviate from how many conventions before we say the language has actually changed? Fifty percent? Ten percent? Where do you draw the line? Who draws the line?

3—This is an old claim, at least as old as Plato’s Phaedrus. And it’s specious. If Derrida and the infamous Deconstructionists have done nothing else, they’ve successfully debunked the idea that speech is language’s primary instantiation. 27 Plus consider the weird arrogance of Gove’s (3) with respect to correctness. Only the most mullah-like Prescriptivists care all that much about spoken English; most Prescriptive usage guides concern Standard Written English. 28

4—Fine, but whose usage? Gove’s (4) begs the whole question. What he wants to suggest here, I think, is a reversal of the traditional entailment-relation between abstract rules and concrete usage: instead of usage’s ideally corresponding to a rigid set of regulations, the regulations ought to correspond to the way real people are actually using the language. Again, fine, but which people? Urban Latinos? Boston Brahmins? Rural Midwesterners? Appalachian Neogaelics?

5—Huh? If this means what it seems to mean, then it ends up biting Gove’s whole argument in the ass. Principle (5) appears to imply that the correct answer to the above “which people?” is: All of them. And it’s easy to show why this will not stand up as a lexicographical principle. The most obvious problem with it is that not everything can go in The Dictionary. Why not? Well, because you can’t actually observe and record every last bit of every last native speaker’s “language behavior,” and even if you could, the resultant dictionary would weigh four million pounds and need to be updated hourly. 29 The fact is that any real lexicographer is going to have to make choices about what gets in and what doesn’t. And these choices are based on … what? And so we’re right back where we started.

It is true that, as a SNOOT, I am naturally predisposed to look for flaws in Gove et al.’s methodological argument. But these flaws still seem awfully easy to find. Probably the biggest one is that the Descriptivists’ “scientific lexicography”—under which, keep in mind, the ideal English dictionary is basically number-crunching: you somehow observe every linguistic act by every native/naturalized speaker of English and put the sum of all these acts between two covers and call it The Dictionary — involves an incredibly crude and outdated understanding of what scientific means. It requires a naive belief in scientific Objectivity, for one thing. Even in the physical sciences, everything from quantum mechanics to Information Theory has shown that an act of observation is itself part of the phenomenon observed and is analytically inseparable from it.

If you remember your old college English classes, there’s an analogy here that points up the trouble scholars get into when they confuse observation with interpretation. It’s the New Critics. 30 Recall their belief that literary criticism was best conceived as a “scientific” endeavor: the critic was a neutral, careful, unbiased, highly trained observer whose job was to find and objectively describe meanings that were right there, literally inside pieces of literature. Whether you know what happened to New Criticism’s reputation depends on whether you took college English after c. 1975; suffice it to say that its star has dimmed. The New Critics had the same basic problem as Gove’s Methodological Descriptivists: they believed that there was such a thing as unbiased observation. And that linguistic meanings could exist “Objectively,” separate from any interpretive act.

The point of the analogy is that claims to Objectivity in language study are now the stuff of jokes and shudders. The positivist assumptions that underlie Methodological Descriptivism have been thoroughly confuted and displaced — in Lit by the rise of post-structuralism, Reader-Response Criticism, and Jaussian Reception Theory, in linguistics by the rise of Pragmatics — and it’s now pretty much universally accepted that (a) meaning is inseparable from some act of interpretation and (b) an act of interpretation is always somewhat biased, i.e., informed by the interpreter’s particular ideology. And the consequence of (a)+(b) is that there’s no way around it — decisions about what to put in The Dictionary and what to exclude are going to be based on a lexicographer’s ideology. And every lexicographer’s got one. To presume that dictionary-making can somehow avoid or transcend ideology is simply to subscribe to a particular ideology, one that might aptly be called Unbelievably Naive Positivism.

There’s an even more important way Descriptivists are wrong in thinking that the scientific method developed for use in chemistry and physics is equally appropriate to the study of language. This one doesn’t depend on stuff about quantum uncertainty or any kind of postmodern relativism. Even if, as a thought experiment, we assume a kind of 19th-century scientific realism — in which, even though some scientists’ interpretations of natural phenomena might be biased, 31 the natural phenomena themselves can be supposed to exist wholly independent of either observation or interpretation — it’s still true that no such realist supposition can be made about “language behavior,” because such behavior is both human and fundamentally normative.

To understand why this is important, you have only to accept the proposition that language is by its very nature public — i.e., that there is no such thing as a private language 32—and then to observe the way Descriptivists seem either ignorant of this fact or oblivious to its consequences, as in for example one Dr. Charles Fries’s introduction to an epigone of Webster’s Third called The American College Dictionary:


A dictionary can be an “authority” only in the sense in which a book of chemistry or physics or of botany can be an “authority”—by the accuracy and the completeness of its record of the observed facts of the field examined, in accord with the latest principles and techniques of the particular science.


This is so stupid it practically drools. An “authoritative” physics text presents the results of physicists’ observations and physicists’ theories about those observations. If a physics textbook operated on Descriptivist principles, the fact that some Americans believe electricity flows better downhill (based on the observed fact that power lines tend to run high above the homes they serve) would require the Electricity Flows Better Downhill Hypothesis to be included as a “valid” theory in the textbook — just as, for Dr. Fries, if some Americans use infer for imply or aspect for perspective, these usages become ipso facto “valid” parts of the language. The truth is that structural linguists like Gove and Fries are not scientists at all; they’re pollsters who misconstrue the importance of the “facts” they are recording. It isn’t scientific phenomena they’re observing and tabulating, but rather a set of human behaviors, and a lot of human behaviors are — to be blunt — moronic. Try, for instance, to imagine an “authoritative” ethics textbook whose principles were based on what most people actually do.

Grammar and usage conventions are, as it happens, a lot more like ethical principles than like scientific theories. The reason the Descriptivists can’t see this is the same reason they choose to regard the English language as the sum of all English utterances: they confuse mere regularities with norms.

Norms aren’t quite the same as rules, but they’re close. A norm can be defined here simply as something that people have agreed on as the optimal way to do things for certain purposes. Let’s keep in mind that language didn’t come into being because our hairy ancestors were sitting around the veldt with nothing better to do. Language was invented to serve certain very specific purposes—“That mushroom is poisonous”; “Knock these two rocks together and you can start a fire”; “This shelter is mine!” and so on. Clearly, as linguistic communities evolve over time, they discover that some ways of using language are better than others — not better a priori, but better with respect to the community’s purposes. If we assume that one such purpose might be communicating which kinds of food are safe to eat, then we can see how, for example, a misplaced modifier could violate an important norm: “People who eat that kind of mushroom often get sick” confuses the message’s recipient about whether he’ll get sick only if he eats the mushroom frequently or whether he stands a good chance of getting sick the very first time he eats it. In other words, the fungiphagic community has a vested practical interest in excluding this kind of misplaced modifier from acceptable usage; and, given the purposes the community uses language for, the fact that a certain percentage of tribesmen screw up and use misplaced modifiers to talk about food safety does not eo ipso make m.m.’s a good idea.

Maybe now the analogy between usage and ethics is clearer. Just because people sometimes lie, cheat on their taxes, or scream at their kids, this doesn’t mean that they think those things are “good.” 33 The whole point of establishing norms is to help us evaluate our actions (including utterances) according to what we as a community have decided our real interests and purposes are. Granted, this analysis is oversimplified; in practice it’s incredibly hard to arrive at norms and to keep them at least minimally fair or sometimes even to agree on what they are (see e.g. today’s Culture Wars). But the Descriptivists’ assumption that all usage norms are arbitrary and dispensable leads to — well, have a mushroom.

The different connotations of arbitrary here are tricky, though — and this sort of segues into the second main kind of Descriptivist argument. There is a sense in which specific linguistic conventions really are arbitrary. For instance, there’s no particular metaphysical reason why our word for a four-legged mammal that gives milk and goes moo is cow and not, say, prtlmpf. The uptown term for this is “the arbitrariness of the linguistic sign,” 34 and it’s used, along with certain principles of cognitive science and generative grammar, in a more philosophically sophisticated version of Descriptivism that holds the conventions of SWE to be more like the niceties of fashion than like actual norms. This “Philosophical Descriptivism” doesn’t care much about dictionaries or method; its target is the standard SNOOT claim that prescriptive rules have their ultimate justification in the community’s need to make its language meaningful and clear.

Steven Pinker’s 1994 The Language Instinct is a good and fairly literate example of this second kind of Descriptivist argument, which, like the Gove-et-al. version, tends to deploy a jr.-high-filmstrip SCIENCE: POINTING THE WAY TO A BRIGHTER TOMORROW-type tone:


[T]he words “rule” and “grammar” have very different meanings to a scientist and a layperson. The rules people learn (or, more likely, fail to learn) in school are called “prescriptive” rules, prescribing how one ought to talk. Scientists studying language propose “descriptive” rules, describing how people do talk. Prescriptive and descriptive grammar are simply different things. [35]


The point of this version of Descriptivism is to show that the descriptive rules are more fundamental and way more important than the prescriptive rules. The argument goes like this. An English sentence’s being meaningful is not the same as its being grammatical. That is, such clearly ill-formed constructions as “Did you seen the car keys of me?” or “The show was looked by many people” are nevertheless comprehensible; the sentences do, more or less, communicate the information they’re trying to get across. Add to this the fact that nobody who isn’t damaged in some profound Oliver Sacksish way actually ever makes these sorts of very deep syntactic errors 36 and you get the basic proposition of N. Chomsky’s generative linguistics, which is that there exists a Universal Grammar beneath and common to all languages, plus that there is probably an actual part of the human brain that’s imprinted with this Universal Grammar the same way birds’ brains are imprinted with Fly South and dogs’ with Sniff Genitals. There’s all kinds of compelling evidence and support for these ideas, not least of which are the advances that linguists and cognitive scientists and AI researchers have been able to make with them, and the theories have a lot of credibility, and they are adduced by the Philosophical Descriptivists to show that since the really important rules of language are at birth already hardwired into people’s neocortex, SWE prescriptions against dangling participles or mixed metaphors are basically the linguistic equivalent of whalebone corsets and short forks for salad. As Steven Pinker puts it, “When a scientist considers all the high-tech mental machinery needed to order words into everyday sentences, prescriptive rules are, at best, inconsequential decorations.”

This argument is not the barrel of drugged trout that Methodological Descriptivism was, but it’s still vulnerable to objections. The first one is easy. Even if it’s true that we’re all wired with a Universal Grammar, it doesn’t follow that all prescriptive rules are superfluous. Some of these rules really do seem to serve clarity and precision. The injunction against two-way adverbs (“People who eat this often get sick”) is an obvious example, as are rules about other kinds of misplaced modifiers (“There are many reasons why lawyers lie, some better than others”) and about relative pronouns’ proximity to the nouns they modify (“She’s the mother of an infant daughter who works twelve hours a day”).

Granted, the Philosophical Descriptivist can question just how absolutely necessary these rules are: it’s quite likely that a recipient of clauses like the above could figure out what they mean from the sentences on either side or from the overall context or whatever. 37 A listener can usually figure out what I really mean when I misuse infer for imply or say indicate for say, too. But many of these solecisms — or even just clunky redundancies like “The door was rectangular in shape”—require at least a couple extra nanoseconds of cognitive effort, a kind of rapid sift-and-discard process, before the recipient gets it. Extra work. It’s debatable just how much extra work, but it seems indisputable that we put some extra interpretive burden on the recipient when we fail to honor certain conventions. W/r/t confusing clauses like the above, it simply seems more “considerate” to follow the rules of correct English … just as it’s more “considerate” to de-slob your home before entertaining guests or to brush your teeth before picking up a date. Not just more considerate but more respectful somehow — both of your listener/reader and of what you’re trying to get across. As we sometimes also say about elements of fashion and etiquette, the way you use English “makes a statement” or “sends a message”—even though these statements/messages often have nothing to do with the actual information you’re trying to communicate.

We’ve now sort of bled into a more serious rejoinder to Philosophical Descriptivism: from the fact that linguistic communication is not strictly dependent on usage and grammar it does not necessarily follow that the traditional rules of usage and grammar are nothing but “inconsequential decorations.” Another way to state this objection is that something’s being “decorative” does not necessarily make it “inconsequential.” Rhetoric-wise, Pinker’s flip dismissal is very bad tactics, for it invites precisely the question it’s begging: inconsequential to whom?

A key point here is that the resemblance between usage rules and certain conventions of etiquette or fashion is closer than the Philosophical Descriptivists know and far more important than they understand. Take, for example, the Descriptivist claim that so-called correct English usages like brought rather than brung and felt rather than feeled are arbitrary and restrictive and unfair and are supported only by custom and are (like irregular verbs in general) archaic and incommodious and an all-around pain in the ass. Let us concede for the moment that these claims are 100 percent reasonable. Then let’s talk about pants. Trousers, slacks. I suggest to you that having the so-called correct subthoracic clothing for US males be pants instead of skirts is arbitrary (lots of other cultures let men wear skirts), restrictive and unfair (US females get to wear either skirts or pants), based solely on archaic custom (I think it’s got to do with certain traditions about gender and leg-position, the same reasons women were supposed to ride sidesaddle and girls’ bikes don’t have a crossbar), and in certain ways not only incommodious but illogical (skirts are more comfortable than pants; 38 pants ride up; pants are hot; pants can squish the ’nads and reduce fertility; over time pants chafe and erode irregular sections of men’s leg-hair and give older men hideous half-denuded legs; etc. etc.). Let us grant — as a thought experiment if nothing else — that these are all sensible and compelling objections to pants as an androsartorial norm. Let us, in fact, in our minds and hearts say yes—shout yes — to the skirt, the kilt, the toga, the sarong, the jupe. Let us dream of or even in our spare time work toward an America where nobody lays any arbitrary sumptuary prescriptions on anyone else and we can all go around as comfortable and aerated and unchafed and motile as we want.

And yet the fact remains that in the broad cultural mainstream of millennial America, men do not wear skirts. If you, the reader, are a US male, and even if you share my personal objections to pants and dream as I do of a cool and genitally unsquishy American Tomorrow, the odds are still 99.9 percent that in 100 percent of public situations you wear pants/slacks/shorts/trunks. More to the point, if you are a US male and also have a US male child, and if that child might happen to come to you one evening and announce his desire/intention to wear a skirt rather than pants to school the next day, I am 100 percent confident that you are going to discourage him from doing so. Strongly discourage him. You could be a Molotov-tossing anti-pants radical or a kilt manufacturer or Dr. Steven Pinker himself — you’re going to stand over your kid and be prescriptive about an arbitrary, archaic, uncomfortable, and inconsequentially decorative piece of clothing. Why? Well, because in modern America any little boy who comes to school in a skirt (even, say, a modest all-season midi) is going to get stared at and shunned and beaten up and called a total geekoid by a whole lot of people whose approval and acceptance are important to him. 39 In our present culture, in other words, a boy who wears a skirt is “making a statement” that is going to have all kinds of gruesome social and emotional consequences for him.

You can probably see where this is headed. I’m going to describe the intended point of the pants analogy in terms that I’m sure are simplistic — doubtless there are whole books in Pragmatics or psycholinguistics or something devoted to unpacking this point. The weird thing is that I’ve seen neither Descriptivists nor SNOOTs deploy it in the Wars. 40,41

When I say or write something, there are actually a whole lot of different things I am communicating. The propositional content (i.e., the verbal information I’m trying to convey) is only one part of it. Another part is stuff about me, the communicator. Everyone knows this. It’s a function of the fact that there are so many different well-formed ways to say the same basic thing, from e.g. “I was attacked by a bear!” to “Goddamn bear tried to kill me!” to “That ursine juggernaut did essay to sup upon my person!” and so on. Add the Saussurian/Chomskian consideration that many grammatically ill-formed sentences can also get the propositional content across—“Bear attack Tonto, Tonto heap scared!”—and the number of subliminal options we’re scanning/sorting/interpreting as we communicate with one another goes transfinite very quickly. And different levels of diction and formality are only the simplest kinds of distinction; things get way more complicated in the sorts of interpersonal communication where social relations and feelings and moods come into play. Here’s a familiar kind of example. Suppose that you and I are acquaintances and we’re in my apartment having a conversation and that at some point I want to terminate the conversation and not have you be in my apartment anymore. Very delicate social moment. Think of all the different ways I can try to handle it: “Wow, look at the time”; “Could we finish this up later?”; “Could you please leave now?”; “Go”; “Get out”; “Get the hell out of here”; “Didn’t you say you had to be someplace?”; “Time for you to hit the dusty trail, my friend”; “Off you go then, love”; or that sly old telephone-conversation-ender: “Well, I’m going to let you go now”; etc. etc. And then think of all the different factors and implications of each option. 42

The point here is obvious. It concerns a phenomenon that SNOOTs blindly reinforce and that Descriptivists badly underestimate and that scary vocab-tape ads try to exploit. People really do judge one another according to their use of language. Constantly. Of course, people are constantly judging one another on the basis of all kinds of things — height, weight, scent, physiognomy, accent, occupation, make of vehicle 43—and, again, doubtless it’s all terribly complicated and occupies whole battalions of sociolinguists. But it’s clear that at least one component of all this interpersonal semantic judging involves acceptance, meaning not some touchy-feely emotional affirmation but actual acceptance or rejection of someone’s bid to be regarded as a peer, a member of somebody else’s collective or community or Group. Another way to come at this is to acknowledge something that in the Usage Wars gets mentioned only in very abstract terms: “correct” English usage is, as a practical matter, a function of whom you’re talking to and of how you want that person to respond — not just to your utterance but also to you. In other words, a large part of the project of any communication is rhetorical and depends on what some rhet-scholars call “Audience” or “Discourse Community.” 44 It is the present existence in the United States of an enormous number of different Discourse Communities, plus the fact that both people’s use of English and their interpretations of others’ use are influenced by rhetorical assumptions, that are central to understanding why the Usage Wars are so politically charged and to appreciating why Bryan Garner’s ADMAU is so totally sneaky and brilliant and modern.

Fact: There are all sorts of cultural/geographical dialects of American English — Black English, Latino English, Rural Southern, Urban Southern, Standard Upper-Midwest, Maine Yankee, East-Texas Bayou, Boston Blue-Collar, on and on. Everybody knows this. What not everyone knows — especially not certain Prescriptivists — is that many of these non-SWE-type dialects have their own highly developed and internally consistent grammars, and that some of these dialects’ usage norms actually make more linguistic/aesthetic sense than do their Standard counterparts. Plus, of course, there are also innumerable sub- and subsubdialects 45 based on all sorts of things that have nothing to do with locale or ethnicity — Medical-School English, Twelve-Year-Old-Males-Whose-Worldview-Is-Deeply-Informed-by-South-Park English — that are nearly incomprehensible to anyone who isn’t inside their very tight and specific Discourse Community (which of course is part of their function 46).

INTERPOLATION

POTENTIALLY DESCRIPTIVIST-LOOKING EXAMPLE OF SOME GRAMMATICAL ADVANTAGES OF A NON-STANDARD DIALECT THAT THIS REVIEWER ACTUALLY KNOWS ABOUT FIRSTHAND


I happen to have two native English dialects — the SWE of my hypereducated parents and the hard-earned Rural Midwestern of most of my peers. When I’m talking to RMs, I tend to use constructions like “Where’s it at?” for “Where is it?” and sometimes “He don’t” instead of “He doesn’t.” Part of this is a naked desire to fit in and not get rejected as an egghead or fag (see sub). But another part is that I, SNOOT or no, believe that these RMisms are in certain ways superior to their Standard equivalents. For a dogmatic Prescriptivist, “Where’s it at?” is double-damned as a sentence that not only ends with a preposition but whose final preposition forms a redundancy with where that’s similar to the redundancy in “the reason is because” (which latter usage I’ll admit makes me dig my nails into my palms). Rejoinder: First off, the avoid-terminal-prepositions rule is the invention of one Fr. R. Lowth, an 18th-century British preacher and indurate pedant who did things like spend scores of pages arguing for hath over the trendy and degenerate has. The a.-t.-p. rule is antiquated and stupid and only the most ayatolloid SNOOT takes it seriously. Garner himself calls the rule “stuffy” and lists all kinds of useful constructions like “a person I have great respect for” and “the man I was listening to” that we’d have to discard or distort if we really enforced it. Plus, the apparent redundancy of “Where’s it at?” 47 is offset by its metrical logic: what the at really does is license the contraction of is after the interrogative adverb. You can’t say “Where’s it?” So the choice is between “Where is it?” and “Where’s it at?”, and the latter, a strong anapest, is prettier and trips off the tongue better than “Where is it?”, whose meter is either a clunky monosyllabic-foot + trochee or it’s nothing at all. Using “He don’t” makes me a little more uncomfortable; I admit that its logic isn’t quite as compelling.
Nevertheless, a clear trend in the evolution of English from Middle to Modern has been the gradual regularizing of irregular present-tense verbs, 48 a trend justified by the fact that irregulars are hard to learn and to keep straight and have nothing but history going for them. By this reasoning, Standard Black English is way out on the cutting edge of English with its abandonment of the 3-S present in to do and to go and to say and its marvelously streamlined six identical present-tense inflections of to be. (Granted, the conjugation “he be” always sounds odd to me, but then SBE is not one of my dialects.) This is probably the place for your SNOOT reviewer openly to concede that a certain number of traditional prescriptive rules really are stupid and that people who insist on them (like the legendary assistant to Margaret Thatcher who refused to read any memo with a split infinitive in it, or the jr.-high teacher I had who automatically graded you down if you started a sentence with Hopefully) are that very most contemptible and dangerous kind of SNOOT, the SNOOT Who Is Wrong. The injunction against split infinitives, for instance, is a consequence of the weird fact that English grammar is modeled on Latin even though Latin is a synthetic language and English is an analytic language. 49 Latin infinitives consist of one word and are impossible to as it were split, and the earliest English Prescriptivists — so enthralled with Latin that their English usage guides were actually written in Latin 50—decided that English infinitives shouldn’t be split either. Garner himself takes out after the s.i. rule in his miniessays on both SPLIT INFINITIVES and SUPERSTITIONS. 51
And Hopefully at the beginning of a sentence, as a certain cheeky eighth-grader once (to his everlasting social cost) pointed out in class, actually functions not as a misplaced modal auxiliary or as a manner adverb like quickly or angrily but as a sentence adverb (i.e., as a special kind of “veiled reflexive” that indicates the speaker’s attitude about the state of affairs described by the rest of the sentence — examples of perfectly OK sentence adverbs are clearly, basically, luckily), and only SNOOTs educated in the high-pedantic years 1940–1960 blindly proscribe it or grade it down. The cases of split infinitives and Hopefully are in fact often trotted out by dogmatic Descriptivists as evidence that all SWE usage rules are arbitrary and dumb (which is a bit like pointing to Pat Buchanan as evidence that all Republicans are maniacs). FYI, Garner rejects Hopefully’s knee-jerk proscription, too, albeit grudgingly, saying “the battle is lost” and including the adverb in his miniessay on SKUNKED TERMS, which is his phrase for a usage that is “hotly disputed … any use of it is likely to distract some readers.” (Garner also points out something I’d never quite realized, which is that hopefully, if misplaced/mispunctuated in the body of a sentence, can create some of the same two-way ambiguities as other adverbs, as in e.g. “I will borrow your book and hopefully read it soon.”)


Whether we’re conscious of it or not, most of us are fluent in more than one major English dialect and in several subdialects and are probably at least passable in countless others. Which dialect you choose to use depends, of course, on whom you’re addressing. More to the point, I submit that the dialect you use depends mostly on what sort of Group your listener is part of and on whether you wish to present yourself as a fellow member of that Group. An obvious example is that traditional upper-class English has certain dialectal differences from lower-class English and that schools used to have courses in elocution whose whole raison was to teach people how to speak in an upper-class way. But usage-as-inclusion is about much more than class. Try another sort of thought experiment: A bunch of US teenagers in clothes that look several sizes too large for them are sitting together in the local mall’s food court, and imagine that a 53-year-old man with jowls, a comb-over, and clothes that fit perfectly comes over to them and says he was scoping them and thinks they’re totally rad and/or phat and asks is it cool if he just kicks it and chills with them here at their table. The kids’ reaction is going to be either scorn or embarrassment for the guy — most likely a mix of both. Q: Why? Or imagine that two hard-core young urban black guys are standing there talking and I, who am resoundingly and in all ways white, come up and greet them with “Yo” and address one or both as “Brother” and ask “s’up, s’goin’ on,” pronouncing on with that NYCish oo-o diphthong that Young Urban Black English deploys for a standard o. Either these guys are going to think that I am mocking them and be offended or they are going to think I am simply out of my mind. No other reaction is remotely foreseeable. Q: Why?

Why: A dialect of English is learned and used either because it’s your native vernacular or because it’s the dialect of a Group by which you wish (with some degree of plausibility) to be accepted. And although it is a major and vitally important one, SWE is only one dialect. And it is never, or at least hardly ever, 52 anybody’s only dialect. This is because there are — as you and I both know and yet no one in the Usage Wars ever seems to mention — situations in which faultlessly correct SWE is not the appropriate dialect.

Childhood is full of such situations. This is one reason why SNOOTlets tend to have such a hard social time of it in school. A SNOOTlet is a little kid who’s wildly, precociously fluent in SWE (he is often, recall, the offspring of SNOOTs). Just about every class has a SNOOTlet, so I know you’ve seen them — these are the sorts of six-to-twelve-year-olds who use whom correctly and whose response to striking out in T-ball is to shout “How incalculably dreadful!” The elementary-school SNOOTlet is one of the earliest identifiable species of academic geekoid and is duly despised by his peers and praised by his teachers. These teachers usually don’t see the incredible amounts of punishment the SNOOTlet is receiving from his classmates, or if they do see it they blame the classmates and shake their heads sadly at the vicious and arbitrary cruelty of which children are capable.

Teachers who do this are dumb. The truth is that his peers’ punishment of the SNOOTlet is not arbitrary at all. There are important things at stake. Little kids in school are learning about Group-inclusion and -exclusion and about the respective rewards and penalties of same and about the use of dialect and syntax and slang as signals of affinity and inclusion. They’re learning about Discourse Communities. Little kids learn this stuff not in Language Arts or Social Studies but on the playground and the bus and at lunch. When his peers are ostracizing the SNOOTlet or giving him monstrous quadruple Wedgies or holding him down and taking turns spitting on him, there’s serious learning going on. Everybody here is learning except the little SNOOT 53—in fact, what the SNOOTlet is being punished for is precisely his failure to learn. And his Language Arts teacher — whose own Elementary Education training prizes “linguistic facility” as one of the “social skills” that ensure children’s “developmentally appropriate peer rapport,” 54 but who does not or cannot consider the possibility that linguistic facility might involve more than lapidary SWE — is unable to see that her beloved SNOOTlet is actually deficient in Language Arts. He has only one dialect. He cannot alter his vocabulary, usage, or grammar, cannot use slang or vulgarity; and it’s these abilities that are really required for “peer rapport,” which is just a fancy academic term for being accepted by the second-most-important Group in the little kid’s life. 55 If he is sufficiently in thrall to his teachers and those teachers are sufficiently clueless, it may take years and unbelievable amounts of punishment before the SNOOTlet learns that you need more than one dialect to get along in school.

This reviewer acknowledges that there seems to be some, umm, personal stuff getting dredged up and worked out here; 56 but the stuff is germane. The point is that the little A+ SNOOTlet is actually in the same dialectal position as the class’s “slow” kid who can’t learn to stop using ain’t or bringed. Exactly the same position. One is punished in class, the other on the playground, but both are deficient in the same linguistic skill — viz., the ability to move between various dialects and levels of “correctness,” the ability to communicate one way with peers and another way with teachers and another with family and another with T-ball coaches and so on. Most of these dialectal adjustments are made below the level of conscious awareness, and our ability to make them seems part psychological and part something else — perhaps something hardwired into the same motherboard as Universal Grammar — and in truth this ability is a much better indicator of a kid’s raw “verbal IQ” than test scores or grades, since US English classes do far more to retard dialectal talent than to cultivate it.
