Consider the parts of speech. In Latin, the verb has up to 120 inflections. In English it never has more than five (e.g., see, sees, saw, seeing, seen) and often it gets by with just three (hit, hits, hitting). Instead of using loads of different verb forms, we use just a few forms but employ them in loads of ways. We need just five inflections to deal with the act of propelling a car—drive, drives, drove, driving, and driven—yet with these we can express quite complex and subtle variations of tense: “I drive to work every day,” “I have been driving since I was sixteen,” “I will have driven 20,000 miles by the end of this year.” This system, for all its ease of use, makes labeling difficult. According to any textbook, the present tense of the verb drive is drive. Every junior high school pupil knows that. Yet if we say, “I used to drive to work but now I don’t,” we are clearly using the present tense drive in a past tense sense. Equally if we say, “I will drive you to work tomorrow,” we are using it in a future sense. And if we say, “I would drive if I could afford to,” we are using it in a conditional sense. In fact, almost the only form of sentence in which we cannot use the present tense form of drive is, yes, the present tense. When we need to indicate an action going on right now, we must use the participial form driving. We don’t say, “I drive the car now,” but rather “I’m driving the car now.” Not to put too fine a point on it, the labels are largely meaningless.
We seldom stop to think about it, but some of the most basic concepts in English are naggingly difficult to define. What, for instance, is a sentence? Most dictionaries define it broadly as a group of words constituting a full thought and containing, at a minimum, a subject (basically a noun) and predicate (basically a verb). Yet if I inform you that I have just crashed your car and you reply, “What!” or “Where?” or “How?” you have clearly expressed a complete thought, uttered a sentence. But where are the subject and predicate? Where are the noun and verb, not to mention the prepositions, conjunctions, articles, and other components that we normally expect to find in a sentence? To get around this problem, grammarians pretend that such sentences contain words that aren’t there. “What!” they would say, really means “What are you telling me—you crashed my car?” while “Where?” is a shorthand rendering of “Where did you crash it?” and “How?” translates as “How on earth did you manage to do that, you old devil you?” or words to that effect. The process is called ellipsis and is certainly very nifty. Would that I could do the same with my bank account. Yet the inescapable fact is that it is possible to make such sentences conform to grammatical precepts only by bending the rules. When I was growing up we called that cheating.
In English, in short, we possess a language in which the parts of speech are almost entirely notional. A noun is a noun and a verb is a verb largely because the grammarians say they are. In the sentence “I am suffering terribly” suffering is a verb, but in “My suffering is terrible,” it is a noun. Yet both sentences use precisely the same word to express precisely the same idea. Quickly and sleepily are adverbs but sickly and deadly are adjectives. Breaking is a present participle, but as often as not it is used in a past tense sense (“He was breaking the window when I saw him”). Broken, on the other hand, is a past participle but as often as not it is employed in a present tense sense (“I think I’ve just broken my toe”) or even future tense sense (“If he wins the next race, he’ll have broken the school record”). To deal with all the anomalies, the parts of speech must be so broadly defined as to be almost meaningless. A noun, for example, is generally said to be a word that denotes a person, place, thing, action, or quality. That would seem to cover almost everything, yet clearly most actions are verbs and many words that denote qualities—brave, foolish, good—are adjectives.
The complexities of English are such that the authorities themselves often stumble. Each of the following, penned by an expert, contains a usage that at least some of his colleagues would consider quite wrong.
“Prestige is one of the few words that has had an experience opposite to that described in ‘Worsened Words.’ ” (H. W. Fowler, A Dictionary of Modern English Usage, second edition) It should be “one of the few words that have had.”
“Each of the variants indicated in boldface type count as an entry.” (The Harper Dictionary of Contemporary Usage) It should be “each . . . counts.”
“It is of interest to speculate about the amount of dislocation to the spelling system that would occur if English dictionaries were either proscribed or (as when Malory or Sir Philip Sidney were writing) did not exist.” (Robert Burchfield, The English Language) Make it “was writing.”
“A range of sentences forming statements, commands, questions and exclamations cause us to draw on a more sophisticated battery of orderings and arrangements.” (Robert Burchfield, The English Language) It should be “causes.”
“The prevalence of incorrect instances of the use of the apostrophe . . . together with the abandonment of it by many business firms . . . suggest that the time is close at hand when this moderately useful device should be abandoned.” (Robert Burchfield, The English Language) The verb should be suggests.
“If a lot of the available dialect data is obsolete or almost so, a lot more of it is far too sparse to support any sort of reliable conclusion.” (Robert Claiborne, Our Marvelous Native Tongue) Data is a plural.
“His system of citing examples of the best authorities, of indicating etymology, and pronunciation, are still followed by lexicographers.” (Philip Howard, The State of the Language) His system are?
“When his fellowship expired he was offered a rectorship at Boxworth . . . on condition that he married the deceased rector’s daughter.” (Robert McCrum, et al., The Story of English) A failure to use the subjunctive: It should be “on condition that he marry.”
English grammar is so complex and confusing for the one very simple reason that its rules and terminology are based on Latin—a language with which it has precious little in common. In Latin, to take one example, it is not possible to split an infinitive. So in English, the early authorities decided, it should not be possible to split an infinitive either. But there is no reason why we shouldn’t, any more than we should forsake instant coffee and air travel because they weren’t available to the Romans. Making English grammar conform to Latin rules is like asking people to play baseball using the rules of football. It is a patent absurdity. But once this insane notion became established grammarians found themselves having to draw up ever more complicated and circular arguments to accommodate the inconsistencies. As Burchfield notes in The English Language, one authority, F. Th. Visser, found it necessary to devote 200 pages to discussing just one aspect of the present participle. That is as crazy as it is amazing.
The early authorities not only used Latin grammar as their model, but actually went to the almost farcical length of writing English grammars in that language, as with Sir Thomas Smith’s De Recta et Emendata Linguae Anglicae Scriptione Dialogus (1568), Alexander Gil’s Logonomia Anglica (1619), and John Wallis’s Grammatica Linguae Anglicanae of 1653 (though even he accepted that the grammar of Latin was ill-suited to English). For the longest time it was taken entirely for granted that the classical languages must serve as models. Dryden spoke for an age when he boasted that he often translated his sentences into Latin to help him decide how best to express them in English.
In 1660, Dryden complained that English had “not so much as a tolerable dictionary or a grammar; so our language is in a manner barbarous.” He believed there should be an academy to regulate English usage, and for the next two centuries many others would echo his view. In 1664, the Royal Society for the Advancement of Experimental Philosophy formed a committee “to improve the English tongue,” though nothing lasting seems to have come of it. Thirty-three years later in his Essay Upon Projects, Daniel Defoe was calling for an academy to oversee the language. In 1712, Jonathan Swift joined the chorus with a Proposal for Correcting, Improving and Ascertaining the English Tongue. Some indication of the strength of feeling attached to these matters is given by the fact that in 1780, in the midst of the American Revolution, John Adams wrote to the president of Congress appealing to him to set up an academy for the purpose of “refining, correcting, improving and ascertaining the English language” (a title that closely echoes, not to say plagiarizes, Swift’s pamphlet of sixty-eight years before). In 1806, the American Congress considered a bill to institute a national academy and in 1820 an American Academy of Language and Belles Lettres, presided over by John Quincy Adams, was formed, though again without any resounding perpetual benefits to users of the language. And there were many other such proposals and assemblies.
The model for all these was the Académie Française, founded by Cardinal Richelieu in 1635. In its youth, the academy was an ambitious motivator of change. In 1762, after many years of work, it published a dictionary that regularized the spellings of some 5,000 words—almost a quarter of the words then in common use. It took the s out of words like estre and fenestre, making them être and fenêtre, and it turned roy and loy into roi and loi. In recent decades, however, the academy has been associated with an almost ayatollah-like conservatism. When in December 1988 over 90 percent of French schoolteachers voted in favor of a proposal to introduce the sort of spelling reforms the academy itself had introduced 200 years earlier, the forty venerable members of the academy were, to quote the London Sunday Times, “up in apoplectic arms” at the thought of tampering with something as sacred as French spelling. Such is the way of the world. Among the changes the teachers wanted and the academicians did not were the removal of the circumflex on être, fenêtre, and other such words, and taking the -x off plurals such as bureaux, chevaux, and châteaux and replacing it with an -s.
Such actions underline the one almost inevitable shortcoming of national academies. However progressive and far-seeing they may be to begin with, they almost always exert over time a depressive effect on change. So it is probably fortunate that the English-speaking world never saddled itself with such a body, largely because as many influential users of English were opposed to academies as favored them. Samuel Johnson doubted the prospects of arresting change and Thomas Jefferson thought it in any case undesirable. In declining an offer to be the first honorary president of the Academy of Language and Belles Lettres, he noted that had such a body been formed in the days of the Anglo-Saxons English would now be unable to describe the modern world. Joseph Priestley, the English scientist, grammarian, and theologian, spoke perhaps most eloquently against the formation of an academy when he said in 1761 that it was “unsuitable to the genius of a free nation. . . . We need make no doubt but that the best forms of speech will, in time, establish themselves by their own superior excellence: and in all controversies, it is better to wait the decisions of time, which are slow and sure, than to take those of synods, which are often hasty and injudicious” [quoted by Baugh and Cable, page 269].
English is often commended by outsiders for its lack of a stultifying authority. Otto Jespersen as long ago as 1905 was praising English for its lack of rigidity, its happy air of casualness. Likening French to the severe and formal gardens of Louis XIV, he contrasted it with English, which he said was “laid out seemingly without any definite plan, and in which you are allowed to walk everywhere according to your own fancy without having to fear a stern keeper enforcing rigorous regulations” [Growth and Structure of the English Language, page 16].
Without an official academy to guide us, the English-speaking world has long relied on self-appointed authorities such as the brothers H. W. and F. G. Fowler and Sir Ernest Gowers in Britain and Theodore Bernstein and William Safire in America, and of course countless others. These figures write books, give lectures, and otherwise do what they can (i.e., next to nothing) to try to stanch (not staunch) the perceived decline of the language. They point out that there is a useful distinction to be observed between uninterested and disinterested, between imply and infer, flaunt and flout, fortunate and fortuitous, forgo and forego, and discomfort and discomfit (not forgetting stanch and staunch). They point out that fulsome, properly used, is a term of abuse, not praise, that peruse actually means to read thoroughly, not glance through, that data and media are plurals. And from the highest offices in the land they are ignored.
In the late 1970s, President Jimmy Carter betrayed a flaw in his linguistic armory when he said: “The government of Iran must realize that it cannot flaunt, with impunity, the expressed will and law of the world community.” Flaunt means to show off; he meant flout. The day after he was elected president in 1988, George Bush told a television reporter he couldn’t believe the enormity of what had happened. Had President-elect Bush known that the primary meaning of enormity is wickedness or evilness, he would doubtless have selected a more apt term.
When this process of change can be seen happening in our lifetimes, it is almost always greeted with cries of despair and alarm. Yet such change is both continuous and inevitable. Few acts are more salutary than looking at the writings of language authorities from recent decades and seeing the usages that raised their hackles. In 1931, H. W. Fowler was tutting over racial, which he called “an ugly word, the strangeness of which is due to our instinctive feeling that the termination -al has no business at the end of a word that is not obviously Latin.” (For similar reasons he disliked television and speedometer.) Other authorities have variously—and sometimes hotly—attacked enthuse, commentate, emote, prestigious, contact as a verb, chair as a verb, and scores of others. But of course these are nothing more than opinions, and, as is the way with other people’s opinions, they are generally ignored.
So if there are no officially appointed guardians for the English language, who sets down all those rules that we all know about from childhood—the idea that we must never end a sentence with a preposition or begin one with a conjunction, that we must use each other for two things and one another for more than two, and that we must never use hopefully in an absolute sense, such as “Hopefully it will not rain tomorrow”? The answer, surprisingly often, is that no one does, that when you look into the background of these “rules” there is often little basis for them.
Consider the curiously persistent notion that sentences should not end with a preposition. The source of this stricture, and several other equally dubious ones, was one Robert Lowth, an eighteenth-century clergyman and amateur grammarian whose A Short Introduction to English Grammar, published in 1762, enjoyed a long and distressingly influential life both in his native England and abroad. It is to Lowth we can trace many a pedant’s most treasured notions: the belief that you must say different from rather than different to or different than, the idea that two negatives make a positive, the rule that you must not say “the heaviest of the two objects,” but rather “the heavier,” the distinction between shall and will, and the clearly nonsensical belief that between can apply only to two things and among to more than two. (By this reasoning, it would not be possible to say that St. Louis is between New York, Los Angeles, and Chicago, but rather that it is among them, which would impart a quite different sense.) Perhaps the most remarkable and curiously enduring of Lowth’s many beliefs was the conviction that sentences ought not to end with a preposition. But even he was not didactic about it. He recognized that ending a sentence with a preposition was idiomatic and common in both speech and informal writing. He suggested only that he thought it generally better and more graceful, not crucial, to place the preposition before its relative “in solemn and elevated” writing. Within a hundred years this had been converted from a piece of questionable advice into an immutable rule. In a remarkable outburst of literal-mindedness, nineteenth-century academics took it as read that the very name pre-position meant it must come before something—anything.
But then this was a period of the most resplendent silliness, when grammarians and scholars seemed to be climbing over one another (or each other; it doesn’t really matter) in a mad scramble to come up with fresh absurdities. This was the age when, it was gravely insisted, Shakespeare’s laughable ought to be changed to laugh-at-able and reliable should be made into relionable. Dozens of seemingly unexceptionable words—lengthy, standpoint, international, colonial, brash—were attacked with venom because of some supposed etymological deficiency or other. Thomas de Quincey, in between bouts of opium taking, found time to attack the expression what on earth. Some people wrote mooned for lunatic and foresayer for prophet on the grounds that the new words were Anglo-Saxon and thus somehow more pure. They roundly castigated those ignoramuses who impurely combined Greek and Latin roots into new words like petroleum (Greek petra + Latin oleum). In doing so, they failed to note that the very word with which they described themselves, grammarians, is itself a hybrid made of Greek and Latin roots, as are many other words that have lived unexceptionably in English for centuries. They even attacked handbook as an ugly Germanic compound when it dared to show its face in the nineteenth century, failing to notice that it was a good Old English word that had simply fallen out of use. It is one of the felicities of English that we can take pieces of words from all over and fuse them into new constructions—like trusteeship, which consists of a Nordic stem (trust), combined with a French affix (ee), married to an Old English suffix (ship). Other languages cannot do this. We should be proud of ourselves for our ingenuity and yet even now authorities commonly attack almost any new construction as ugly or barbaric.
Today in England you can still find authorities attacking the construction different than as a regrettable Americanism, insisting that a sentence such as “How different things appear in Washington than in London” is ungrammatical and should be changed to “How different things appear in Washington from how they appear in London.” Yet different than has been common in England for centuries and used by such exalted writers as Defoe, Addison, Steele, Dickens, Coleridge, and Thackeray, among others. Other authorities, in both Britain and America, continue to deride the absolute use of hopefully. The New York Times Manual of Style and Usage flatly forbids it. Its writers must not say, “Hopefully the sun will come out soon,” but rather are instructed to resort to a clumsily passive and periphrastic construction such as “It is to be hoped that the sun will come out soon.” The reason? The authorities maintain that hopefully in the first sentence is an unattached modifier—that it doesn’t belong to any other part of the sentence. Yet they raise no objection to dozens of other words being used in precisely the same unattached way—admittedly, mercifully, happily, curiously, and so on. No doubt the reason hopefully is not allowed is that somebody at The New York Times once had a boss who wouldn’t allow it because his professor had forbidden it, because his father thought it was ugly and inelegant, because he had been told so by his uncle who was a man of great learning . . . and so on.
Considerations of what makes for good English or bad English are to an uncomfortably large extent matters of prejudice and conditioning. Until the eighteenth century it was correct to say “you was” if you were referring to one person. It sounds odd today, but the logic is impeccable. Was is a singular verb and were a plural one. Why should you take a plural verb when the sense is clearly singular? The answer—surprise, surprise—is that Robert Lowth didn’t like it. “I’m hurrying, are I not?” is hopelessly ungrammatical, but “I’m hurrying, aren’t I?”—merely a contraction of the same words—is perfect English. Many is almost always a plural (as in “Many people were there”), but not when it is followed by a, as in “Many a man was there.” There’s no inherent reason why these things should be so. They are not defensible in terms of grammar. They are because they are.
Nothing illustrates the scope for prejudice in English better than the issue of the split infinitive. Some people feel ridiculously strongly about it. When the British Conservative politician Jock Bruce-Gardyne was economic secretary to the Treasury in the early 1980s, he returned unread any departmental correspondence containing a split infinitive. (It should perhaps be pointed out that a split infinitive is one in which an adverb comes between to and a verb, as in to quickly look.) I can think of two very good reasons for not splitting an infinitive.
1. Because you feel that the rules of English ought to conform to the grammatical precepts of a language that died a thousand years ago.
2. Because you wish to cling to a pointless affectation of usage that is without the support of any recognized authority of the last 200 years, even at the cost of composing sentences that are ambiguous, inelegant, and patently contorted.
It is exceedingly difficult to find any authority who condemns the split infinitive—Theodore Bernstein, H. W. Fowler, Ernest Gowers, Eric Partridge, Rudolph Flesch, Wilson Follett, Roy H. Copperud, and others too tedious to enumerate here all agree that there is no logical reason not to split an infinitive. Otto Jespersen even suggests that, strictly speaking, it isn’t actually possible to split an infinitive. As he puts it: “ ‘To’ . . . is no more an essential part of an infinitive than the definite article is an essential part of a nominative, and no one would think of calling ‘the good man’ a split nominative” [Growth and Structure of the English Language, page 222].
Lacking an academy as we do, we might expect dictionaries to take up the banner of defenders of the language, but in recent years they have increasingly shied away from the role. A perennial argument with dictionary makers is whether they should be prescriptive (that is, whether they should prescribe how language should be used) or descriptive (that is, merely describe how it is used without taking a position). The most notorious example of the descriptive school was the 1961 Webster’s Third New International Dictionary (popularly called Webster’s Unabridged), whose editor, Philip Gove, believed that distinctions of usage were elitist and artificial. As a result, usages such as imply as a synonym for infer and flout in the sense of flaunt were included without comment. The dictionary provoked further antagonism, particularly among members of the U.S. Trademark Association, by refusing to capitalize trademarked words. But what really excited outrage was its remarkable contention that ain’t was “used orally in most parts of the U.S. by many cultivated speakers.”
So disgusted was The New York Times with the new dictionary that it announced it would not use it but would continue with the 1934 edition, prompting the language authority Bergen Evans to write: “Anyone who solemnly announces in the year 1962 that he will be guided in matters of English usage by a dictionary published in 1934 is talking ignorant and pretentious nonsense,” and he pointed out that the issue of the Times announcing the decision contained nineteen words condemned by the Second International.
Since then, other dictionaries have been divided on the matter. The American Heritage Dictionary, first published in 1969, instituted a usage panel of distinguished commentators to rule on contentious points of usage, which are discussed, often at some length, in the text. But others have been more equivocal (or prudent or spineless depending on how you view it). The revised Random House Dictionary of the English Language, published in 1987, accepts the looser meaning for most words, though often noting that the newer usage is frowned on “by many”—a curiously timid approach that at once acknowledges the existence of expert opinion and yet constantly places it at a distance. Among the looser meanings it accepts are disinterested to mean uninterested and infer to mean imply. It even accepts the existence of kudo as a singular—prompting a reviewer from Time magazine to ask if one instance of pathos should now be a patho.
It’s a fine issue. One of the undoubted virtues of English is that it is a fluid and democratic language in which meanings shift and change in response to the pressures of common usage rather than the dictates of committees. It is a natural process that has been going on for centuries. To interfere with that process is arguably both arrogant and futile, since clearly the weight of usage will push new meanings into currency no matter how many authorities hurl themselves into the path of change.
But at the same time, it seems to me, there is a case for resisting change—at least slapdash change. Even the most liberal descriptivist would accept that there must be some conventions of usage. We must agree to spell cat c-a-t and not e-l-e-p-h-a-n-t, and we must agree that by that word we mean a small furry quadruped that goes meow and sits comfortably on one’s lap and not a large lumbering beast that grows tusks and is exceedingly difficult to housebreak. In precisely the same way, clarity is generally better served if we agree to observe a distinction between imply and infer, forego and forgo, fortuitous and fortunate, uninterested and disinterested, and many others. As John Ciardi observed, resistance may in the end prove futile, but at least it tests the changes and makes them prove their worth.
Perhaps for our last words on the subject of usage we should turn to the last words of the venerable French grammarian Dominique Bouhours, who proved on his deathbed that a grammarian’s work is never done when he turned to those gathered loyally around him and whispered: “I am about to—or I am going to—die; either expression is used.”