2


THE ROOTS OF AMERICAN MILITARISM

Overgrown military establishments are under any form of government inauspicious to liberty, and are to be regarded as particularly hostile to Republican liberty.

PRESIDENT GEORGE WASHINGTON,


Farewell Address, September 17, 1796

This conjunction of an immense military establishment and a large arms industry is new in the American experience.... In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist. We must never let the weight of this combination endanger our liberties or democratic processes. We should take nothing for granted.

PRESIDENT DWIGHT D. EISENHOWER,


Farewell Address, January 17, 1961


In the United States, the first militarist tendencies appeared at the end of the nineteenth century. Before and during the Spanish-American War of 1898, the press was manipulated to whip up a popular war fever, while atrocities and war crimes committed by American forces in the Philippines were hidden from public view. As a consequence of the war the United States acquired its first colonial possessions and created its first military general staff. American “jingoism” of that period—popular sentiment of boastful, aggressive chauvinism—took its cue from similar tendencies in imperial England. Even the term jingoism derived from the refrain of a patriotic British music-hall song of 1878, taken up by those who supported sending a British fleet into Turkish waters to counter the advances of Russia.

On the night of February 15, 1898, in Havana harbor, part of the Spanish colony of Cuba, a mysterious explosion destroyed and sank the American battleship USS Maine. The blast killed 262 of its 374 crew members. The Maine had arrived in Havana three weeks earlier as part of a “friendly” mission to rescue Americans caught up in an ongoing Cuban insurrection against Spanish rule. Its unspoken missions, however, were to practice “gunboat diplomacy” against Spain on behalf of the Cuban rebels and to enforce the Monroe Doctrine by warning other European powers like Germany not to take advantage of the unstable situation.

Two official navy investigations concluded that an external blast, probably caused by a mine, had ignited one of the battleship’s powder magazines, though Spain maintained that it had nothing to do with the sinking of the Maine. Later analysts, including Admiral Hyman Rickover, have suggested that spontaneous combustion in a coal bunker may have been the cause of what was likely an accidental explosion.1 Though the navy raised and subsequently scuttled the Maine in 1911, what happened to it in 1898 remains a puzzle to this day.

But there was no puzzle about the reaction to the news back in the United States. Assistant Secretary of the Navy Theodore Roosevelt instantly declared the sinking to be “an act of dirty [Spanish] treachery.” The French ambassador to Washington advised his government that a “sort of bellicose fury has seized the American nation.”2 William Randolph Hearst’s New York Journal published drawings illustrating how Spanish saboteurs had attached a mine to the Maine and detonated it from the shore. Hearst then sent the artist Frederic Remington to Cuba to report on the Cuban revolt against Spanish oppression. “There is no war,” Remington wrote to his boss. “Request to be recalled.” In a famous reply, Hearst cabled, “Please remain. You furnish the pictures, I’ll furnish the war.”3 And so they both did. Thanks to Hearst’s journalism and that of Joseph Pulitzer in his New York World, the country erupted in righteous anger and patriotic fervor. On April 25, 1898, Congress declared war on Spain.

On May 1, Admiral George Dewey’s Asiatic Squadron, forced to leave the British colony of Hong Kong because of the declaration of war, attacked the Spanish fleet at Manila Bay and won an easy victory. With Filipino nationalist help, the Americans occupied Manila and began to think about what to do with the rest of the Philippine Islands. President William McKinley declared that the Philippines “came to us as a gift from the gods,” even though he acknowledged that he did not know precisely where they were.4

During the summer of 1898, Theodore Roosevelt left the government and set out for Cuba with his own personal regiment. Made up of cowboys, Native Americans, and polo-playing members of the Harvard class of 1880, Roosevelt’s Rocky Mountain Riders (known to the press as the Rough Riders) would be decimated by malaria and dysentery on the island, but their skirmishes with the Spaniards at San Juan Hill, east of Santiago, would also get their leader nominated for a congressional Medal of Honor and propel him into the highest elected political office.

Peace was restored by the Treaty of Paris, signed on December 10, 1898, a treaty that launched the United States into a hitherto unimaginable role as an explicitly imperialist power in the Caribbean and the Pacific. The treaty gave Cuba its independence, but the Platt Amendment passed by the U.S. Congress in 1901 actually made the island a satellite of the United States, while establishing an American naval base at Guantánamo Bay on Cuba’s south coast. Senator Orville Platt of Connecticut had attached an amendment to the Army Appropriations Bill, specifying the conditions under which the United States would intervene in Cuban domestic affairs. His amendment demanded that Cuba not sign any treaties that could impair its sovereignty or contract any debts that could not be repaid by normal revenues. In addition, Cuba was forced to grant the United States special privileges to intervene at any time to preserve Cuban independence or to support a government “adequate for the protection of life, property, and individual liberty.” The marines would land to exercise these self-proclaimed rights in 1906, 1912, 1917, and 1920.

In 1901, the United States forced Cuba to incorporate the Platt Amendment into its own constitution, where it remained until 1934—including an article that allowed the United States a base at Guantánamo until both sides should “agree” to its return, a stipulation the American government insisted upon on the grounds that the base was crucial to the defense of the Panama Canal. The Platt Amendment was a tremendous humiliation to all Cubans, but its acceptance was the only way they could avoid a permanent military occupation.

Even though the Canal Zone is no longer an American possession, Guantánamo Bay remains a military colony, now used as a detention camp for people seized in the U.S.-Afghan war of 2001-02 and the Iraq war of 2003. (Because Guantánamo is outside the United States, these prisoners are said to be beyond the protection of American laws, and because the Bush administration has dubbed them “unlawful combatants,” a term found nowhere in international law, it is argued that they are also not subject to the Geneva Conventions on the treatment of prisoners of war. On October 9, 2002, the U.S. government dismissed the commandant at Guantánamo, Brigadier General Rick Baccus, for being “too soft” on the inmates.)5 The United States did not directly annex Cuba in 1898, only because of its pretensions to being an anti-imperialist nation, its desire to avoid assuming Cuba’s $400 million debt as well as Cuba’s large Afro-American population, and Florida’s fears that, as a part of the country, the island might compete in agriculture and tourism.

The Paris treaty also transferred the Spanish territories of Puerto Rico and Guam to American sovereignty, where they remain to this day.* Most important, in exchange for a mere $20 million payment to Spain, the treaty awarded to the United States the entire Philippine archipelago—3,141 islands located off the coast of China and Vietnam, some 7,952 miles from Los Angeles but less than 2,000 miles from Tokyo. The payment, however modest, was important to America’s leaders, proof that they were not, as some critics charged, engaged in a “land grab” similar to those of the other new imperialist powers of the time—Germany, Russia, Italy, Belgium, and Japan—not to mention the old imperialists, Britain, France, Spain, and the Netherlands.

The Filipinos themselves proved less than eager to be “benevolently assimilated,” as President McKinley put it, and under the leadership of a nationalist patriot, Emilio Aguinaldo, who had aided Admiral Dewey in wresting control of Manila from the Spaniards, they revolted against their new American overlords. Although American troops captured Aguinaldo in 1901 and forced him to swear loyalty to the United States, the fighting went on until 1903. Whereas the Spanish-American War (Cubans call it the Spanish-Cuban-American War) cost only 385 American deaths in combat, some 4,234 American military personnel died while putting down the Filipino rebels. The army, many of its officers having gained their experience in the Indian wars, proceeded to slaughter at least 200,000 Filipinos out of a population of less than eight million. During World War II, in a second vain attempt to escape imperialist rule with the help of a rival imperialist power, Aguinaldo collaborated with the Japanese conquerors of the islands.

Exercising what the historian Stuart Creighton Miller calls its “exaggerated sense of innocence,” the United States portrayed its brutal colonization of the Filipinos as divinely ordained, racially inevitable, and economically indispensable.6 These ideas had a powerful impact on the Japanese, who were attempting both to lead an anti-Western Asian renaissance and to join the imperialists in exploiting the weaker nations of East Asia. Their emulation of other “advanced” nations in taking the imperialist route would lead ultimately to war with the United States.

One prominent American imperialist of the time, Senator Albert Beveridge of Indiana, was fond of proclaiming, “The Philippines are ours forever ... and just beyond the Philippines are China’s illimitable markets. ... The Pacific ocean is ours.” A constant theme in the congressional debate over annexation of the Philippines was that they were the “stepping-stones to China.” Beveridge believed it America’s duty to bring Christianity and civilization to “savage and senile peoples,” never mind that most Filipinos had been Catholics for centuries.7 Even opponents of annexation like Senator “Pitchfork Ben” Tillman of South Carolina argued that it was absurd to talk about teaching self-government to people “racially unfit to govern themselves.”8 At the time Tillman made his comment, the most powerful political force in the United States was New York’s Tammany Hall, not exactly a model of enlightened self-government. President McKinley called the Filipinos his “little brown brothers,” while the troops in the field sang a ditty with the line “They may be brothers of McKinley, but they sure as hell are not brothers of mine.” Such attitudes, high and low, contributed, ironically enough, to an emerging Japanese sense of racial superiority and a growing belief in their divinely ordained “manifest destiny” to liberate Asia from Western influence.

The Spanish-American War not only inaugurated an era of American imperialism but also set the United States on the path toward militarism. In traditional American political thought, large standing armies had been viewed as both unnecessary, since the United States was determined to avoid foreign wars, and a threat to liberty, because military discipline and military values were seen as incompatible with the openness of civilian life.9 In his famous Farewell Address of September 17, 1796, George Washington told his fellow Americans, “The great rule of conduct for us in regard to foreign nations is—in extending our commercial relations—to have with them as little political connection as possible.”10 To twenty-first-century ears, this pronouncement seems highly idealistic and, if perhaps appropriate to a new and powerless nation, certainly not feasible for the world’s only “superpower.” Washington’s name is still sacrosanct in the United States, but the content of his advice is routinely dismissed as “isolationism.”

Nonetheless, Washington had something quite specific in mind. He feared that the United States might develop a state apparatus, comparable to those of the autocratic states of Europe, that could displace the constitutional order. This would inevitably involve a growth in federal taxes to pay for the armies and bureaucracies of the state, a shift in political power from the constituent states of the union to the federal government, and a shift within the federal government from the preeminence of the Congress to that of the president, resulting in what we have come to call the “imperial presidency.” The surest route to these unwanted outcomes, in Washington’s mind, was foreign wars. As James Madison, the primary author of the Constitution, wrote: “Of all enemies to public liberty, war is, perhaps, the most to be dreaded, because it comprises and develops the germ of every other. War is the parent of armies; from these proceed debts and taxes; and armies, and debts, and taxes are the known instruments for bringing the many under the domination of the few.”11 The Declaration of Independence accused the English king of having “affected to render the Military independent of and superior to the Civil Power,” and the First Continental Congress condemned the use of the army to enforce the collection of taxes. These attitudes lasted about a century. With the Spanish-American War, the government began to build a military machine—and to tolerate the accompanying militarism—that by the end of the twentieth century had come to seem invincible.

During the summer of 1898, in Tampa, Florida, where American military forces had gathered for the assault on Santiago, Cuba, no single military or political authority had been in charge. Waste, confusion, and disease were rampant. Theodore Roosevelt had, in fact, exploited this disorganization to raise the Rough Riders. In 1899, President William McKinley appointed Elihu Root secretary of war, and Root, in 1903, made a signal contribution to American militarism by creating a “general staff” of senior military officers directly under him to plan and coordinate future wars. In testimony before Congress and in his annual reports as secretary of war, Root occasionally mentioned the confusion at Tampa in 1898 as evidence of the need for such an organization. But his real purpose was much broader. He argued that “the almost phenomenal success that has attended ... German (Prussian) arms during the last thirty years is due in large degree to the corps of highly trained general staff officers which the German army possesses.” He concluded: “The common experience of mankind is that the things which those general staffs do have to be done in every well-managed and well-directed army, and they have to be done by a body of men especially assigned to them. We should have such a body of men selected and organized in our own way and in accordance with our own system to do those essential things.”12

On February 14, 1903, following Root’s advice, Congress passed legislation that created the predecessor to today’s Joint Chiefs of Staff. Root could hardly have imagined that his modest contribution to military efficiency would result a century later in thousands of military officers toiling away in the Pentagon on issues of weapons, strategic planning, force structure, and, in military jargon, C4ISR (command, control, communications, computers, intelligence, surveillance, and reconnaissance). Back in 1903, a week after setting up the general staff, Root established a complementary institution of militarism, the Army War College, first located in Washington, DC, and later moved to Carlisle, Pennsylvania. In his speech at the laying of the cornerstone for the original college, Root insisted, “It is not strange that on the shore of the beautiful Potomac, in a land devoted to peace, there should arise a structure devoted to increasing the efficiency of an army for wars. The world is growing more pacific; war is condemned more widely as the years go on.... Nevertheless, selfishness, greed, jealousy, a willingness to become great through injustice, have not disappeared, and only by slow steps is man making progress. So long as greed and jealousy exist among men, so long the nation must be prepared to defend its rights.”13 In addition, as part of his modernization effort, Root brought federal standards and methods to the semi-independent state militias and renamed them the National Guard.

Perhaps Root was right that, having achieved the industrial foundations of military might, the United States needed to pay attention to the global balance of power and modify its institutions accordingly. But there is no doubt about what we lost in doing so. Washington’s warnings about the dangers of a large, permanent military establishment to American liberty would be ever more worshiped and less heeded over time, while the government came to bear an ever-vaguer resemblance to the political system outlined in the Constitution of 1787.

In 1912, Woodrow Wilson, then governor of New Jersey, former president of Princeton University, distinguished political scientist, and author of Congressional Government, one of the few genuine classics on the American political system, was elected president on the Democratic ticket. He had benefited greatly from the split in the Republican Party caused by former President Theodore Roosevelt’s attempt to return to politics. As the leader of the first Democratic administration in twenty years, Wilson single-mindedly set out to reform the corruption and inequities associated with America’s Gilded Age. He cut tariffs, imposed an income tax under the Sixteenth Amendment, created the Federal Reserve system to perform central bank functions, enacted a federal child labor law, levied the first estate tax, and inaugurated numerous other changes that moved political power in the United States irreversibly toward Washington and the presidency.

But it was in foreign policy where, for better or worse, he made the greatest innovations. Wilson began with the Mexican revolution that broke out in 1910. He could not resist interfering and backing one faction over another. This was, of course, nothing new for an American government that already had Caribbean colonies and semicolonies. It was the way he justified these acts that distinguished him from the turn-of-the-century Republican imperialists and that ultimately made him the patron saint of the “crusades” that would characterize foreign policy from intervention in the First World War through the 2003 invasion of Iraq. Woodrow Wilson was an idealist and a Christian missionary in foreign policy. He was always more concerned to do good than to be effective.

The child of a chaplain in the Confederate army, Wilson was an elder of the Presbyterian Church and a daily reader of the Bible. As one of his biographers, Arthur S. Link, observes, “He never thought about public matters, as well as private ones, without first trying to decide what faith and Christian love commanded in the circumstances.”14 Born in Virginia, Wilson was also a racist and a prude. Because of America’s republican form of government, its security behind the two oceans, and what he saw as the innate virtues of its people, Wilson strongly believed in the exceptionalism of the United States and its destiny to bring about the “ultimate peace of the world.” He did not see America’s external activities in terms of realist perspectives or a need to sustain a global balance of power. He believed instead that peace depended on the spread of democracy and that the United States had an obligation to extend its principles and democratic practices throughout the world.15

Before he was finished in Mexico, he had ordered the navy to occupy Veracruz in April 1914; provoked Francisco “Pancho” Villa’s raid of March 9, 1916, on Columbus, New Mexico; and dispatched General John J. Pershing on an unsuccessful punitive expedition deep into Mexican territory to capture Villa. Wilson publicly regarded himself as Mexico’s tutor on its form of government, a role that soured Mexican-American relations for decades. A war with Mexico was barely averted, but this heavy-handed meddling in the affairs of a neighbor disguised by a cloud of high-flown rhetoric about liberal, constitutional, and North American ideals did not go unnoticed. Japan repeatedly used the precedent, along with its own rhetoric of “liberation” from Western imperialism, to justify armed interventions in Manchuria and revolutionary China, which were on Japan’s doorstep. The United States had no cogent response—except ultimately to go to war with Japan over behavior the latter had learned from the United States.

With the outbreak of the First World War in Europe, Wilson followed George Washington’s advice and remained neutral. His position was extremely popular with the public, and in 1916 he was reelected on the campaign slogan “He Kept Us out of War.” From the outbreak of war former President Theodore Roosevelt and Elihu Root, by then a senator, had proved outspoken critics of Wilson’s insistence on neutrality. However, Wilson, when he finally did lead the country to war in 1917, turned out to be—as his Mexican adventures indicated—far more than a classic imperialist in the 1898 mold. He was, in fact, precisely the kind of president George Washington had warned against. Roosevelt and his colleagues advocated an American imperialism, modeled on British precedents, that sought power and glory for their own sakes through military conquest and colonial exploitation. Wilson, on the other hand, provided an idealistic grounding for American imperialism, what in our own time would become a “global mission” to “democratize” the world. More than any other figure, he provided the intellectual foundations for an interventionist foreign policy, expressed in humanitarian and democratic rhetoric. Wilson remains the godfather of those contemporary ideologists who justify American imperial power in terms of exporting democracy.

Popular attitudes toward Germany slowly changed, reflecting the public’s underlying pro-British sentiments and the effectiveness of Anglo-American propaganda that Germany’s submarine warfare against English shipping was “uncivilized.” The issue came to a head on May 7, 1915, when a German submarine torpedoed the British Cunard Line passenger ship Lusitania off the Irish coast. Some 128 Americans, along with several hundred citizens of other countries, lost their lives. The Germans maintained that the ship was carrying Canadian soldiers, which was not technically true (the men had not yet been inducted into the Canadian army) and that the Lusitania’s captain had deliberately failed to zigzag as prescribed by British Admiralty regulations. The German kaiser suggested that the captain had thus invited the sinking of his own vessel to inflame American opinion against Germany. The British were carrying out an equally effective blockade of German ports, but their practice was to stop offending ships and remove the passengers and crew before sinking them. The German U-boat, on the other hand, had given the Lusitania no warning. Wilson’s antiwar and anti-imperialist secretary of state, William Jennings Bryan, was inclined to be conciliatory toward Germany in order to avoid war. On June 9, 1915, however, Bryan resigned and Wilson replaced him with Robert Lansing, a professional diplomat and advocate of entering the war on the Anglo-French-Russian side.

Wilson and Lansing continued to negotiate with Germany for almost two years, trying to obtain a pledge that passenger ships would not be attacked. Instead, on January 31, 1917, Germany declared a policy of unlimited submarine warfare against all ships calling at British ports, neutral as well as belligerent. On February 3, Wilson broke diplomatic relations with Germany. He was also irritated by evidence that German agents were secretly offering to aid Mexican revolutionaries against the United States. In a war message to Congress on April 2, 1917, Woodrow Wilson declared German aggression a threat not simply to the United States but to humanity itself. Germany, he said, was waging “warfare against mankind. It is a war against all nations.” Not satisfied that the defeat of Germany was sufficient justification for American participation, he added a new, more ambitious war aim: “The world must be made safe for democracy.” America, he explained, must fight “for the rights and liberties of small nations, for a universal dominion of right by such a concert of free peoples as shall bring peace and safety to all nations and make the world itself at last free.” According to Wilson, these were purposes “we have always carried nearest to our hearts.”16 He asked for a declaration of war and got it four days later. In the year and a half still remaining in the war, some 130,274 American soldiers lost their lives on the Western Front.

On January 8, 1918, in a speech to Congress, Wilson unveiled his famous Fourteen Points, through which he intended to achieve a peace of reconciliation. The first of these points called for “open covenants openly arrived at,” but at the peace conference itself Wilson discovered that Britain, France, and Japan, all allies in the war, had negotiated a series of secret treaties among themselves transferring parts of China to Japan in return for Japanese recognition of European spheres of influence in Asia. Wilson accepted Japan’s control over a part of China in order to keep Japan in his proposed League of Nations, little realizing that the Chinese revolution was already well advanced and had begun to achieve a popular following. The Bolshevik Revolution of 1917 had inspired many Chinese and the peoples of European and American colonies in East Asia to study Marxism and Leninism and to seek the help of Soviet Russia in setting up local Communist parties. Nothing recommended Bolshevism more than the vociferous fear it seemed to elicit throughout the capitalist world.

When Wilson, however, turned down a Japanese request for an article in the Treaty of Versailles recognizing the principle of racial equality, the Japanese stiffened their positions and determined to obtain everything they could from a peace treaty. But perhaps most disruptive of future peace was the discovery by the colonized peoples of the British, French, Dutch, and American empires that the most famous of Wilson’s Fourteen Points—“self-determination for all peoples”—applied only to the defeated Austro-Hungarian and Ottoman empires, and even there only to white people. Self-determination was not being offered to the peoples of British India, or French Indochina, or the Netherlands East Indies, or the Philippines. On board Wilson’s ship bound for Europe, Secretary of State Lansing had written in his diary, “The more I think about the president’s declaration of the right of self-determination the more convinced I am that it is bound to be the basis of impossible demands on the peace conference—what misery it will cause.”17 Much of the rest of the twentieth century would be devoted to efforts by colonized peoples to achieve, through rebellion, urban insurrection, and guerrilla warfare, what Wilson had denied them in the treaty ending World War I.

These tragedies of hubris and naivete ended in personal tragedy for Wilson. On his arrival in Paris for the peace negotiations, he had declared, “We have just concluded the war to end all wars.” The League of Nations that he intended to create would, he believed, prevent future wars by acting against aggressors. But on November 19, 1919, and again on March 19, 1920, the U.S. Senate, led by Henry Cabot Lodge, declined to ratify the Treaty of Versailles as an encroachment on American sovereignty, and the United States itself never became a member of the League of Nations. Even Secretary of State Lansing had opposed the treaty, and Wilson, now semiparalyzed by a stroke, asked for his resignation. The Republicans returned to power in November 1920, and the new president, Warren G. Harding, quickly concluded a separate peace with Germany. At the end of 1920, Wilson was finally awarded the Nobel Peace Prize, but it was—even more than usual—a meaningless gesture. Marshal Ferdinand Foch of France, supreme commander of all Allied forces at war’s end, remarked of “Wilson’s” peace at Versailles, “This is not a peace treaty, it’s a twenty years armistice.”18 Foch did not live to see how precisely his prediction would be fulfilled.

With Woodrow Wilson, the intellectual foundations of American imperialism were set in place. Theodore Roosevelt and Elihu Root had represented a European-derived, militaristic vision of imperialism backed by nothing more substantial than the notion that the manifest destiny of the United States was to govern racially inferior Latin Americans and East Asians. Wilson laid over that his own hyperidealistic, sentimental, and ahistorical idea that what should be sought was a world democracy based on the American example and led by the United States. It was a political project no less ambitious and no less passionately held than the vision of world Communism launched at almost the same time by the leaders of the Bolshevik Revolution. As international-relations critic William Pfaff puts it, “[The United States was] still in the intellectual thrall of the megalomaniacal and self-righteous clergyman-president who gave to the American nation the blasphemous conviction that it, like he himself, had been created by God ‘to show the way to the nations of the world how they shall walk in the paths of liberty.’”19

If World War I generated the ideological basis for American imperialism, World War II unleashed its growing militarism. It was then, as retired Marine Colonel James Donovan has written, that the “American martial spirit grew to prominence.”20 The wars with Germany and Japan were popular, the public and the members of the armed forces knew why they were fighting, and there was comparatively little dissent over war aims. Even so, the government carefully managed the news to sustain a warlike mood. No photos of dead American soldiers were allowed to be printed in newspapers or magazines until 1943, and the Pentagon gave journalists extensive guidance on how to report the war.21

World War II saw the nation’s highest military participation ratio (MPR)—that is, percentage of people under arms—of any of America’s wars. With some 16,353,700 men and women out of a total population of 133.5 million serving in the armed forces, World War II produced an MPR of 12.2 percent. Only the MPR of the Confederate side in the Civil War was higher, at 13.1 percent, but the overall ratio for both sides in the Civil War was 11.1 percent. The lowest MPRs, both 0.4 percent, were in the Mexican (1846-48) and Spanish-American Wars, followed by the Persian Gulf War of 1991 at 1.1 percent.22 (This latter figure is, however, unreliable since a significant portion of the forces “under arms” at the time of the Gulf War were not engaged in combat or even located in the gulf region but were manning the United States’s many garrisons and ships around the world.)
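To make the measure concrete, here is a quick worked check of the figure just cited, using only the numbers given above: the MPR is simply the number serving under arms divided by the total population.

\[
\text{MPR}_{\text{WWII}} \;=\; \frac{16{,}353{,}700}{133{,}500{,}000} \;\approx\; 0.122,\ \text{or the 12.2 percent cited above.}
\]

The ratios for the other wars mentioned are computed in the same way.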

World War II produced a nation of veterans, proud of what they had achieved, respectful but not totally trusting of their military leaders, and almost uniformly supportive of the use of the atomic bombs that had brought the war to a rapid close. President Franklin Roosevelt played the role of supreme commander as no other president before or since. He once sent a memo to Secretary of State Cordell Hull saying, “Please try to address me as Commander-in-Chief, not as president.”23 Congress did not impose a Joint Committee to Conduct the War on Roosevelt, as it had on President Lincoln during the Civil War, and military institutions like the Joint Chiefs of Staff were still informal and unsupervised organizations created by and entirely responsible to the executive branch. As Colonel Donovan has observed, “With an agreed policy of unlimited war, Congress was also satisfied to abdicate its responsibilities of controlling the military establishment.... Some military leaders believed civilian control of the military was a relic of the past, with no place in the future.”24

The most illustrious of World War II’s American militarists, General Douglas MacArthur, challenged the constitutional authority of President Harry Truman during the Korean War, writing that it was “a new and heretofore unknown and dangerous concept that the members of our armed forces owe primary allegiance or loyalty to those who temporarily exercise the authority of the Executive Branch of the Government rather than to the country and its Constitution which they are sworn to defend. No proposition could be more dangerous.”25 On April 11, 1951, Truman charged MacArthur with insubordination, relieved him of his command, and forced him to retire. Truman’s action was probably the last classic assertion of the constitutional principle that the president and the civilians appointed by him control the military. During the presidencies of John F. Kennedy and Bill Clinton, in particular, the high command would often be publicly restive about the qualities of the commander in chief and come close to crossing the line of constitutional legality without actually doing so. As we shall see, during the Kennedy administration the Joint Chiefs of Staff proposed that the military secretly carry out terrorist incidents in the United States and use them as a pretext for war with Castro’s Cuba, and President Clinton was never able to regain full authority over the high command after the firestorm at the beginning of his administration over gays in the military.

After World War II, high-ranking military officers, including Generals Marshall and Eisenhower, moved into key positions in the civilian hierarchy of political power in a way unprecedented since the Civil War. George C. Marshall, the wartime chief of staff, became the country’s first secretary of state from a military background. (There have been only two others since: General Alexander Haig in the Reagan administration and General Colin Powell in the second Bush administration.) Paradoxically, General Marshall left his name on what is probably the country’s single greatest foreign policy failure, the 1946 Marshall Mission to China, which attempted to mediate between the Communists and the Nationalists in the Chinese civil war, and its single greatest success, the 1947 Marshall Plan, which helped rebuild postwar Europe economically.

But World War II, although a popular war, did not create American militarism, and had the Cold War not ensued it is reasonable to assume that traditional American opposition to standing armies and foreign wars would have forcefully reasserted itself. If there has been a growing trend toward militarism, there also remains a vein of deep suspicion of armies. The military almost totally demobilized in the years immediately after 1945 even though the draft remained in place until 1973, when an all-volunteer military came into being following almost a decade of protests against the war in Vietnam. On a pragmatic level, the public has proved ambivalent about wars because of the casualties they produce. And World War II produced the second-largest number of casualties of all America’s wars.

The Civil War, by far the bloodiest war in our history, had profoundly affected popular attitudes and generated a deep resistance on the part of the American people to sending their sons and daughters into battle. The number of combat deaths for both sides in the Civil War was 184,594, considerably less than the 292,131 American deaths in World War II. However, when one adds in the 373,458 Civil War deaths from other causes—disease, privation, and accidents, including deaths among prisoners of war—the Civil War total becomes 558,052 wartime deaths. The figures for World War II, with 115,185 deaths from other causes, total 407,316.26

World War II was not as bloody as the Civil War, except in one important measure, that of intensity of combat, which is well conveyed by the ratio of those killed in action per month.* The Civil War lasted forty-eight months and saw 3,846 killed per month, whereas World War II lasted forty-four months (for the United States) with 6,639 killed per month. It was this intensity of combat that Americans remember from World War II. It made them skeptical about future wars, particularly those in which there was no immediate threat to the United States or in which the United States had not been attacked. The legacy of World War II for the development of militarism was thus ambiguous. More Americans participated in the war effort more enthusiastically than in any other conflict, seemingly breaking the hold of traditional doubt about the value of war making. On the other hand, the country swiftly demobilized after the war and people returned to their normal peacetime pursuits.
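The intensity figures follow directly from the combat-death totals and war durations given above:

\[
\frac{184{,}594}{48\ \text{months}} \;\approx\; 3{,}846\ \text{killed per month (Civil War)}, \qquad
\frac{292{,}131}{44\ \text{months}} \;\approx\; 6{,}639\ \text{killed per month (World War II)}.
\]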

In the years immediately following World War II, the great military production machine briefly came to a halt, people were laid off, and factories were mothballed. Some aircraft manufacturers tried their hands at making aluminum canoes and mobile homes; others simply went out of business. With the onset of the Cold War, however, and the rise of a professional military class, many of the norms characteristic of wartime were reinstated, and the armaments industry went into full production. Between 1950 and 2003, the United States experienced four periods of intense military mobilization accompanied by huge spurts in weapons purchases (see graph).



The first and most significant peak in weapons purchases occurred during the Korean War (1950-53), even though only a fraction of it went for armaments to fight that war. Most of the money went into nuclear weapons development and the stocking of the massive Cold War garrisons then being built in Britain, Germany, Italy, Japan, and South Korea. Defense spending rose from about $150 billion in 1950, measured in 2002 purchasing power, to just under $500 billion in 1953. The second buildup financed the Vietnam War. Defense spending in 1968 was over $400 billion in 2002 dollars. The third boom was Ronald Reagan’s splurge, including huge investments in weapons systems like the B-2 stealth bomber and in high-tech research and development for his strategic defense initiative, funds that were largely hidden in the Pentagon’s “black budget.” Spending hit around $450 billion in 1989. The second Bush administration launched the latest binge in new weaponry, fueled in part by public reaction to the 9/11 attacks. On March 14, 2002, the House of Representatives passed a military budget of $393.8 billion, the largest increase in defense outlays in almost twenty years.27

But no less significant is what happened to the military budget between the peaks. At no moment from 1955 to 2002 did defense spending decline to pre-Cold War, much less pre-World War II, levels. Instead, the years from 1955 to 1965, 1974 to 1980, and 1995 to 2000 established the Cold War norm or baseline of military spending in the age of militarism. Real defense spending during those years averaged $281 billion per year in 2002 dollars. Defense spending even in the Clinton years, after the collapse of the Soviet Union, averaged $278 billion, almost exactly the Cold War norm. The frequent Republican charge that Clinton cut military spending is untrue. In the wake of the Reagan defense buildup, which had so ruined public finances that the United States became the world’s largest debtor nation, he simply allowed military spending to return to what had become its normal level.

From the Korean War to the first years of the twenty-first century, the institutionalization of these huge defense expenditures fundamentally altered the political economy of the United States. Defense spending at staggering levels became a normal feature of “civilian” life and all members of Congress, regardless of their political orientations, tried to attract defense contracts to their districts. Regions such as Southern California became dependent on defense expenditures, and recessions involving layoffs during the “normal” years of defense spending have been a standard feature of California’s economy. In September 2002 it was estimated that the Pentagon funneled nearly a quarter of its research and development funds to companies in California, which employed by far the largest number of defense workers in any state. Moreover, this figure is undoubtedly low because many Southern California firms, like Northrop Grumman in Century City, TRW in Redondo Beach, Lockheed Martin in Palmdale, and Raytheon in El Segundo, are engaged in secret military programs whose budgets are also secret.28

Americans are by now used to hearing their political leaders say or do anything to promote local military spending. For example, both of Washington State’s Democratic senators, Patty Murray and Maria Cantwell, as well as a Republican senator from Alaska, Ted Stevens, voted to include in the fiscal year 2003 defense budget some $30 billion to be spent over a decade to lease Boeing 767 aircraft and modify them to serve as aerial tankers for refueling combat aircraft in flight, a project not even listed by the air force in its top sixty priorities or among its procurement plans for the next six years. The bill also provided for the air force’s paying to refit the planes for civilian use and deliver them back to Boeing after the leases were up. “It is in our national interest... to keep our only commercial aircraft manufacturer healthy in tough times,” Murray commented.29 Boeing, of course, builds the planes at factories in Washington State. In 2000, Stevens, an influential member of the Senate Appropriations Committee and its Defense Appropriations Subcommittee, received a $10,000 donation to his personal reelection campaign and $1,000 for his political action committee from Boeing; in 2001, it gave him an additional $3,000. Dennis Hastert, the Speaker of the House of Representatives, so liked the provisions in the bill that he tacked on funds for the leasing of four new Boeing 737 airliners for congressional junkets. Such obvious indifference to how taxpayers’ monies are spent, bordering on corruption, no longer attracts notice. It has become a standard feature of politics.

The military-industrial complex has also become a rich source of places to “retire” for high-ranking military officers, just as many executives of defense contractors receive appointments as high-ranking officials in the Pentagon. This “circulation of elites” tends to undercut attempts at congressional oversight of either the Defense Department or defense contractors. The result is an almost total loss of accountability for public money spent on military projects of any sort. As Insight magazine journalist Kelly O’Meara has noted, in May 2001 the deputy inspector general at the Pentagon “admitted that $4.4 trillion in adjustments to the Pentagon’s books had to be cooked to compile... required financial statements and that $1.1 trillion ... was simply gone and no one can be sure of when, where or to whom the money went.”30 This amount is larger than the $855 billion in income taxes paid by Americans in fiscal 1999. The fact that no one seems to care is also evidence of militarism.

The onset of militarism is commonly marked by three broad indicators. The first is the emergence of a professional military class and the subsequent glorification of its ideals. Professionalism became an issue during the Korean War (1950-53). The goal of professionalism is to produce soldiers who will fight solely and simply because they have been ordered to do so and not because they necessarily identify with, or have any interest in, the political goals of a war. In World War II, the United States fought against two enemies, Nazi Germany and militarist Japan, that, with the aid of government propaganda, could be portrayed as genuinely evil.31

The United States did its best to depict the North Koreans, and particularly the Communist Chinese, who entered the war in late 1950, as “yellow hordes” and “blue ants,” but as James Michener’s novel The Bridges at Toko-Ri (1953) so well described, the public was much less emotionally involved than it had been during World War II. With public support slackening, the military high command turned to inculcating martial values into the troops, making that the most vital goal of all military instruction, superseding even training in the use of weapons. These values were to include loyalty, esprit de corps, tradition, male bonding, discipline, and action—generally speaking, a John Wayne view of the world. And inasmuch as conscripts constituted most of the still citizen army in those years, there was much work to do. Combat veterans of World War II tended to denigrate Wayne for his Hollywood-style machismo displayed in films like The Fighting Seabees (1944). William Manchester, the biographer of General Douglas MacArthur and himself a veteran of the war in the Pacific, recalled how, shortly after the Battle of Okinawa, wounded soldiers and marines booed Wayne, who did not serve in the military, off a stage at the Aiea Heights Naval Hospital in Hawaii when he walked out in a Texas hat, bandanna, checkered shirt, two pistols, chaps, boots, and spurs.32

The kind of professionalism the military leadership had in mind was never actually achieved during the Korean War or, for that matter, the Vietnam War primarily because the men asked to do the fighting were mostly conscripts. The inequities of conscription, combined with high levels of casualties among those unable to evade the draft, destroyed much of the pride in being a member of the armed forces. Officers understood this and devoted themselves to furthering their own careers—getting their “tickets punched,” as the phrase went. During the Vietnam years in particular, the military began to employ increasingly rapid cycles of rotation in and out of the war zone to prevent disaffection and even mutiny. Korea and Vietnam did not come close to producing the casualty levels of World War II, but because our soldiers were still fundamentally civilians and did not understand the purposes of these wars, they and their families often became disillusioned or even deeply alienated.

The Korean War had a military participation ratio of 3.8 percent, Vietnam 4.3 percent. There were 33,651 American deaths in Korea, and 47,369 in Vietnam. Nonbattle deaths for the Korean War are unknown; they number 10,799 for Vietnam. Some 2.7 million Americans served in Vietnam, of whom 304,000 were wounded in action and over 75,000 were permanently disabled by their injuries. As of Memorial Day 1996, there were 58,202 names of the dead engraved on the Vietnam War Memorial in Washington, DC. Approximately 1,300 men are still listed as missing in action.33 Both wars were intensely unpopular, and the presidency was won three times by promises to bring them to an end—Eisenhower in the Korean era, and Johnson and Nixon in the Vietnam years (though both men proceeded to expand the war once elected).

When it became apparent during Vietnam that the military draft was being administered in an inequitable manner—university students were exempted while the weight of forced military service fell disproportionately on minorities and those with insufficient means to avoid it—the government chose to abolish the draft rather than apply it equitably. Ever since, service in the armed forces has been entirely voluntary and has become a route of social mobility for those to whom other channels of advancement are often blocked, much as was the case in the former Imperial Japanese Army during the 1930s, where city dwellers were commonly deferred from conscription “for health reasons” and the military was seen as a way out of the impoverished countryside. In the U.S. Army in 1997, 41 percent of enlisted personnel were nonwhite (a subject to which I shall return).

In addition to ending the draft and so turning the military into a strictly “professional” force, Vietnam contributed to the advance of militarism, counterintuitively, exactly because the United States lost the war. This defeat, deeply disillusioning to America’s leadership elites, set off a never-concluded debate about the “lessons” to be learned from it.34 For a newly ascendant far right, Vietnam became a just war that the left wing had not had the will or courage to win. Whether they truly believed this or not, rightist political leaders came to some quite specific conclusions. As Christian Appy observes, “For Reagan and Bush [then Reagan’s vice president], the central lesson of Vietnam was not that foreign policy had to be more democratic, but the opposite: it had to become ever more the province of national security managers who operated without the close scrutiny of the media, the oversight of Congress, or accountability to an involved public.”35 The result has been the emergence of a coterie of professional militarists who classify everything they do as secret and who have been appointed to senior positions throughout the executive branch.

Not all of these militarists wear uniforms. The historian Alfred Vagts defines “civilian militarism” as the “interference and intervention of civilian leaders in fields left to the professionals by habit and tradition.” Its effects are often anything but benign. In general, civilian militarism leads “to an intensification of the horrors of warfare. [In World War II, for example,] civilians not only... anticipated war more eagerly than the professionals, but played a principal part in making combat, when it came, more absolute, more terrible than was the current military wont or habit.”36 Civilians are driven more by ideology than professionals, and when working with the military, they often feel the need to display a warrior’s culture, which they take to mean iron-fisted ruthlessness, since they are innocent of genuine combat. This effect was particularly marked in the second Iraq war of 2003, when many ideologically committed civilians staffing the Department of Defense, without the experience of military service, much less of warfare, dictated strategies, force levels, and war aims to the generals and admirals. Older, experienced senior officers denigrated them as “chicken hawks.”37 This prominent role for civilian militarists was an unintended consequence of the Vietnam War.

During Vietnam, the Joint Chiefs of Staff (JCS) often opposed the decisions of President Lyndon Johnson. They wanted a wider war than the president did, even at the risk of a nuclear war with China. As a historian of the JCS, H. R. McMaster, explains: “The president and [Secretary of Defense Robert] McNamara shifted responsibility for real planning away from the JCS to ad hoc committees composed principally of civilian analysts and attorneys, whose main goal was to obtain a consensus consistent with the president’s pursuit of the middle ground between disengagement and war. ... As American involvement in the war escalated, Johnson’s vulnerability to disaffected senior military officers increased because he was purposely deceiving the Congress and the public about the nature of the American military effort in Vietnam.”38

The old and well-institutionalized American division of labor between elected officials and military professionals who advised elected officials and then executed their policies was dismantled, never to be recreated. During the Reagan administration, an ever-burgeoning array of amateur strategists and star-wars enthusiasts came to occupy the White House and sought to place their allies in positions of authority in the Pentagon. The result was the development of a kind of military opportunism at the heart of government, with military men paying court to the pet schemes of inexperienced politicians and preparing for lucrative postretirement positions in the arms industry or military think tanks. Top military leaders began to say what they thought their political superiors wanted to hear, while covertly protecting the interests of their individual services or of their minifiefdoms within those services.39 The military establishment increasingly became a gigantic cartel, operated to benefit the four principal services—the army, navy, Marine Corps, and air force—much the way the Organization of the Petroleum Exporting Countries (OPEC) functions to maintain the profits of each of its members. Shares of the defense budget for each service have not varied by more than 2 percent over the past twenty-five years, during which time the Soviet Union collapsed and the United States fought quite varied wars in Panama, Kuwait, Haiti, Somalia, Bosnia, Kosovo, Afghanistan, and Iraq. Military needs did not dictate this stability.

During the 1990s and in the opening years of the twenty-first century, lobbyists and representatives of groups wanting to face off against nations like China that might pose future challenges to American hegemony took charge of virtually all politicomilitary policy.40 They often sought to purge the government of experts who stood in their way, and the influence of the State Department notably withered. For example, Kurt M. Campbell, former deputy assistant secretary of defense for East Asian and Pacific affairs in the Clinton administration, notes approvingly that China policy has increasingly been taken over by a new “‘strategic class’—that collection of academics, commentators and policymakers whose ideas help define the national interest.” He says that this new crop of military experts, of which he is a charter civilian member, is likely not to know much about China but instead to have “a background in strategic studies or international relations” and to be particularly watchful “for signs of China’s capacity for menace.”41 These are the attitudes not of prudent foreign policy thinkers but of militarists.

The second political hallmark of militarism is the preponderance of military officers or representatives of the arms industry in high government positions. During 2001, the administration of George W. Bush filled many of the chief American diplomatic posts with military men or militarists, including Secretary of State General Colin Powell, a former chairman of the Joint Chiefs of Staff, and the deputy secretary of state, Richard Armitage, who was undersecretary of defense in the Reagan administration. At the Pentagon, President Bush appointed Peter B. Teets, the former president and chief operating officer of Lockheed Martin Corporation, as undersecretary of the air force; former brigadier general and Enron Corporation executive Thomas E. White as secretary of the army (he resigned in April 2003); Gordon England, a vice president of General Dynamics, as secretary of the navy; and James Roche, an executive with Northrop Grumman and a retired U.S. Navy captain, as secretary of the air force.42 It should be noted that Lockheed Martin is the world’s largest arms manufacturer, selling $17.93 billion worth of military hardware in 1999. On October 26, 2001, the Pentagon awarded Lockheed Martin a $200 billion contract, the largest military contract in our history, to build the F-35 “joint-strike fighter,” an aircraft that conceivably could have been useful during the Cold War but is irrelevant to the probable military problems of the twenty-first century.

Richard Gardner, a former ambassador to Spain and Italy, estimates that, by a ratio of at least sixteen to one, the United States spends more on preparing for war than on trying to prevent it.43 During the 1990s, the United States was notoriously delinquent in paying its dues to the United Nations and at least $490 million in arrears to various multilateral development banks. By comparison, in the wake of the terrorist attacks of September 11, the United States was well on its way to annual defense budgets exceeding $400 billion.

The third hallmark of militarism is a devotion to policies in which military preparedness becomes the highest priority of the state. In his inaugural address, President George W. Bush said, “We will build our defenses beyond challenge, lest weakness invite challenge. We will confront weapons of mass destruction, so that a new century is spared new horrors.” But no nation has the capacity to challenge the United States militarily. Even as the new president spoke, the Stockholm International Peace Research Institute was compiling the 2001 edition of its authoritative SIPRI Yearbook. It shows that global military spending rose to $798 billion in 2000, an increase of 3.1 percent from the previous year. The United States accounted for 37 percent of that amount, by far the largest proportion. It was also the world’s largest arms salesman, responsible for 47 percent of all munitions transfers between 1996 and 2000. The country was thus already well prepared for war when the younger Bush came into office. Since his administration is devoted to further enlarging America’s military capabilities—a sign of militarism rather than of military preparedness—it has had to invent new threats in order to convince people that more is needed. In many ways, the terrorist attacks of 9/11 came as manna from heaven to an administration determined to ramp up military budgets.

At the beginning of the twenty-first century, the United States’s nuclear arsenal comprised 5,400 multiple-megaton warheads atop intercontinental ballistic missiles based on land and at sea; an additional 1,750 nuclear bombs and cruise missiles ready to be launched from B-2 and B-52 bombers; and a further 1,670 nuclear weapons classified as “tactical.” Not fully deployed but available are an additional 10,000 or so nuclear warheads stored in bunkers around the United States.44 One would think this might be more than enough preparedness to deter the three puny nations the president identified in early 2002 as the country’s major potential adversaries—two of which, Iran and North Korea, had been trying unsuccessfully to achieve somewhat friendlier relations with the United States. The staggering overkill in our nuclear arsenal—its ability to destroy the planet several times over—and the lack of any rational connection between nuclear means and nuclear ends is further evidence of the rise to power of a militarist mind-set.

No single war or occurrence caused American militarism. Rather, it sprang from the varied experiences of American citizens in the armed forces, ideas about war as they evolved from one war to the next, and the growth of a huge armaments industry. As the international relations theorist Ronald Steel put it at the height of the Vietnam War: “We believe we have a responsibility to defend nations everywhere against communism. This is not an imperial ambition, but it has led our country to use imperial methods—establishment of military garrisons around the globe, granting of subsidies to client governments and politicians, application of economic sanctions and even military force against recalcitrant states, and employment of a veritable army of colonial administrators working through such organizations as the State Department, the Agency for International Development, the United States Information Agency, and the Central Intelligence Agency. Having grown accustomed to our empire and having found it pleasing, we have come to take its institutions and its assumptions for granted. Indeed, this is the mark of a convinced imperial power: its advocates never question the virtues of empire, although they may dispute the way in which it is administered, and they do not for a moment doubt that it is in the best interests of those over whom it rules.”45

The habitual use of imperial methods over the space of forty years became addictive. It ultimately transformed the defense establishment into a militarist establishment and vastly enlarged the size and scope of the role played by military forces in the political and economic life of the nation.

