7
The Crisis of the American Republic
My administration has a job to do and we’re going to do it. We will rid the world of evildoers.
—PRESIDENT GEORGE W. BUSH,
September 16, 2001
The invasion of Iraq was a bandit act, an act of blatant state terrorism, demonstrating absolute contempt for the concept of international law. The invasion was an arbitrary military action inspired by a series of lies upon lies and gross manipulation of the media and therefore of the public; an act intended to consolidate American military and economic control of the Middle East masquerading—as a last resort—all other justifications having failed to justify themselves—as liberation. ... We have brought torture, cluster bombs, depleted uranium, innumerable acts of random murder, misery, degradation and death to the Iraqi people and call it “bringing freedom and democracy to the Middle East.”
—HAROLD PINTER, the 2005 Nobel Prize Lecture in Literature,
Guardian, December 7, 2005
When America is no longer a threat to the world, the world will no longer threaten us.
—HARRY BROWNE,
“What Has ‘Victory’ Achieved?”
Antiwar.com, January 11, 2002
As a goddess, Nemesis represents a warning that neither men and women nor countries can indefinitely ignore the demands of reciprocal justice and honesty. She is the spirit of retribution, a corrective to the greed and stupidity that sometimes govern relations among people. America’s most famous interpreter of ancient Greek culture, Edith Hamilton, tells us that Nemesis stands for “righteous anger.”1 If that is the case, we should welcome her arrival. For if we do not awaken soon to the wholesale betrayal of our basic political values and offer our own expression of righteous anger, the American republic will be as doomed as the Roman Republic was after the Ides of March that spring of 44 BC.
Several American presidents have been guilty of using excessive power during wartime. Abraham Lincoln suspended the right of habeas corpus; Woodrow Wilson had his “Red Scare” with the illegal jailing or deportation of people who opposed his intervention in World War I; Franklin Roosevelt conducted a pogrom against Americans of Japanese ancestry, incarcerating almost all of them in the continental United States in detention camps. In addition, there is no question that, from the earliest years of the republic to the 1990s, the United States witnessed a huge accretion of power by the executive branch, largely due to the numerous wars we fought and the concomitant growth of militarism. Nonetheless, the separation of powers, even if no longer a true balance of power, continued to serve as a check on any claims of presidential dominance.
When it comes to the deliberate dismantling of the Constitution, however, the events that followed the Supreme Court’s intervention in the election of 2000 that named George W. Bush the forty-third president have proved unprecedented. Bush has since implemented what even right-wing columnist George Will has termed a “monarchical doctrine” and launched, as left-wing commentator James Ridgeway put it, “a consistent and long-range policy to wreck constitutional government.”2 In doing so, Bush has unleashed a political crisis comparable to the one Julius Caesar posed for the Roman constitution. If the United States has neither the means nor the will to overcome this crisis, then we have entered the last days of the republic.
James Madison, the primary author of our Constitution, considered the people’s access to information the basic right upon which all other rights depend. This is the right that, from the moment George W. Bush entered the White House, his administration has most consistently attacked. Its implacable, sweeping claims to executive secrecy, which predate the “Global War on Terror,” go a long way toward explaining why the press and the public have been so passive in the face of this imperial presidency. In 1798, in a resolution in the Virginia legislature defending the First Amendment against an act that Congress had passed the previous year, Madison denounced “a power [in the law] which, more than any other, ought to produce universal alarm, because it is levelled against the right of freely examining public characters and measures, and of free communication among the people thereon, which has ever been justly deemed the only effective guardian of every other right.”3 Bush knows that if he can wrap his acts in a cloak of official secrecy, neither Congress nor the public will be able to exercise the slightest oversight.
“A popular government without popular information, or the means of acquiring it,” Madison later wrote, “is but a prologue to a farce or a tragedy, or perhaps both. Knowledge will forever govern ignorance, and a people who mean to be their own governors must arm themselves with the power which knowledge gives.”4 In theory, given our Constitution, we should not need a Freedom of Information Act. Except for keeping the most sensitive details of military or financial operations secret, and only until they have been carried out, we should enjoy easy access to information about the activities of our government. But in the late 1950s and early 1960s, Congressman John Moss (Democrat from California) became so frustrated by his inability to get accurate information out of the federal bureaucracy that he worked virtually single-handedly for years to push the Freedom of Information Act (FOIA) through Congress.
On July 4, 1966, President Lyndon Johnson signed it, expressing “a deep sense of pride that the United States is an open society in which the people’s right to know is cherished and guarded.” As Bill Moyers, Johnson’s press secretary, later reported, “Well, yes, but what few people knew at the time is that LBJ had to be dragged kicking and screaming to the signing ceremony. He hated the very idea of the Freedom of Information Act; hated the thought of journalists rummaging in government closets; hated them challenging the official view of reality. He dug in his heels and even threatened to pocket veto the bill after it reached the White House. Only the courage and political skill of a Congressman named John Moss got the bill passed at all, and that was after a twelve-year battle against his elders in Congress who blinked every time the sun shined in the dark corridors of power.”5
From the start the FOIA exempted from requests for disclosure the federal courts, the Congress (a big mistake), and parts of the Executive Office of the President that function solely to advise and assist the president. It also excluded all classified documents and nine types of information—including national security information, confidential business information, matters of personal privacy, deliberations and decisions of federal financial institutions, geological information (concerning mining and oil rights), and certain law enforcement records. The new law did not work very well. Many agencies simply failed to respond to FOIA requests and others dragged their bureaucratic feet interminably. In 1974, in the wake of revelations that President Nixon had illegally used the CIA, the FBI, and the military to spy on the American people, Congress strengthened the act considerably. Nixon had even ordered his secret gang of personal thugs—”the plumbers”—to break into the office of the psychiatrist of former Defense Department official Daniel Ellsberg seeking material with which the White House could blackmail him.6
In an attempt to force the executive branch to comply with the law, the 1974 reforms required agencies to organize their archives in a standard manner and hold them available for public scrutiny regardless of whether or not a citizen ever asked. This ended the common practice of agencies claiming that they could not provide information requested because their archives were not adequately organized to do so. Donald Rumsfeld, then President Gerald Ford’s chief of staff, and Dick Cheney, Rumsfeld’s deputy, urged him to veto the act as “unworkable and unconstitutional.” Ford did as he was told, but Congress promptly overrode the veto.7
These amendments led to a great deal of litigation in court, making the FOIA a far more formidable oversight instrument. In June 1995, while in Tokyo, I had a conversation about the FOIA with former vice president Walter Mondale, then ambassador to Japan. As a senator, he had been deeply involved in the new law’s passage. The law, he assured me, would never have worked without the power of an applicant to go to court and force the government to comply. For example, virtually all the information now publicly available on prisoner abuse, torture, and other criminal acts by military men and women and CIA operatives at Abu Ghraib, Guantanamo Bay, Bagram Air Base, and elsewhere came via FOIA requests, first denied by government agencies and only fulfilled as a result of a court order.8
The FOIA now depends almost totally on the courts for its viability, as Bush administration officials have done their best to envelop the act in a new web of secrecy and nondisclosure. The San Francisco Chronicle’s Ruth Rosen, in one of her columns, caught the crucial moment when this occurred, itself obscured by official secrecy: “The president didn’t ask the networks for television time. The attorney general didn’t hold a press conference. The media didn’t report any dramatic change in governmental policy. As a result, most Americans had no idea that one of their most precious freedoms disappeared on October 12 [2001].”9 On that day Attorney General John Ashcroft sent a memo to all federal agencies urging them to bring every excuse they could think of to bear in turning down Freedom of Information requests. He offered agency heads backing on this stance: “When you carefully consider FOIA requests and decide to withhold records, in whole or in part, you can be assured that the Department of Justice will defend your decisions unless they lack a sound legal basis.” In marked contrast, his predecessor, Janet Reno, had advised all departments and agencies that they should honor FOIA requests so long as doing so caused “no foreseeable harm.”10
The Bush administration subverted the FOIA in ways large and small. For instance, charges were raised to excessive levels for fulfilling FOIA requests even though the law stipulates that service fees should be minimal. In January 2005, the Justice Department typically informed People for the American Way, a watchdog organization critical of the government’s record on civil rights and other issues, that it would be charged $372,999 for a search of the department’s files and disclosure of 1,200 cases in which court proceedings against immigrants arrested and confined after 9/11 were conducted in secret.11 Needless to say, small grassroots organizations cannot afford such expenses.
Three weeks after Ashcroft tried to shut down FOIA, President Bush made a tone-setting decision when it came to closing off the people’s right to know. Back in 1974, at the height of the Watergate scandal, Congress seized President Nixon’s records and tape recordings because it feared that the former president planned to destroy them. (On May 2, 1972, following the death of the longtime director of the FBI, J. Edgar Hoover, his personal secretary and lover, Clyde A. Tolson, had indeed destroyed decades of official and unofficial FBI records to keep Hoover’s many illegal acts secret.) In light of these developments, in 1978, Congress passed the Presidential Records Act, making the papers of a former president federal property upon his leaving office. It required that such records be transferred to the Archivist of the United States, who was ordered to open them to the public after no more than twelve years. The intent of the law was to lessen abuses of power under the veil of secrecy, or at least to disclose them in history books.
On November 1, 2001, just as a small portion of the Reagan administration’s presidential papers was about to be opened to the public, President Bush issued Executive Order 13233 countermanding the Presidential Records Act.12 It gave him (as well as former presidents) the right to veto requests to see his presidential records. Even if a former president wants his records released—as is the case with Bill Clinton—the order states that access will be granted only at the discretion of the sitting president in consultation with the former president, if still living. It has been widely speculated that Bush’s intent was to protect his father, a former director of the CIA and Reagan’s vice president, from being implicated in the crimes committed during the Iran-Contra affair by Reagan administration officials. Throughout the Iran-Contra investigation, George H. W. Bush argued that he had been “out of the loop” and therefore not involved in the complex illegal fund-raising for and support of the Nicaraguan Contras, who were trying to overthrow the Sandinista government. Reagan’s records might have revealed just how far out of the loop he actually was.
As Thomas Blanton, executive director of the National Security Archive at George Washington University, observes, “The Presidential Records Act was designed to shift power over presidential records to the government and ultimately to the citizens. This [Executive Order] shifts the power back.”13 Historian Richard Reeves, author of President Nixon: Alone in the White House and President Kennedy: Profile of Power, comments, “Post-Nixon, presidential papers were no longer personal property. They belonged to the American people. So, now we live in a new historical reality.”14 The American Historical Association contends that Executive Order 13233 not only violated the 1978 act but functionally canceled the law by executive fiat and so “potentially threatens to undermine one of the very foundations of our nation.” We still await a Supreme Court decision on whether the president can, through an executive order, or what is called a “signing statement,” suspend or modify a law passed by Congress. So far, Bush has gotten away with it many times, and his two appointees to the court, John Roberts and Samuel Alito, are both believers in the “theory” of “unitary executive power.”
Perhaps the most serious failure of the Supreme Court in this period was its refusal even to consider whether the Bush administration had the legal standing to round up well over a thousand foreigners in the United States in the wake of 9/11 and keep all details of their cases secret, including their names and the charges, if any, against them. We do not know whether these people were illegal aliens, visitors with tourist visas, permanent residents with Green Cards, or naturalized Americans. They were simply seized, incarcerated mostly in New York prisons, beaten by guards, and, after a lengthy time in jail, deported, usually for the most minor of offenses. Kate Martin of the Center for National Security Studies comments, “We have a situation where the government arrested more than a thousand people in secret, and the courts let them get away with it. There is no accountability for the abuses, and secrecy allowed the abuses.”15 Not one of those arrested turned out to have the slightest connection to the 9/11 attacks.
The costs of such executive megalomania are high. As federal appellate judge Damon Keith wrote in his 2003 ruling against the Bush policy of holding hundreds of deportation hearings in secret, “Democracies die behind closed doors.... A government operating in the shadow of secrecy stands in complete opposition to the society envisioned by the Framers of the Constitution. When government begins closing doors, it selectively controls information rightfully belonging to the people. Selective information is misinformation.”16 The failure of the Supreme Court—and ultimately the public—to take notice of such outrages encouraged the Bush administration to assert ever more grandiose claims for its imperial presidency. According to New York University law professor Noah Feldman, “These claims add up to what is easily the most aggressive formulation of presidential power in our history.”17
For some thirty years, a few Republican politicians from the Ford, Reagan, and Bush père administrations—including former president George H. W. Bush himself (and through him his son George W.), his secretary of defense, Dick Cheney, and Ford’s secretary of defense, Donald Rumsfeld—have nursed grievances about the way Congress exposed illegal activities in the wake of Watergate, Vietnam, and Iran-Contra. They have never gotten over the public’s demand that presidents should no longer go to war based on lies to Congress, such as the Vietnam-era Tonkin Gulf Resolution; that the CIA and the American military should be stopped from assassinating foreign leaders, such as President Ngo Dinh Diem of South Vietnam in 1963, and overthrowing governments that have done nothing to the United States, as they did in Chile in 1973; and that congressional oversight of our often incompetent and always deceitful intelligence agencies was long overdue.
Over the years, Dick Cheney has inveighed against President Ford’s Executive Order 11905 of February 18, 1976, which stipulated that “No employee of the United States Government shall engage in, or conspire to engage in, political assassination”; the War Powers Act of 1973, which requires that the president obtain congressional approval within ninety days of ordering troops into combat; the congressional Budget Control and Impoundment Act of 1974, which was designed to stop Nixon and any other president from impounding congressionally mandated funds for programs they do not like; the Freedom of Information Act of 1966, which Congress strengthened in 1974; and the Intelligence Oversight Act of 1980, which set up the House and Senate select committees on intelligence. Similarly, in March 2005, former president George H. W. Bush, who headed the CIA from 1975 to 1977, spluttered at a conference on counterintelligence: “It burns me up to see the agency under fire.” He was even more incensed that Congress had “unleashed a bunch of untutored little jerks” to investigate the CIA’s involvement in domestic spying, assassinations, and other illegal activities and subsequently passed laws to prevent their recurrence.18 Those “untutored little jerks” were the members of the Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities, chaired by Senator Frank Church, Democrat from Idaho, which issued its final report in 1976.
In January 2002, in an interview with ABC News, Cheney argued, “In thirty-four years, I have repeatedly seen an erosion of the powers and the ability of the president of the United States to do his job. One of the things that I feel an obligation on—and I know the president does too—is to pass on our offices in better shape than we found them.”19 But all of the legislation passed in the 1970s represented attempts to deal with crimes committed by government officials. Nonetheless, no president after Nixon has ever acknowledged the legitimacy of the War Powers Act, and most of these “limitations” on presidential power had been gutted, ignored, or violated long before Cheney became vice president. Bruce Fein, a constitutional scholar and former Reagan administration lawyer, calls them “museum pieces.”20 There is simply no evidence that, since the 1970s, there has been any real reduction in the powers of the presidency or that the Bush-Cheney government ever behaved as if it thought there were. “The vice president,” noted Republican senator John E. Sununu, “may be the only person I know of who believes the executive has somehow lost power over the last thirty years.”21
In pursuit of yet more power, Bush and Cheney have unilaterally authorized preventive war against nations they designate as needing “regime change,” directed American soldiers to torture persons seized and imprisoned in various countries, ordered the National Security Agency to carry out illegal “data mining” surveillance of the American people, and done everything they could to prevent Congress from outlawing “cruel, inhumane, or degrading” treatment of people detained by the United States (acts that were, in any case, already illegal under both U.S. law and international agreements the United States had long ago signed and ratified). They have done these things in accordance with something they call the “unitary executive theory of the presidency.”
This “theory” is, in fact, simply a bald-faced assertion of presidential supremacy in all matters relating to foreign affairs dressed up in legalistic mumbo jumbo. Its classic expression is contained in the August 1, 2002, “torture memo” conceived and written by a group of ultraconservative lawyers in the White House, Justice Department, and Vice President’s office. Among them are John Yoo, a young, right-wing Korean-American scholar and a former law clerk for Supreme Court Justice Clarence Thomas, who served as a lawyer in the Justice Department’s Office of Legal Counsel; Alberto Gonzales, then the White House’s legal counsel; and David S. Addington, a former lawyer for the CIA, the Pentagon’s general counsel when Cheney was secretary of defense, and then chief of staff in Cheney’s office.22
The torture memo justified its extreme views by claiming that the commander-in-chief power even overrides U.S. laws: “In light of the president’s complete authority over the conduct of war, without a clear statement otherwise, criminal statutes are not read as infringing on the president’s ultimate authority in these areas.” Ratified treaties, congressionally enacted statutes, and military orders prohibiting torture “must be construed as inapplicable to interrogations undertaken pursuant to his commander-in-chief authority.... Congress may no more regulate the president’s ability to detain and interrogate enemy combatants than it may regulate his ability to direct troop movements on the battlefield.” The same principle holds for “federal officials acting pursuant to the president’s constitutional authority.... The Framers understood the [commander-in-chief] clause as investing the president with the fullest range of power,” including “the conduct of warfare and the defense of the nation unless expressly assigned in the Constitution to Congress.” That “sweeping grant” of power, the memo continued, is given because “national security decisions require the unity in purpose and energy in action that characterize the presidency rather than Congress.”23
Yoo and company have concocted something that looks very much like an American version of the Chinese Communists’ “Two Whatevers.” These were the basic principles that prevailed during the years when the cult of Mao Zedong was ascendant: “We will resolutely uphold whatever policy decisions Chairman Mao makes; and we will unswervingly follow whatever instructions Chairman Mao gives.” Substitute Bush for Mao and you get the idea. Time magazine contends that, according to the White House and the Justice Department, “The Commander in Chief’s pursuit of national security cannot be constrained by any laws passed by Congress, even when he is acting against U.S. citizens.”24 Bruce Schneier, author of Beyond Fear: Thinking Sensibly About Security in an Uncertain World, sees an even more ominous development: “The president can define war however he chooses, and remain ‘at war’ for as long as he chooses. This is indefinite dictatorial power. And I don’t use that term lightly; the very definition of a dictatorship is a system that puts a ruler above the law.”25 The implications for the constitutional separation of powers are thus grave, particularly since the unitary executive theory flies in the face of the Constitution itself.
As Dan Farber, a professor of law at the University of California, Berkeley, and author of Lincoln’s Constitution, reminds us, “Constitutional law derives from the language of the Constitution, the original understanding, and two centuries of Supreme Court precedent. Often, these three are ambiguous or contradict each other, but not here. All three make it clear that the president must share power with Congress and the courts, in war as well as in peace.”26 Article 2 stresses without qualification that the president “shall take care that the laws be faithfully executed.” Many famous Supreme Court justices have emphasized, as Justices Felix Frankfurter and Hugo Black did in 1952, “The power to execute the laws starts and ends with the laws Congress has enacted.” The Constitution explicitly gives Congress the power to declare war, to raise and support armies, to equip the navy, to call out the militia (today, the National Guard), and to “make rules for the Government and Regulation of the land and naval forces.”
Perhaps the closest thing to malpractice in Yoo’s theory is his failure to mention the most important legal precedent defining the balance of power between Congress and the president during wartime: the 1952 case Youngstown Sheet and Tube Company v. Sawyer.27 During the Korean War, faced with the possibility of a strike that threatened to shut down the steel industry, President Harry Truman ordered the Department of Commerce to seize all steel plants and suspend the labor laws. The Supreme Court promptly declared that the president’s commander-in-chief powers did not extend to areas in which Congress had passed legislation—in this case, the Taft-Hartley Act of 1947, which regulated strikes—and that he had exceeded his authority.
Concurring in the judgment and the opinion of the court, Justice Robert H. Jackson wrote, “[T]he Constitution did not contemplate that the title Commander-in-Chief of the Army and Navy will constitute [the president] also Commander-in-Chief of the country, its industries, and its inhabitants. He has no monopoly of ‘war powers,’ whatever they are.... His command power is not such an absolute as might be implied from that office in a militaristic system but is subject to limitations consistent with a constitutional Republic whose law and policy-making branch is a representative Congress. The purpose of lodging dual titles in one man was to insure that the civilian would control the military, not to enable the military to subordinate the presidential office. No penance would ever expiate the sin against free government of holding that a president can escape control of executive powers by law through assuming his military role.” In the Youngstown case, both Justices Robert Jackson and Frankfurter, in their concurring opinions, quoted Justice Louis Brandeis’s dissent in the 1926 case Myers v. United States: “The doctrine of the separation of powers was adopted by the Convention of 1787 not to promote efficiency but to preclude the exercise of arbitrary power. The purpose was, not to avoid friction, but by means of the inevitable friction incident to the distribution of the governmental powers among three departments, to save the people from autocracy.”
Among the many instances in which George W. Bush has ignored his oath of office—”I will faithfully execute the office of President of the United States, and will to the best of my ability, preserve, protect, and defend the Constitution of the United States”—perhaps the most blatant has been the way he secretly authorized the National Security Agency (NSA), the country’s leading cryptological and signals intelligence agency, to eavesdrop on Americans without a court-approved warrant. Such warrants are required by the Fourth Amendment to the Constitution and by the Foreign Intelligence Surveillance Act (FISA), which President Jimmy Carter signed into law on October 25, 1978.28 Except in terms of a raw expansion of basic presidential powers, it is close to inexplicable why Bush chose to ignore the FISA law, since it would have readily facilitated virtually anything he wanted to do in the way of wiretapping. Enacted in the wake of revelations that the federal government had routinely, if illegally, tapped the telephones of people who opposed the war in Vietnam, the FISA law was anything but a strong reaffirmation of the prohibition against unreasonable searches and seizures in the Bill of Rights.
As its title indicates, the Foreign Intelligence Surveillance Act allows the FBI and the NSA to listen in on American citizens in order to collect intelligence, and it set up a secret court to issue warrants based on requests from the intelligence community. From its inception in 1979 through 2004, the FISA court issued 18,742 secret warrants while denying only four government requests.29 The court was originally made up of seven federal judges appointed by the chief justice of the Supreme Court; the USA Patriot Act of 2001 expanded that number to eleven. The judges’ identities are secret. They meet in total privacy behind a cipher-locked door in a windowless, bugproof, vaultlike room guarded twenty-four hours a day on the top floor of the Justice Department’s building in Washington, D.C. Everything they do is “top secret.”
The judges hear only the government’s side. The court makes annual reports to Congress, normally just two paragraphs long, that give only the total number of warrants it has approved. Beyond that, there is no congressional oversight of the court’s activities whatsoever. The law even allows emergency taps and searches for which a warrant can be issued retroactively if the government notifies the court within seventy-two hours. Compared with ordinary wiretaps, for which the government must provide a federal district court judge with evidence of “probable cause” that the person or persons under investigation are likely to commit a crime, the FISA process is weighted toward the government, not the citizen, and not surprisingly the secret court has authorized more warrants than all federal district judges combined.30
Nonetheless, immediately following 9/11, the president issued a secret executive order authorizing the National Security Agency to tap at will into the private communications of American citizens. Unknown bureaucrats at the NSA make the decisions about who is to be tapped without any supervision by a court or elected representatives of the people. When newspaper reporters got wind of what the president had done, the White House intervened to try to keep the information secret. On national security grounds, the New York Times was asked to sit for more than a year on the story of how the NSA was violating the law. Finally, on December 6, 2005, when publication was imminent, President Bush summoned the Times’s publisher, Arthur Sulzberger Jr., and executive editor Bill Keller to the Oval Office and asked them to desist in the name of national security, the war on terror, and 9/11. But the president was unable to offer any sound legal basis for what he had done or any reason why the cover-up should continue. On December 16, 2005, a year late in terms of the public’s right to know, the New York Times finally printed the story.31 On December 20, one of the hitherto unknown FISA court judges, James Robertson, resigned in protest, a totally unprecedented action.
There is no obvious reason beyond trying to obtain pure power why the president chose to ignore FISA and go directly against an act of Congress. The syndicated columnist Paul Craig Roberts has speculated that Bush could not ask for warrants for the kinds of spying he wanted done because he had no legitimate reasons to offer even the lenient FISA court. Roberts suggests that he might have been using the spy apparatus of the U.S. government to influence the outcome of the 2004 presidential election or that he might have been collecting information on his Democratic Party opponents in order to blackmail them.32 Former senior adviser to President Clinton and Washington bureau chief of Salon.com Sidney Blumenthal believes the administration simply had no probable cause for the NSA surveillance. The court, after all, must adhere to the law and cannot simply authorize surveillance because the president or an intelligence agency wants to eavesdrop on someone. It is also possible that the administration wanted to avoid the FISA court because what evidence it had supporting probable cause had been obtained by torture, which conceivably might cause the court to reject an application (although these days no one should count on it).33
Intelligence expert Thomas Powers, author of Intelligence Wars: American Secret History from Hitler to Al-Qaeda, has another theory entirely. He believes that the issue was not specific surveillance but the administration’s desire to use the NSA to keep alive an ambitious Pentagon data-mining project called Total Information Awareness (TIA) after Congress (and the public) expressed outrage over its existence and in September 2003 ordered it stopped. TIA was the brainchild of John Poindexter, a former admiral and Ronald Reagan’s national security adviser, who was convicted of seven felonies for his part in the Iran-Contra affair but was exonerated on appeal. A computer fanatic’s ideal of “data mining,” TIA, as Poindexter imagined it, was to compile everything that could be known about a vast range of individuals and then comb through such mountains of data for correlations that the government might find suspect. One of TIA’s key collaborators was the National Security Agency, which supplied much of the data that went into its individual profiles.34
On November 14, 2002, the New York Times’s conservative columnist William Safire outlined the kind of data TIA sought: “Every purchase you make with a credit card, every magazine subscription you buy and medical prescription you fill, every web site you visit and e-mail you send or receive, every academic grade you receive, every bank deposit you make, every trip you book, and every event you attend—all these transactions and communications will go into what the Defense Department describes as a ‘virtual, centralized grand database.’”35 Add to that all government information—passport applications, drivers’ licenses, judicial and divorce records, IRS files, complaints by nosy neighbors, plus the latest hidden camera surveillance—and one has the perfect American computer version of Gestapo or KGB files.
There is growing evidence that in 2003 the TIA project was stopped in name only. The National Security Agency continued snooping and collecting data as before, while the analytical work was transferred to a new, totally secret agency inside the Pentagon known as the Counterintelligence Field Activity (CIFA). Its original specialty was illegally watching, photographing, and harassing peaceful public protests outside foreign and domestic military bases. According to Walter Pincus of the Washington Post, CIFA has “grown from an agency that coordinated policy and oversaw the counterintelligence activities of units within the military services and Pentagon agencies to an analytic and operational organization with nine directorates and ever-widening authority.” It has become known as “the superpower of data mining within the U.S. national security community.... Since March 2004, CIFA has awarded at least $33 million in contracts to corporate giants Lockheed Martin, Unisys Corporation, Computer Sciences Corporation, and Northrop Grumman to develop databases that comb through classified and unclassified government data, commercial information, and Internet chatter to help sniff out terrorists, saboteurs, and spies.”36
In 2005, CIFA reportedly “contracted with Computer Sciences Corp. to buy identity-masking software, which could allow it to create fake Web sites and monitor legitimate U.S. sites without leaving clues that it had been there.” A former senior Pentagon official familiar with CIFA told Pincus, “They started with force protection from terrorists, but when you go down that road, you soon are into everything ... where terrorists get their money, who they see, who they deal with.” Because the National Security Agency is a major source of CIFA’s data, that may have been one reason why Bush ordered the NSA to engage in surveillance of citizens completely outside the purview of the FISA court, which probably would not have approved open-ended data mining.37
One further way in which President Bush has shown his contempt for the Constitution is his use of what are called “signing statements.” During the first six years of his presidency, Bush did not exercise his constitutionally authorized veto over a single piece of legislation passed by Congress, but in his first term alone, he issued 505 extraconstitutional challenges to various provisions of legislation that had been enacted by Congress.38 Through “interpretive” statements issued at the time he signs them, the president disagrees with one or more provisions contained in the legislation and therefore reserves the right not to implement them. According to David Golove, a New York University law professor, “The signing statement is saying ‘I will only comply with this law when I want to, and if something arises in the war on terrorism where I think it’s important to torture or engage in cruel, inhuman, and degrading conduct, I have the authority to do so and nothing in this law is going to stop me.’”39
Many of these statements amount to illegal line-item vetoes. They often have the effect of nullifying legislation that has been passed by both houses of Congress and signed by the president. In 1998, in Clinton v. New York, the Supreme Court held that a line-item veto is unconstitutional because it violates “the Constitution’s Presentment Clause. That Clause says that after a bill has passed both houses, but ‘before it becomes a law,’ it must be presented to the president, who ‘shall sign it’ if he approves, but ‘return it’—that is, veto the bill, in its entirety—if he does not.”40 Bush’s signing statements eliminate the possibility of the Congress overriding his veto since they take effect (whatever that might mean) after the bill has already become law, and they violate the first sentence of the Constitution’s first article: “All legislative powers herein granted” belong to Congress. As the framers carefully explained, this means only the “Senate and House of Representatives”—not the president in the act of signing a bill into law.41
One of the most striking examples of the legal quagmire created by these signing statements lies in the 2006 Defense Appropriation Bill. On the initiative of Republican senator John McCain, who was himself tortured while a prisoner of war in Vietnam, the Senate added an amendment to the defense-spending authorization and called it the Detainee Treatment Act of 2005. It reads, “No individual in the custody or under the physical control of the United States government, regardless of nationality or physical location, shall be subject to cruel, inhuman, or degrading treatment or punishment,” and it provides for “uniform standards” of interrogation. President Bush threatened to exercise his first veto over the whole Pentagon budget because of this amendment. Then he and Vice President Cheney lobbied Congress intensively in order to retain the Pentagon’s and the CIA’s “right” to the secret use of torture (although never termed torture, of course) without fear of domestic prosecution. When the Senate responded by passing McCain’s torture ban by a veto-proof vote of 90-9, the White House turned to extralegal means to get what it wanted.42
On December 15, 2005, in a photo session at the White House, President Bush and Senator McCain shook hands and Bush announced that this landmark legislation would make it “clear to the world that this government does not torture.” However, on Friday evening, December 30, when he actually signed the bill at his Crawford, Texas, ranch, Bush added a signing statement that essentially gutted McCain’s amendment. It said that he would construe the new law “in a manner consistent with the constitutional authority of the president,” that he would order whatever he deemed necessary in his war on terror, and that, as president “in a time of war,” he was beyond any legal constraints. Elisa Massimino, the Washington director of Human Rights First, commented that “[t]he basic civics lesson that there are three coequal branches of government that provide checks and balances on each other is being fundamentally rejected by the executive branch.”43
It is not clear how this muddled situation will ultimately be resolved, but its immediate costs are high. A former army interrogator at Abu Ghraib prison writes, “Those who serve in the prisons of Iraq deserve to know clearly the difference between legal and illegal orders. Soldiers on the ground need a commander in chief who does not seek strained legalisms that ‘permit’ the use of torture.... No slope is more slippery, I learned in Iraq, than the one that leads to torture.”44 As of mid-2006, none of President Bush’s signing statements had been tested in court.
Moreover, it is not just the executive branch that has been tearing at the fabric of the Constitution. Through its partisanship, complacency, and corruption, Congress has done much to ensure that the crisis of the American republic will be fatal to democratic government. As constitutional specialist Noah Feldman writes, “For the last four years, a Republican Congress has done almost nothing to rein in the expansion of presidential power. This abdication of responsibility has been even more remarkable than the president’s assumption of new powers.”45 Al Gore, who served eight years in the House, eight years in the Senate, and presided over the Senate for eight years as vice president, observes, “The sharp decline of congressional power and autonomy in recent years has been almost as shocking as the efforts by the executive branch to attain a massive expansion of power.... Moreover, in the Congress as a whole—both House and Senate—the enhanced role of money in the re-election process, coupled with the diminished role for reasoned deliberation and debate, has produced an atmosphere conducive to pervasive institutionalized corruption.... It is the pitiful state of our legislative branch that primarily explains the failure of our vaunted checks and balances to prevent the dangerous overreach of the executive branch, which now threatens a radical transformation of the American system.”46
I happen to be a registered voter in the Fiftieth Congressional District of California in northern San Diego county, where, in early 2006, our Republican representative for the previous fourteen years, Randy “Duke” Cunningham, received the longest sentence to a federal prison—eight years and four months—ever imposed on a member of Congress. Cunningham, a decorated Vietnam War pilot, confessed to pocketing $2.4 million, the largest bribe ever paid to a member of Congress. He had used his official positions on the Appropriations and Intelligence Committees to see that contracts worth millions of dollars went to defense manufacturers who had paid him off, and he did this primarily by adding classified earmarks to the Defense Appropriations bills and pressuring Pentagon officials to buy things they had made clear they did not want. The term “earmarks” is congressional jargon for spending by a lone representative, who surreptitiously tacks expenditures onto a larger appropriations bill that the House then passes without further scrutiny.
Well before the bribery charges were filed, I described Cunningham in the press as totally bought and paid for by the military-industrial complex.47 However, I did so on the basis of published campaign contributions. It did not occur to me that, in selling his vote to munitions makers, as so many other members of Congress have done—including Cunningham’s friend, neighbor in California’s Fifty-second District, and chairman of the House Armed Services Committee, Republican representative Duncan Hunter—he was so stupid as to have actually accepted material bribes for his corrupt acts. If a member of Congress can claim there was no quid pro quo involved in accepting money from strangers, it is technically legal. Most members who want to line their pockets are content to wait and do so as lobbyists after retiring or being defeated. According to the Center for Responsive Politics, Hunter and Cunningham rank second and third among all members of Congress (first is Pennsylvania Democratic representative John P. Murtha) in terms of the total amount of money they have received from the defense industry.48
In buying Cunningham’s influence, two San Diego-based defense contractors, Mitchell Wade, CEO of MZM Inc., which among other things provided Arabic translators for Abu Ghraib prison in Baghdad, and Brent Wilkes, CEO of ADCS Inc., supplied Cunningham with cash, a down payment and mortgage payments on a 7,628-square-foot mansion in an exclusive San Diego enclave, Persian rugs, antique French armoires, two yachts, a Rolls-Royce, and a college graduation party for his daughter.49 In return, Cunningham arranged for $163 million in Pentagon contracts for MZM, which had not done much business with the Defense Department until Wade met him, and more than $90 million for Wilkes’s company for converting old documents into computer-readable files. (Wilkes wanted to digitize the century-old archives dealing with the building of the Panama Canal, not exactly vital to the Global War on Terror and something Pentagon officials repeatedly insisted they did not need.) Senior correspondent for the American Prospect Laura Rozen notes, “Duncan Hunter [was] identified by a Defense Department Inspector General report—along with Cunningham—as actively intervening with the Pentagon to try to award a contract to a document-conversion company that had given him tens of thousands of dollars in campaign contributions for a program the Pentagon did not request or consider a priority.”50 Wilkes’s technology was imported from Germany.
It is important to stress that the distinction in Congress between a bribe and a legal donation is a bit of sophistry intended to conceal the routine corruption of our elected representatives. As Bill Moyers has put it, “If [in baseball] a player sliding into home plate reached into his pocket and handed the umpire $1000 before he made the call, what would we call that? A bribe. And if a lawyer handed a judge $1000 before he issued a ruling, what do we call that? A bribe. But when a lobbyist or CEO [chief executive officer of a corporation] sidles up to a member of Congress at a fund-raiser or in a skybox and hands him a check for $1000, what do we call that? A campaign contribution.”51
Brent Wilkes was more experienced at buying influence than Wade. He supplied private jet flights for House majority leader Tom DeLay and Republican Roy Blunt and became a “pioneer” in the Bush-Cheney 2004 reelection campaign by raising $100,000. From 1995 to 2005, Wilkes and his associates gave more than $840,000 to at least thirty-two congressional campaigns or their political action committees.52 According to the Federal Election Commission, the recipients included Representative John Doolittle (Republican from California), total $82,000; Representative Randy Cunningham, $76,500; Representative Jerry Lewis (Republican from California), $60,000; Representative Tom DeLay (Republican from Texas), $57,000; Representative Duncan Hunter, $39,200; Senator Larry Craig (Republican from Idaho), $29,000; Representative Jerry Weller (Republican from Illinois), $27,500; Representative Benjamin Gilman (Republican from New York), $25,843; Representative Roy Blunt (Republican from Missouri), $17,000; and Senator Lindsey Graham (Republican from South Carolina), $14,000.53
The culprits are not just Republicans. Consider the actions of the senators from Florida in 2006. In the 2006 federal budget, Republican senator Mel Martinez earmarked defense appropriations for Florida contractors worth $316 million. Since 2003, companies that received defense contracts made $33,000 worth of campaign contributions to Martinez. Democratic senator Bill Nelson, a member of the Armed Services Committee, obtained $916 million for defense projects, about two-thirds of which went to the Florida-based plants of Boeing, Honeywell, General Dynamics, Armor Holdings, and other munitions makers. Since 2003, Nelson has received $108,750 from thirteen companies for which he arranged contracts.54
Under such circumstances, it is still possible to imagine that some congressional votes in areas where money is flowing are not being influenced by campaign contributions, but only if the members are independently wealthy, and even then it is highly unlikely. There are, in addition, other ways to influence Congress, particularly through lobbying. The numbers of lobbyists, the amounts of money involved in lobbying, and the ties between the lobbying industry, the dominant Republicans in Congress, and the White House have all exploded in the Bush years. “Since Bush was elected,” according to Bill Moyers, “the number of lobbyists registered to do business in Washington has more than doubled. That’s 16,342 lobbyists in 2000 to 34,785 [in 2005]. Sixty-five lobbyists for every member of Congress.”55 In September 2005, Tom DeLay was forced to resign as majority leader of the House when he was indicted for channeling corporate contributions to politicians in Texas. He was the chief conduit of master lobbyist Jack Abramoff, who in January 2006 confessed to cheating his clients while spending lavishly on congressional junkets, meals, and campaign contributions. Some twenty-nine former staff members of DeLay’s congressional office have left government service to accept positions as lobbyists in major Washington law firms, the largest number working for any member of Congress.
Typical of the DeLay-Abramoff operations was their lobbying for the Commonwealth of the Northern Mariana Islands. After World War II, these specks of land in the Pacific 5,625 miles west of San Francisco—the largest of which is the island of Saipan—became a United Nations trust territory, administered by the United States Department of the Interior. Under a scheme to make Saipan a sweatshop, the Interior Department exempted the islands from U.S. labor and immigration laws. There is no minimum wage on Saipan. Tens of thousands of Chinese women live in dormitories with no basic political rights; they are prohibited from marrying and are paid almost nothing. They work producing clothes with “Made in the USA” labels for companies like Levi Strauss & Co., the Gap, Eddie Bauer, Reebok, Polo, Nordstrom, Lord & Taylor, and Liz Claiborne, which are then shipped duty-free to the United States. The sweatshop operators, the biggest of whom are naturalized U.S. citizens of Chinese ancestry, paid Abramoff nearly $10 million, part of which he used to book congressmen and their “significant others” into luxury hotels and exclusive golf courses on Saipan, to ensure that Congress did not pass a minimum-wage law for the islands. Abramoff took DeLay and his wife there, and the congressman was moved to declare that the Marianas “represented what is best about America,” calling them “my Galapagos.”56 Other major clients of the Abramoff-DeLay lobbying duo include gambling casinos on Indian reservations, Russian oil and gas interests, and the U.S. Family Network.
The mainstream press regularly refers to members of Congress as “lawmakers,” but that phrase bears little relationship to what they actually do. An excellent example is the Foreign Operations bill for fiscal year 2005. At the time of passage, according to Los Angeles Times correspondent Ken Silverstein, it was “the biggest single piece of pork-barrel legislation in American history.”57 On November 17, 2004, a small group of senators and representatives from their respective appropriations committees folded into the bill funds for the Departments of Justice, State, Energy, Labor, Commerce, Education, Agriculture, Transportation, the Treasury, Interior, Veterans Affairs, Health and Human Services, and Housing and Urban Development, as well as the running expenses for the entire legislative and judicial branches. Around 12:15 a.m. on November 20, 2004, staff members, working frantically, made the 3,320-page bill available to “legislators” on the Web site of the House Rules Committee. The House put it to a vote at approximately 4:00 p.m. that same day, and the Senate followed suit at 8:42 p.m. The legislation passed the House by a margin of 344 to 51 and the Senate by 65 to 30. It would have been a physical impossibility for any member to have read the entire piece of legislation in the time available, much less thought about what it involved. The bill included 11,772 separate earmarks worth a combined total of nearly $16 billion. Silverstein observes, “Of who added these grants, no public record exists.”58
Earmarking of defense spending has more than tripled since fiscal year 1995, and the Department of Defense’s black budget, which is secret from all citizens and virtually all members of Congress, was estimated at the end of 2005 at $28 billion per year: $14.2 billion for purchases of hardware and $13.7 billion in so-called research and development expenditures. According to Citizens Against Government Waste, in 1995 Congress approved 1,439 earmarked appropriations; in 2005, the number had risen to 13,998. Gordon Adams, director of security studies at George Washington University and a former White House budget director for national security, notes that members of such influential congressional committees as Intelligence and the Defense Subcommittees of the House and Senate Appropriations Committees “have a lot of power ... and are sitting in a place where a lot of money flows.... There are huge opportunities here for politicians to tweak the system to their advantage. The smell of corruption is in the air.”59
Franklin Spinney, for thirty years a budget analyst in the Pentagon, said in a discussion with Bill Moyers, “The military-industrial-Congressional complex is a political economy with a big P and a little E. It’s very political in nature. Economic decisions, which should prevail in a normal market system, don’t prevail in the Pentagon, or in the military-industrial complex. So what we have is a system that essentially rewards its senior players.... We have a term for it, it’s a self-licking ice cream cone.”60 Moyers pointed out that pay for chief executive officers at Lockheed Martin went up from $5.8 million in 2000 to $25.3 million in 2002, at General Dynamics from $5.7 million in 2001 to $15.2 million in 2002, and at Northrop Grumman from $7.3 million in 2000 to $9.2 million in 2002.
Spinney explained that a main lobbying strategy of the military-industrial complex is to emphasize to members of Congress how many jobs are dependent on a particular contract being approved, rather than the usefulness or feasibility of a weapon. Lobbyists’ letters and presentations to members of Congress always include maps showing precisely the communities that will be enriched by Pentagon spending and the funds they will receive.
Coming at it from a somewhat different political perspective is Winslow Wheeler, from 1996 to 2002 the senior analyst for national security on the Republican staff of the Senate Budget Committee and before that an aide to Senators Pete Domenici (Republican from New Mexico), Jacob Javits (Republican from New York), and Nancy Kassebaum (Republican from Kansas). After thirty years working on Capitol Hill, Wheeler retired and devoted himself to revealing the “systemic problems that reduce government to an exploitative system and make it possible for special interests to manipulate it at will.”61 He has documented how senators added $4 billion in useless “pork” projects to benefit their own states immediately after the 9/11 attacks, including Senator Robert Byrd (Democrat from West Virginia), who strongly opposed going to war against Iraq but nonetheless asked for funds to build an army museum in his home state. Senator Ted Stevens (Republican from Alaska), one of the stalwarts of the missile defense lobby because most of the ground-based interceptors are located in silos in his state, asked for post-9/11 funds to build parking garages (for automobiles, not missiles).
Wheeler’s major study, Wastrels of Defense: How Congress Sabotages U.S. Security, was not brought out by a leftist or liberal publisher but by the Naval Institute Press.62 He draws on his own experience to explain how dependent most members of Congress are on their staffs and how most staff officials spend their time inserting earmarks and add-ons to defense bills rather than actually trying to determine how the money of the people of the United States should be spent to achieve security. Between fiscal years 2001 and 2002, just as Wheeler’s career in the Senate was coming to an end, so-called add-ons—that is, unrequested spending for the Pentagon—jumped from $3.3 billion to $5.4 billion, not including, in 2002, $583 million for thirty-two projects added at the last minute by the House-Senate Conference Committee for items neither requested by the Pentagon nor included in the House or Senate bills.63 Wheeler presents numerous examples of how pork projects inserted into legislation to favor special interests undermined or took the place of serious defense projects. He notes that whereas defense appropriations bills in the 1980s might have had as many as two or three hundred pork items, in 2005 or 2006 a bill contains thousands. His argument is that after more than two centuries, the system of checks and balances built into our government by the Constitution no longer works.
If the corruption of the legislative branch were not enough to scuttle the separation of powers, the Congress regularly goes out of its way to bow down to the president. After the press revealed that the National Security Agency was illegally eavesdropping on the private conversations of American citizens and that President Bush had trashed the Foreign Intelligence Surveillance Act, the majority leadership in Congress introduced legislation that, in essence, would have retroactively forgiven him. As the New York Times editorialized, “Imagine being stopped for speeding and having the local legislature raise the limit so you won’t have to pay the fine. It sounds absurd, but it’s just what is happening to the 28-year-old law that prohibits the president from spying on Americans without getting a warrant from a judge. It’s a familiar pattern. President Bush ignores the Constitution and the laws of the land, and the cowardly, rigidly partisan majority in Congress helps him out by rewriting the law he’s broken.”64 A Congress that is indifferent to the separation of powers has given up its raison d’être as surely as the Roman Senate became a mere social club for old aristocrats paying obeisance to Augustus Caesar.
Similarly, even before President Bush undercut the McCain amendment to the Defense Appropriations Bill with his signing statement, Republican senator Lindsey Graham contributed another amendment that removed the federal courts’ jurisdiction over Guantanamo prisoners who were hoping to challenge the legality of their detention. It states explicitly that “no court, justice, or judge shall have jurisdiction to hear or consider habeas corpus applications on behalf of those incarcerated by the Department of Defense in prisons at Guantanamo Bay, Cuba.” The Senate passed this remarkably cruel piece of legislation by a vote of 49 to 42. It effectively repudiated the Supreme Court’s 2004 decision in Rasul v. Bush, which gave non-U.S. citizens at Guantanamo the right to file claims based on habeas corpus in the federal courts. The legal scholar Brian Foley explains that habeas corpus forces the executive branch “to justify its detention of any person. It is a check for preventing the Executive from becoming too powerful. After all, an Executive that can jail anyone it dislikes, for as long as it likes, is a formidable power indeed.”65
An authority on American use of torture over the years, Alfred W. McCoy, adds, “Senator McCain’s now-compromised ban on cruel treatment of detainees was effectively eviscerated by Graham’s denial of legal redress. To nullify the landmark Supreme Court ruling that Guantanamo is, in fact, American territory and so falls under the purview of U.S. courts, Graham also stipulated in the final legislation that ‘the term “United States,” when used in a geographic sense, does not include the United States Naval Station, Guantanamo Bay.’ In this way, he tried once again to deny detainees any legal basis for access to the courts. In effect, McCain’s motion more or less bans torture, but Graham’s removes any real mechanism for enforcing such a ban.”66
Senator Graham claimed that he was merely trying to increase the efficiency of the federal courts, that his amendment was necessary “to eliminate a blizzard of legal claims from prisoners that was tying up Department of Justice resources.”67 It is hard to imagine a lamer excuse for dishonoring such a well-established norm of American civil liberties. The Graham provision also explicitly gives the military officers who sit in judgment over Guantanamo prisoners in what the government calls “Combatant Status Review Tribunals” the right to use evidence obtained by torture. The effect of the law is, as Brian Foley argues, “to give the Executive unreviewable power.... A person can be captured, shackled, and sent to Guantanamo and never given a hearing.... He has no right to a hearing because he cannot enforce that right in a court. He can be tortured, because he cannot go to court to enforce a right not to be tortured.”68
On June 29, 2006, the Supreme Court complicated these matters by declaring that the Guantanamo military commissions that Bush had created without congressional authorization were, in fact, unconstitutional. The case, Hamdan v. Rumsfeld, also repudiated the main provisions of Senator Graham’s law. However, the justices voted 5-3, with Bush’s new chief justice, John Roberts, not participating because he had ruled on the case as an appeals court justice, and the Republican Congress pledged to enact legislation that would allow Bush to proceed anyway with his drumhead courts.
The separation of powers that the Founders wrote into our Constitution as the main bulwark against dictatorship increasingly appears to be a dead letter, with the Congress no longer capable of asserting itself against presidential attempts to monopolize power. Corrupt and indifferent, the Congress, which the Founders believed would be the leading branch of government, is simply not up to the task of confronting a modern Julius Caesar. As former representative Bob Barr, a conservative from Georgia, concludes, “The American people are going to have to say, ‘Enough of this business of justifying everything as necessary for the war on terror.’ Either the Constitution and the laws of this country mean something or they don’t. It is truly frightening what is going on in this country.”69
If the legislative branch of our government is broken—and it is hard to imagine how it could repair itself, given the massive interests that feed off it—the judicial branch is hardly less limited today in terms of its ability to maintain the balance. Even the Supreme Court’s most extraordinary power, its ability to nullify a law as unconstitutional, rests on precedent rather than constitutional stipulation, and lower courts, increasingly packed with right-wing judges, have little taste for going against the prevailing political winds. For example, on February 16, 2006, U.S. District Court judge David Trager dismissed a suit for damages by a thirty-five-year-old Canadian citizen, Maher Arar, who in 2002 was seized by U.S. government agents at Kennedy Airport, New York, en route to Ottawa. Arar was shackled, hustled aboard a CIA airplane, and delivered to Syria, where he was tortured for ten months before being released. No charges were ever filed against him, and even his torturers declared that they had been unable to discover any evidence that might link him to a terror network. The case for compensation, not to mention an apology, seemed open and shut.
In dismissing Arar’s suit, Judge Trager wrote that foreign policy and national security issues raised by the U.S. government were “compelling” and that such matters were the purview of the executive branch and Congress, not the courts. He acknowledged that in sending Arar to Syria, the U.S. government knew he would be tortured—the State Department had already publicly detailed the Syrians’ capabilities and record as torturers. New York Times columnist Bob Herbert asked, “If kidnapping and torturing an innocent man is O.K., what’s not O.K.?”70
The evidence strongly suggests that the legislative and judicial branches, having become so servile in the presence of the imperial presidency, have largely lost the ability to respond in a principled and independent manner. Could the people themselves restore constitutional government? A grassroots movement to abolish the CIA, break the hold of the military-industrial complex, and establish public financing of elections may be theoretically conceivable but is unlikely given the conglomerate control of the mass media and the difficulties of mobilizing our large and diffuse population.
It is also possible that, at some future moment, the U.S. military could actually take over the government and declare a dictatorship (though they undoubtedly would find a gentler, more user-friendly name for it). That is how the Roman Republic ended. But I think it unlikely that the American military will go that route. In recent years, the officer corps has become more “professional,” as well as more political and more Republican in its sympathies, while the all-volunteer army has become an ever more separate institution in our society, its profile less and less like that of the general populace. Nonetheless, for the military voluntarily to move toward direct rule, its leaders would have to ignore their ties to civilian society, where the symbolic importance of constitutional legitimacy remains potent.
Rebellious officers might well worry about how the American people would react to such a move. Moreover, prosecutions of low-level military torturers from Abu Ghraib prison have demonstrated to enlisted ranks that obedience to illegal orders can result in their being punished, whereas officers go free. No one knows whether ordinary soldiers would obey clearly illegal orders to oust the elected government or whether the officer corps has sufficient confidence to issue such orders. For the time being at least, the highest medal for bravery and sacrifice in the American military is still the Congressional Medal of Honor, not the Victoria Cross, the Iron Cross, or the Order of Lenin. In addition, the present system already offers the military high command so much—in funds, prestige, and future employment via the military-industrial revolving door—that a perilous transition to anything like direct military rule would make little sense under reasonably normal conditions.
The likelihood is that the United States will maintain a facade of constitutional government and drift along until financial bankruptcy overtakes it. Of course, bankruptcy will not mean the literal end of the United States any more than it did for Germany in 1923, China in 1948, or Argentina in 2001-2. It might, in fact, open the way for an unexpected restoration of the American system, or for military rule, or simply for some new development we cannot yet imagine. Certainly, such a bankruptcy would mean a drastic lowering of our standard of living, a loss of control over international affairs, a process of adjusting to the rise of other powers, including China and India, and a further discrediting of the notion that the United States is somehow exceptional compared to other nations. We will have to learn what it means to be a far poorer nation and the attitudes and manners that go with it. As Anatol Lieven, author of America Right or Wrong: An Anatomy of American Nationalism, concludes, “U.S. global power, as presently conceived by the overwhelming majority of the U.S. establishment, is unsustainable.... The empire can no longer raise enough taxes or soldiers, it is increasingly indebted, and key vassal states are no longer reliable.... The result is that the empire can no longer pay for enough of the professional troops it needs to fulfill its self-assumed imperial tasks.”71
On February 6, 2006, the Bush administration submitted to Congress a $439 billion defense appropriation budget for fiscal 2007. At the same time, the deficit in the United States’ current account—the imbalance in the trading of goods and services as well as the shortfall in all other cross-border payments from interest income and rents to dividends and profits on direct investments—underwent its fastest-ever quarterly deterioration.72 In the fourth quarter of 2005, the deficit hit a staggering $225 billion, up from $185.4 billion in the previous quarter. For all of 2005, the current account deficit was $805 billion, 6.4 percent of national income. In 2005, the U.S. trade deficit, the largest component of the current account deficit, soared to an all-time high of $725.8 billion, the fourth consecutive year that America’s trade debts set records. The trade deficit with China alone rose to $201.6 billion, the highest imbalance ever recorded with any country. Meanwhile, since mid-2000, the country has lost nearly three million manufacturing jobs.73
To try to cope with these imbalances, on March 16, 2006, Congress raised the national debt limit from $8.2 trillion to $8.96 trillion. This was the fourth time since George W. Bush took office that it had to be raised. The national debt is the total amount owed by the government and should not be confused with the federal budget deficit, the annual amount by which federal spending exceeds revenue. Had Congress not raised the debt limit, the U.S. government would not have been able to borrow more money and would have had to default on its massive debts.
Among the creditors that finance this unprecedented sum, two of the largest are the central banks of China ($853.7 billion in reserves of dollars and other foreign currencies) and Japan ($831.58 billion), both of which are the managers of the huge trade surpluses these countries enjoy with the United States.74 This helps explain why our debt burden has not yet triggered what standard economic theory would dictate: a steep decline in the value of the U.S. dollar followed by a severe contraction of the American economy because we could no longer afford the foreign goods we like so much. However, both the Chinese and Japanese governments continue to be willing to be paid in dollars in order to sustain American demand for their exports. For the sake of domestic employment, both countries lend huge amounts to the American Treasury, but there is no guarantee how long they will want or be able to do so.
According to Marshall Auerback, an international financial strategist, “Today, the U.S. economy is being kept afloat by enormous levels of foreign lending, which allow American consumers to continue to buy more imports, which only increases the bloated trade deficits.”75 We have become, in Auerback’s terms, a “Blanche Dubois economy” (named after the leading character in Tennessee Williams’s play A Streetcar Named Desire), heavily dependent on “the kindness of strangers.” Unfortunately, in our case, as in Blanche’s, there are not many strangers left willing to support our illusions.
Even a severe reduction in our numerous deficits (trade, governmental, current account, household, and savings) would still not be enough to save the republic, because of the unacknowledged nature of our economy—specifically our dependence on military spending and war for our wealth and well-being. Ever since we recovered from the Great Depression of the 1930s via massive governmental spending on armaments during World War II, we have become dependent on “military Keynesianism,” artificially boosting the growth rate of the economy via government spending on armies and weapons.
“Keynesianism” is named for the English economist John Maynard Keynes, author of The General Theory of Employment, Interest, and Money, published in 1936, and other influential books. In his writings and his public career, Keynes developed a scheme to save capitalist economies from cycles of boom and bust as well as the severe decline of consumer spending that occurs in periods of depression. He was less interested in what causes these cycles, or in whether capitalism itself promotes underemployment and unemployment, than in what to do when an inequitable distribution of income causes people to be unable to buy what their economy produces. To prevent the economy from contracting, a development likely to be followed by social unrest, Keynes thought that the government should step in and, through deficit spending, put people back to work, even if this meant creating jobs artificially. Some of these jobs might be socially useful, but Keynes also favored make-work tasks if that proved necessary, simply to put money in the pockets of potential consumers. Conversely, during periods of prosperity, he thought government should cut spending and rebuild the treasury. He called his plan countercyclical “pump-priming.”
During the New Deal in the 1930s, the United States tried to put Keynesianism into practice. Through various schemes the government attempted to restore morale—if not full employment.76 These included “social security” to provide incomes for retired people; giving unions the right to strike (the Wagner Act); setting minimum wages and hours and prohibiting child labor; creating jobs for writers, artists, and creative people generally (the Works Progress Administration); financing the building of dams, roads, schools, and hospitals across the country, including the Triborough Bridge and Lincoln Tunnel in New York City, the Grand Coulee Dam in Washington, and the Key West Highway in Florida (the Public Works Administration); organizing projects for young people in agriculture and forestry (the Civilian Conservation Corps); and setting up the Tennessee Valley Authority to provide flood control and electric power generation in a seven-state area.
The New Deal also saw the rudimentary beginnings of a backlash against Keynesianism. Conservative capitalists feared, as the German political scientist and sociologist Jürgen Habermas has noted, that too much government intervention would delegitimate and demystify capitalism as an economic system that works by allegedly quasi-natural laws. More seriously, too much spending on social welfare might, they feared, shift the balance of power in society from the capitalist class to the working class and its unions.77 For these reasons, establishment figures tried to hold back countercyclical spending until World War II unleashed a torrent of public funds for weapons.
In 1943, the Polish economist in exile Michał Kalecki coined the term “military Keynesianism” to explain Nazi Germany’s success in overcoming the Great Depression and achieving full employment. Adolf Hitler did not undertake German rearmament for purely economic reasons; he wanted to build a powerful German military. The fact that he advocated governmental support for arms production made him acceptable to many German industrialists, who increasingly supported his regime.78 For several years before Hitler’s aggressive intentions became clear, he was celebrated around the world for having achieved a “German economic miracle.”
Speaking theoretically, Kalecki understood that government spending on arms increases manufacturing and also has a multiplier effect on general consumer spending by raising workers’ incomes. Both of these points are in accordance with general Keynesian doctrine. In addition, the enlargement of standing armies absorbs many workers, often young males with few skills and less education. The military thus becomes an employer of last resort, like the old Civilian Conservation Corps, but on a much larger scale. Increased spending on military research and the development of weapons systems also generates new infrastructure and advanced technologies. Well-known examples include the jet engine, radar, nuclear power, semiconductors, and the Internet, each of which began as a military project that later formed the basis for major civilian industries.79 By 1962-63, military outlays accounted for some 52 percent of all expenditures on research and development in the United States. As the international relations theorist Ronald Steel puts it, “Despite whatever theories strategists may spin, the defense budget is now, to a large degree, a jobs program. It is also a cash cow that provides billions of dollars for corporations, lobbyists, and special interest groups.”80
The negative aspects of military Keynesianism include its encouragement of militarism and the potential to create a military-industrial complex. Because such a complex becomes both directly and indirectly an employer and generator of employment, it comes to constitute a growing proportion of aggregate demand. Sooner or later, it short-circuits Keynes’s insistence that government spending be cut back in times of nearly full employment. In other words, it becomes a permanent institution whose “pump” must always be primed. Governments invariably find it politically hard to reduce military spending once committed to it, particularly when munitions makers distribute their benefits as widely as possible and enlist the support of as many politicians as possible, as they have in the United States. In short, military Keynesianism leads to constant wars, or a huge waste of resources on socially worthless products, or both.
By the mid-1940s, everyone in the United States appreciated that the war boom had finally brought the Great Depression to an end, but it was never understood in Keynesian terms. It was a war economy. State expenditures on arms in 1944 reached 38 percent of gross domestic product, or GDP (the sum total of all goods and services produced in an economy), which seemed only appropriate given the nation’s commitment to a two-front war. There was, however, a profound fear among political and economic elites as well as the American public that the end of the war—despite all the promises of future peacetime wonders like TVs, cars, and washing machines—would mean a return to economic hard times. Such reasoning lay, in part, behind the extraordinary expansion of arms manufacturing that began in 1947. The United States decided to “contain” the USSR and, in the early 1950s, to move from the production and use of atomic bombs to the building and stockpiling of the much larger and more destructive hydrogen bombs.
Between the 1940s and 1996, the United States spent at least $5.8 trillion on the development, testing, and construction of nuclear weapons alone. By 1967, the peak year of its nuclear stockpile, the United States possessed some 32,500 deliverable bombs, none of which, thankfully, was ever used. But they perfectly illustrate Keynes’s proposal that, in order to create jobs, the government might as well decide to bury money in old mines and then pay unemployed workers to dig it up. Nuclear bombs were not just America’s secret weapon but also a secret economic weapon. As of 2006, we still have 9,960 of them.
The Cold War contributed greatly to the country’s sustained economic growth that began in 1947 and lasted until the 1973 oil crisis. Military spending was around 16 percent of GDP in the United States during the 1950s. In the 1960s, the Vietnam War sustained it at around 9 percent, but in the 1970s, strong economic competition from the free riders, Japan and Germany, forced a significant decline in military spending with a consequent U.S. decline into “stagflation” (a combination of stagnation and inflation).
The American response was a classic example of military Keynesianism—namely, Reaganomics. In the 1980s, President Reagan carried out a policy of large tax cuts combined with massive increases in defense spending allegedly to combat a new threat from communism. It turned out that there was no threat, only a campaign of fear-mongering from the White House bolstered by the CIA, which consistently overstated the size and growth of the Soviet armed forces during this period. The USSR was in fact starting to come apart internally because of serious economic imbalances and the deep contradictions of Stalinism. Reagan’s policies drove American military expenditures to 6.2 percent of GDP, which in 1984 produced a growth rate for the economy as a whole of 7 percent and helped re-elect Reagan by a landslide.81 During the Clinton years, military spending fell to about 2 percent of GDP, but the economy rallied strongly in Clinton’s second term due to the boom in information technologies, weakness in the previously competitive Japanese economy, the government’s more nationalistic support of the economy internationally, and serious efforts to reduce the national debt.
With the coming to power of George W. Bush and the launching of his Global War on Terror, military Keynesianism returned with a vengeance. According to Andrew Gumbel, a regular contributor to the Independent newspaper of London, during the second quarter of 2003, when the Iraq war was in full swing, some 60 percent of the 3.3 percent GDP growth rate was attributable to military spending.82 In the U.S. budgets for the years between 2003 and 2007, defense occupied just over 50 percent of all discretionary spending by the government. This is money the president and Congress can actually appropriate, as distinct from mandatory spending in compliance with existing laws (for social security payments, medicare, interest on the national debt, and so on).
The official 2007 Pentagon budget is $439.3 billion—not including the costs of America’s current wars. It essentially covers salaries and weapons—the funds for missile defense and other operations in outer space (between $7.4 billion and $9 billion a year since fiscal year 2002), new ships and submarines for the navy, and aircraft that were designed to fight the former Soviet Union’s air force but that have been kept as active projects because of industry and air force lobbying. As Jonathan Karp of the Wall Street Journal observes, “Weapons spending has swelled faster than the overall Pentagon budget, soaring 43 percent in the past five years to $147 billion, with the majority of the funding going to programs conceived before 9/11. The estimated lifetime cost of the Pentagon’s five biggest weapons systems is $550 billion, 89 percent more than the top-five programs were projected to cost in 2001.”83
One of the absurdities of the Bush administration’s defense appropriations is that the official defense budget has nothing to do with actual combat in Afghanistan and Iraq. We have built a fantastically high-tech military, but in order to use it, Congress has to appropriate separate annual “supplements” of around $120 billion a year. In the fiscal 2007 budget, the Congressional Research Service estimates that Pentagon spending will be about $9.8 billion per month for Operation Enduring Freedom and Operation Iraqi Freedom, or an extra $117.6 billion for the year.84 As of 2006, the overall cost of the wars in Iraq and Afghanistan since their inception stood at about $450 billion.
To understand the real weight of military Keynesianism in the American economy, one must approach official defense statistics with great care. They are compiled and published in such a way as to minimize the actual size of the official “defense budget.” The Pentagon does this to try to conceal from the public the real costs of the military establishment and its overall weight within the economy. There are numerous military activities that are not carried out by the Department of Defense and are therefore not part of the Pentagon’s annual budgets. These include the Department of Energy’s spending on nuclear weapons ($16.4 billion in fiscal 2005), the Department of Homeland Security’s outlays for the actual “defense” of the United States against terrorism ($41 billion), the Department of Veterans Affairs’ responsibilities for the lifetime care of the seriously wounded ($68 billion), the Treasury Department’s payments of pensions to military retirees and widows and their families (an amount not fully disclosed by official statistics), and the Department of State’s financing of foreign arms sales and militarily related developmental assistance ($23 billion).
In addition to these amounts, there is something called the “Military Construction Appropriations Bill,” which is tiny compared to the other expenditures—$12.2 billion for fiscal 2005—but which covers all the military bases around the world. Adding these non-Department of Defense expenditures, the supplemental appropriations for the wars in Iraq and Afghanistan, and the military construction budget to the Defense Appropriations Bill actually doubles what the administration calls the annual defense budget. It is an amount larger than all other defense budgets on Earth combined.85 Still to be added to this are interest payments by the Treasury to cover past debt-financed defense outlays going back to 1916. Robert Higgs, author of Crisis and Leviathan and many other books on American militarism, estimates that in 2002 such interest payments amounted to $138.7 billion.86
Even when all these things are included, Enron-style accounting makes it hard to obtain an accurate understanding of our reliance on a permanent arms economy. In 2005, the Government Accountability Office reported to Congress that “the Pentagon has no accurate knowledge of the cost of military operations in Iraq, Afghanistan, or the fight against terrorism.”87 It said that, lacking a reliable method for tracking military costs, the army merely inserts into its accounts figures that match the available budget. “Effectively, the Army [is] reporting back to Congress what it had appropriated.”
Joseph Stiglitz, the Nobel Prize-winning economist, and his Harvard colleague Linda Bilmes have tried to put together an estimate of the real costs of the Iraq war. They calculate that it will cost about $2 trillion.88 This figure is vastly larger than what the Bush administration publicly acknowledges. Above all, Stiglitz and Bilmes have tried to compile honest figures for veterans’ benefits. For 2006, the officially budgeted amount is $68 billion, which is absurdly low given the large number of our soldiers who have been severely wounded. We celebrate the medical miracles that allow some of our troops to survive the detonation of an “improvised explosive device” hidden in the earth under a Humvee, but when larger numbers of soldiers who once might have died in such situations are saved, the resulting wounds, often including brain damage, require that they receive round-the-clock care for the rest of their lives.
We almost surely will end up repudiating some of the promises we have made to the men and women who have volunteered to serve in our armed forces. For instance, the government’s medical insurance scheme for veterans and their families, called Tricare, is budgeted for 2007 at a mere $39 billion. But the future demands on Tricare are going to go off the chart. And we cannot afford them unless we radically reorient our economy. The American commitment to military Keynesianism and the nontransparent manner in which it is implemented have combined into a set of fatal contradictions for our country.
In Blowback, I set out to explain why we are hated around the world. The concept “blowback” does not just mean retaliation for things our government has done to and in foreign countries. It refers to retaliation for the numerous illegal operations we have carried out abroad that were kept totally secret from the American public. This means that when the retaliation comes—as it did so spectacularly on September 11, 2001—the American public is unable to put the events in context. So they tend to support acts intended to lash out against the perpetrators, thereby most commonly preparing the ground for yet another cycle of blowback. In the first book in this trilogy, I tried to provide some of the historical background for understanding the dilemmas we as a nation confront today, although I focused more on Asia—the area of my academic training—than on the Middle East.
The Sorrows of Empire was written during the American preparations for and launching of the invasions and occupations of Afghanistan and Iraq. I began to study our continuous military buildup since World War II and the 737 military bases we currently maintain in other people’s countries. This empire of bases is the concrete manifestation of our global hegemony, and many of the blowback-inducing wars we have conducted had as their true purpose the sustaining and expanding of this network. We do not think of these overseas deployments as a form of empire; in fact, most Americans do not give them any thought at all until something truly shocking, such as the treatment of prisoners at Guantanamo Bay, brings them to our attention. But the people living next door to these bases and dealing with the swaggering soldiers who brawl and sometimes rape their women certainly think of them as imperial enclaves, just as the peoples of ancient Iberia or nineteenth-century India knew that they were victims of foreign colonization.
In Nemesis, I have tried to present historical, political, economic, and philosophical evidence of where our current behavior is likely to lead. Specifically, I believe that to maintain our empire abroad requires resources and commitments that will inevitably undercut our domestic democracy and in the end produce a military dictatorship or its civilian equivalent. The founders of our nation understood this well and tried to create a form of government—a republic—that would prevent this from occurring. But the combination of huge standing armies, almost continuous wars, military Keynesianism, and ruinous military expenses have destroyed our republican structure in favor of an imperial presidency. We are on the cusp of losing our democracy for the sake of keeping our empire. Once a nation is started down that path, the dynamics that apply to all empires come into play—isolation, overstretch, the uniting of forces opposed to imperialism, and bankruptcy. Nemesis stalks our life as a free nation.
History is instructive on this dilemma. If we choose to keep our empire, as the Roman Republic did, we will certainly lose our democracy and grimly await the eventual blowback that imperialism generates. There is an alternative, however. We could, like the British Empire after World War II, keep our democracy by giving up our empire. Like the French and Dutch, the British did not do a particularly brilliant job of liquidating their empire, and there were several clear cases where British imperialists defied their nation’s commitment to democracy in order to keep their foreign privileges. Kenya in the 1950s is a particularly savage example. But the overall thrust of postwar British history is clear: the people of the British Isles chose democracy over imperialism. For this reason, I can only regard Britain’s willingness to join the United States in its invasion of Iraq as an atavistic response.
Britain’s closing down its empire is one of its more admirable legacies. I do not share the nostalgia of contemporary Anglo-American writers who urge the United States to take up the “white man’s burden” and follow in the footsteps of British imperialists. Instead, I have chosen as my role model a Japanese scholar and journalist, Hotsumi Ozaki, about whom I long ago wrote a biography. Ozaki was born in what was then the Japanese colony of Taiwan, and his early childhood was that of a little colonialist, being taken to school by rickshaw. As an adult, he was a prominent journalist and scholar in China, and he accurately foresaw that Japan’s occupation of China would fail disastrously and lead to the blowback of the Chinese Communist revolution.
Ozaki tried to warn his own government about its misguided ventures. For his troubles he was hanged as a traitor by the Japanese government in the waning days of World War II. I hope not to meet a similar fate, but I am as certain as Ozaki was that my country is launched on a dangerous path that it must abandon or else face the consequences.