Tuesday, November 17, 2020

Part 4: The Cult of the Presidency...Hero Takes a Fall

4. Hero Takes a Fall

Out of the gobbledygook, comes a very clear thing: . . . you can’t trust the government; you can’t believe what they say; and you can’t rely on their judgment; and the—the implicit infallibility of presidents, which has been an accepted thing in America, is badly hurt by this, because it shows that people do things the President wants to do even though it’s wrong, and the President can be wrong. 

—H. R. Haldeman to Richard Nixon (June 14, 1971) 

‘‘Our long national nightmare is over,’’ President Gerald Ford declared upon Nixon’s resignation. The metaphor didn’t quite fit. Nightmares can be disturbing, but they’re not real, and parents are right to tell their children to forget all about them. In a sense, the country was asleep during the era of the Heroic Presidency, but the abuses of that period actually happened, and, rather than forget them, most Americans wanted to prevent their repetition. 

During the 1970s, what Americans learned about the presidency would lead to a resurgence of checks and balances and a political culture that would no longer take claims of executive benevolence on faith. Resurgent distrust manifested itself in a newly adversarial press, and, perhaps most importantly, in a Congress and a judiciary now willing to challenge presidential power. 

The period of executive retrenchment was short lived, unfortunately. The Heroic Presidency had fallen. But it would in time be replaced by an office less grand but no less menacing. By the Clinton years, if not well before, the presidency was as imperial as ever, even if lacking entirely the glamour of Camelot. By century’s end, Americans had recovered much of their historic skepticism about power. And yet, even as faith in power has waned, power endures. 

Therapeutic Regicide 

Like his heroic-era predecessors, Richard Nixon had a view of executive power that was vast indeed. The president held total control of the power to make war; he could wiretap at will, without court approval; he could withhold from Congress and the public any information he chose; and he was virtually immune from judicial process aimed at correcting abuses.1 As Nixon saw it, the president also had a sweeping power to impound congressionally appropriated funds, zeroing out whole programs because he disagreed with them or found them wasteful. As one federal court noted, the president’s theory would allow him to ‘‘ignore any and all Congressional authorizations if he deemed them . . . contrary to the needs of the nation.’’2 

Even before Watergate unfolded, Nixon’s legal theories began to look uncomfortably like claims that the president was above the law. Yet, Americans had tolerated—even applauded—similar claims from presidents in the past. Nor was RMN the first president to wiretap his enemies and attempt to subvert the Federal Bureau of Investigation and Central Intelligence Agency for political purposes. How, then, could a ‘‘third-rate burglary’’—what John Wayne termed a ‘‘glorified panty raid’’—bring down a president?3

Context was everything. By the time Watergate happened, public trust in the presidency had already begun to erode, due to the widening ‘‘credibility gap’’ associated with the Vietnam War and serial revelations of past executive abuses, many of which had little to do with Nixon himself. What Americans learned about those abuses punctured the myth of presidential infallibility. 

An article released in the decade’s first month gave a hint of things to come. In the January 1970 issue of the Washington Monthly, former army intelligence officer Christopher Pyle exposed an ongoing program of military surveillance dating from President Johnson’s decision to use the army to quell the 1967 race riots in Detroit. As Pyle put it, ‘‘the Army had assembled the essential apparatus of a police state.’’4 Under pressure from the White House and the Justice Department, the U.S. military became deeply involved in monitoring the peaceful political activities of civilians. By the fall of 1968, more military intelligence officials were monitoring domestic protest groups than were assigned to any foreign theater, including Vietnam.5 In addition to infiltrating peaceful protest groups, the army  kept files on over 100,000 citizens, including such dangerous national security threats as child psychologist Dr. Benjamin Spock and folk singers Arlo Guthrie and Joan Baez.6 The Pyle article spurred Senate Judiciary Committee hearings chaired by Sen. Sam Ervin, in which the senators learned that ‘‘comments about the financial affairs, sex lives, and psychiatric histories of persons unaffiliated with the armed forces appear throughout the various records systems.’’7 

Then, on March 8, 1971, antiwar activists calling themselves ‘‘the Citizens’ Commission to Investigate the FBI’’ broke into an FBI field office in Pennsylvania and stole reams of files on agency ‘‘black ops’’ at home. What they found, they leaked to various media outlets, and soon Americans became familiar with the ungainly term ‘‘COINTELPRO.’’ COINTELPRO, for ‘‘Counterintelligence Program,’’ went far beyond intelligence gathering—embracing burglaries, wiretaps, attempts to provoke street violence between members of targeted groups, and covert actions designed to topple movement leaders by, among other things, tagging them with ‘‘snitch jackets’’—forged documents containing trumped-up evidence of cooperation with the FBI and police.

The program had begun in 1956 with a focus on the U.S. Communist Party, but soon broadened to include white and black nationalist groups, and eventually ‘‘New Left’’ organizations. The bureau had an expansive definition of ‘‘subversive.’’ Among its targets were liberal Antioch College and Martin Luther King’s Southern Christian Leadership Conference, which the FBI termed a Black Nationalist ‘‘hate group.’’8 

Some of the FBI’s actions during this period had the flavor of high-school pranks, albeit potentially murderous ones. In ‘‘Operation Hoodwink,’’ carried out between the fall of 1966 and the summer of 1968, agents purporting to be Communist Party members sent insulting letters to mob figures Carlo Gambino and Santo Trafficante, among others, hoping to ‘‘provoke a dispute between La Cosa Nostra and the Communist Party, USA.’’9 Other schemes were less amusing. On one occasion, FBI agents kidnapped an antiwar activist to intimidate him into silence.10 On another, agents bugged Martin Luther King’s hotel rooms and sent him a tape containing evidence of his extramarital affairs. With the tape was a letter saying ‘‘King, there is one thing left for you to do. You know what it is’’—that is, commit suicide.11 King was only the most famous of the FBI targets on whom this sort of gutter tactic was employed.  

A few months after COINTELPRO became a household word, former Defense Department analyst Daniel Ellsberg began leaking to the New York Times and the Washington Post portions of a classified DOD history of the Vietnam War. Prepared at the behest of then Defense Secretary Robert McNamara after he’d begun to lose faith in the war, the Pentagon Papers included details of sordid behavior on the part of the Kennedy and Johnson administrations. Among other revelations, the papers showed that JFK was complicit in the military coup that ended in South Vietnamese president Ngo Dinh Diem’s assassination, and that the Johnson administration had lied about the Gulf of Tonkin incident to get congressional authorization for the war. 

Nothing in the Pentagon Papers directly implicated Nixon. Yet, the Nixon team feared such leaks would undermine their efforts to secure ‘‘peace with honor’’ in Vietnam.12 And so the White House ‘‘Plumbers’’ were born. Ex-CIA operative E. Howard Hunt and former FBI agent G. Gordon Liddy warmed up by breaking into Ellsberg’s psychiatrist’s office, hoping to find dirt on the former defense analyst. Then on June 17, 1972, the Plumbers, led by Liddy, botched a burglary of the Democratic National Committee’s headquarters at the Watergate office complex. Over the next two years, the story behind that break-in gradually emerged from the courts, congressional hearings, and the press—leading to Nixon’s resignation. 

Along the way, the fight over Nixon’s presidency kicked up still more dirt, sullying Nixon’s image, and that of the presidency as a whole. When it came to light in 1973 that Nixon and White House Counsel John Dean contemplated ordering Internal Revenue Service audits of Democratic contributors (in Dean’s words, ‘‘the use of the available federal machinery to screw our political enemies’’13), Time magazine revealed that this had been common practice in the Kennedy and Johnson administrations. At Kennedy’s instigation in 1961, the IRS had set up a ‘‘strike force’’ aimed at groups opposing the administration.14 Perhaps it was not mere bad luck that led to Nixon himself getting audited three times during the Kennedy-Johnson years.15 

And as the House Judiciary Committee geared up for impeachment proceedings, it had yet another revelation to consider. In 1973, Air Force Major Hal Knight came forward with information that he had helped President Nixon conceal a 14-month bombing campaign against neutral Cambodia in 1969 and 1970. Without any authorization to expand the war, in March 1969, Nixon ordered U.S. planes to target North Vietnamese base camps in Cambodian territory along the border with Vietnam. The campaign, which included nearly 4,000 sorties dropping over 100,000 tons of bombs between March 1969 and May 1970, was code-named ‘‘Operation Menu,’’ with the various phases of the campaign going by the monikers ‘‘Breakfast,’’ ‘‘Lunch,’’ ‘‘Snack,’’ ‘‘Dinner,’’ and ‘‘Dessert.’’ The high-altitude, indiscriminate bombing runs caused massive civilian casualties among Cambodian farmers.16 The president kept the bombing secret not only from Congress and the public, but even from Secretary of State William Rogers, the secretary of the air force, and the air force chief of staff.17 Even the classified records of targets selected were falsified. Nixon repeatedly ordered Chairman of the Joint Chiefs of Staff Earle Wheeler not to reveal the campaign ‘‘to any member of Congress.’’ When the facts about the secret bombing became public, Nixon was unapologetic; there had been no secrecy with regard to anyone ‘‘who had any right to know or need to know.’’18

Rep. John Conyers drafted an article of impeachment based on concealment of the bombing ‘‘in derogation of the power of the Congress to declare war.’’ That article failed to make it into the final bill of particulars, which focused on obstruction of justice and attempting to misuse the CIA to interfere with the Watergate investigation. 

Ironically, Nixon had on the whole proved less successful than Kennedy and Johnson at bending federal intelligence agencies to his will. The CIA had provided false identification, disguises, and cameras for the burglary of Ellsberg’s psychiatrist’s office; but the agency balked at the administration’s request that it lean on the FBI to quash the Watergate investigation for ‘‘national security’’ reasons. The recording of Nixon and Haldeman plotting to involve the CIA became the ‘‘smoking gun’’ that led to the president’s resignation when the Supreme Court forced him to turn over the tape. 

Three Branches, After All 

That case, U.S. v. Nixon, was one of three key cases in which the Court stood up against unconstrained presidential power. A newly assertive Court would be joined by a newly assertive Congress, in a halting attempt to right the constitutional balance. 

Judicial Pushback 

First, in New York Times Co. v. United States, the Court rebuffed Nixon’s attempt to stop publication of the Pentagon Papers.19 Invoking ‘‘the constitutional power of the President over the conduct of foreign affairs and his authority as Commander-in-Chief,’’ the administration argued that the president had the power to suppress ‘‘publication of information whose disclosure would endanger the national security.’’20 On June 30, 1971, barely two weeks after the first Pentagon Papers excerpts were published, the Court held that the government had not met the heavy burden the First Amendment imposes on attempted prior restraints of political speech. 

The Nixon team feared that exposure of the papers would undermine the war effort and threaten the president’s ability to prevent damaging national security leaks. Worse, the papers’ release was a threat to the presidency itself. In an Oval Office meeting discussing what to do about the leak, White House Chief of Staff H. R. Haldeman warned Nixon, in the passage quoted at the beginning of this chapter, that the release of the papers would undermine the public’s perception of ‘‘the implicit infallibility of presidents.’’21 Haldeman was right: the Court’s decision helped clear the way for increased public scrutiny of the executive branch and provided a valuable lesson in the perils of trusting government too much. 

A year later, in United States v. United States District Court (the ‘‘Keith’’ case), the Court rejected another claim of limitless executive power.22 In Keith, three left-wing radicals charged with conspiracy to destroy government property sought disclosure of information on electronic surveillance that the attorney general had ordered without a warrant. The administration claimed the surveillance was lawful, asserting a presidential power to order warrantless wiretaps on anyone he suspected of threatening national security. In oral argument before the Court, Assistant Attorney General Robert C. Mardian declared, ‘‘Now, certainly neither this President nor any prior President has authorized electronic surveillance to monitor the activities of an opposite political group.’’ Similarly, in its brief, the administration suggested that any concerns about possible abuse of wiretap authority should be assuaged by the fact that the attorney general would personally approve each wiretap application.23 At the time, of course, the attorney general was John Mitchell, who was neck deep in political wiretapping and would, less than five months after the case was argued, approve G. Gordon Liddy’s plan to bug Democratic National Committee headquarters at the Watergate office complex.

In fact, the Watergate burglars were arrested the same week the Court handed down the Keith decision, which rejected the administration’s claim of unchecked surveillance powers. In Keith, the Court left open the question of warrantless wiretapping in cases involving a foreign power—that question would later be addressed by Congress with the Foreign Intelligence Surveillance Act. As for surveillance of homegrown security threats, the Court noted, with more wisdom than perhaps it recognized at the time, that ‘‘the Fourth Amendment does not contemplate the executive officers of government as neutral and disinterested magistrates.’’24 Preservation of Fourth Amendment freedoms demanded prior judicial approval. 

But it was the third case, U.S. v. Nixon, that would bring the age of the Heroic Presidency to a close.25 In the spring of 1974, Nixon had refused to release selected Oval Office audiotapes to special prosecutor Leon Jaworski, who sought them as evidence against the Watergate conspirators (Nixon himself being named as an ‘‘unindicted co-conspirator’’ in the case). Resisting discovery of the tapes, Nixon claimed ‘‘absolute privilege.’’ He could decide for himself what to disclose and what to withhold: ‘‘The president is answerable to the Nation, but not to the courts.’’26 In earlier testimony before a Senate committee, Attorney General Richard Kleindienst (Mitchell had by then resigned) asserted that the privilege attached to all 2.5 million employees of the executive branch and could be invoked even against an impeachment inquiry.27 

On July 24, 1974, a unanimous Court rejected Nixon’s claim. Though it allowed for the existence of a qualified executive privilege and recognized the need for deference in national security matters, it held that in this case, the need for evidence in a criminal trial outweighed the president’s interest in confidentiality. Two weeks later, Nixon resigned. 

But the release of the tapes had a broader effect still. In his memoirs, Nixon wrote that ‘‘the American myth that Presidents are always presidential, that they sit in the Oval Office talking in lofty and quotable phrases, will probably never die—and probably never should because it reflects an important aspect of the American character.’’28 Yet, the Nixon tapes did much to kill that myth, showing  that presidents can be foul-mouthed, petty, paranoid—and lawless. With that revelation, the heroic image of the presidency went down hissing like the Wicked Witch of the West. 

A Resurgent Congress

Congress joined the Court in its attempt to confine presidential power. The legislative reforms of the Watergate era and its aftermath fell into two broad categories: those that restricted the president’s unilateral powers and those that imposed on the executive branch obligations of openness and disclosure.

Even before Watergate, Congress had begun restoring important legal protections. In 1971, Congress passed, and Nixon signed, the Non-Detention Act, which repealed the emergency detention provisions of the McCarran Internal Security Act. That act, passed over Truman’s veto in 1950, gave the president authority, in an ‘‘internal security emergency,’’ to lock up subversives. After the McCarran act passed, FBI Director J. Edgar Hoover sent the White House a plan contemplating the ‘‘permanent detention’’ of 12,000 suspects, almost all of them American citizens.29

In Nixon’s first term, Japanese American groups, civil libertarians, and, surprisingly enough, Nixon’s own Justice Department, pushed for a repeal of the detention law. The new law provided that ‘‘no citizen shall be imprisoned or otherwise detained by the United States except pursuant to an Act of Congress.’’30 In that case, Congress acted with the support of the Nixon administration; not so with most of the other reforms of the period. 

In 1973, a Senate special committee identified 470 statutory provisions that delegated broad authority to the president in times of national emergency. With four open-ended presidential declarations of national emergency dating back to 1933 still in effect, most of those provisions remained at the president’s disposal.31 The National Emergencies Act of 1976 decreed that the existing statutory delegations of emergency power would expire two years after its enactment, and it provided a one-year limit on all future emergency powers—shorter, if Congress ended the state of emergency by joint resolution. With the Impoundment Control Act of 1974, Congress moved to reassert its power of the purse, putting curbs on the president’s ability to override congressional spending decisions. [Well, that is a lie about Congress having the ability to end national emergencies, because here we are in 2020 and still under FDR’s emergency of, I believe, March 6, 1933. DC]

Congress also took a number of measures to reassert control over the power to go to war, most importantly with the War Powers Resolution. Passed in 1973 over Nixon’s veto, the WPR attempted ‘‘to fulfill the intent of the framers of the Constitution of the United States and insure that the collective judgment of both the Congress and the President will apply to the introduction of United States Armed Forces into hostilities.’’32 In essence, the WPR provides that if the president introduces U.S. armed forces into hostilities or ‘‘situations where imminent involvement in hostilities is clearly indicated by the circumstances,’’ he must remove those forces within 60 days (90, if necessary to ensure safe withdrawal) absent a congressional declaration of war, specific statutory authorization for the action, or a situation in which Congress is physically unable to meet because of an armed attack on the United States.33 The Hughes-Ryan Act, passed the next year, sought to rein in the president’s ability to order covert actions unilaterally, requiring notification to select committees of Congress when such actions were undertaken.34 [Well, if the Constitution were actually the blueprint for what goes on in this country, we would not have a standing army. But we do, so clearly it is not. DC]

Congress had also moved to limit the president’s powers to classify information. In this, it faced strong opposition from two young Ford administration aides, Chief of Staff Donald Rumsfeld and his deputy Dick Cheney. In 1966, Congress had passed a Freedom of Information Act that was essentially toothless: through a variety of tactics, executive branch officials managed to withhold vast amounts of material that the act required to be provided at citizens’ request. In 1974, Congress passed amendments to the act designed to overcome executive intransigence, most importantly, strengthening judicial review of executive branch determinations that records are properly classified. At Rumsfeld and Cheney’s urging, President Ford vetoed the bill on October 17, 1974.35 A month later, Congress overrode Ford’s veto. 

Finally, with the Foreign Intelligence Surveillance Act of 1978, Congress took up the Supreme Court’s invitation, in the Keith case, to set up a framework for national security surveillance involving Americans at home.36 Under FISA, such surveillance would require the executive branch to secure a warrant from a special court. The standard for granting FISA warrants was a lenient one, but since it at least required the approval of an independent branch of government, FISA put an important check on the executive’s ability to conduct domestic spying under the rubric of national security. 

Like most periods of reformist fervor, the post-Watergate era generated its share of ill-considered schemes.37 In some cases, Congress and the courts went too far, in others not far enough, to right the constitutional balance. Yet, by the mid-1970s, for the first time in decades, the country had a Congress and a judiciary awake to the problem of unchecked executive power. 

A Culture of (Justified) Distrust 

Important though they were, the legislative and judicial reforms of the 1970s were only a reflection of broader changes in the American attitude toward executive power. In How We Got Here, his cultural history of the 1970s, David Frum describes the public mood after the last decade of the Heroic Presidency: 

Of the three presidents after 1960, the first stood exposed as a womanizing rogue who abused the FBI and IRS, who was implicated in assassinations and attempted assassinations, and who wiretapped Martin Luther King, Jr. The second owed his political career to stuffed ballot boxes, had corruptly enriched himself, had lied the country into Vietnam, and had also wiretapped King. The third had orchestrated a campaign of lies to cover up multiple crimes, had chiseled on his income tax, had chosen a corrupt governor as his vice president, and had bankrolled his campaigns with illegal corporate gifts. ‘‘I am not a crook’’? It was looking like a good working assumption that everybody was a crook.38 

New Revelations 

Even after Nixon’s departure, there was no respite from the horror show of continuing disclosures. In 1974, investigative reporter Seymour Hersh revealed in the New York Times that under pressure from presidents Johnson and Nixon, the CIA had been running something called Operation CHAOS, a domestic surveillance and espionage program aimed at antiwar groups.39 Despite the fact that the agency itself had concluded that the New Left and Black Power groups it targeted were not controlled or manipulated by foreign governments—and that the program violated the CIA’s charter— Operation CHAOS continued until it was publicly exposed.40 

The disclosure of the CHAOS program, coming as it did after the steady stream of early-1970s reports of federal abuses, prompted the formation of the Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities, chaired by Idaho senator Frank Church.41 In 1975 and 1976, the Church Committee published 14 reports on CIA and FBI abuses. The committee uncovered new details on everything from army and National Security Agency spying, to the wiretapping of Martin Luther King, to the Kennedy administration’s attempts to get the Mafia to assassinate Castro. In a report published in April of America’s bicentennial year, the committee concluded:

For decades Congress and the courts as well as the press and the public have accepted the notion that the control of intelligence activities was the exclusive prerogative of the Chief Executive and his surrogates. The exercise of this power was not questioned or even inquired into by outsiders. Indeed, at times the power was seen as flowing not from the law, but as inherent in the Presidency. Whatever the theory, the fact was that intelligence activities were essentially exempted from the normal system of checks and balances. 

Such Executive power, not founded in law or checked by Congress or the courts, contained the seeds of abuse and its growth was to be expected.42 

All told, this did not make for a political environment that encouraged confidence in government. By the mid-1970s, it had become clear that trust in government was a sucker’s game, and there were far fewer suckers around. 

Just Because You’re Paranoid . . . 

In 1964, 62 percent of respondents to the University of Michigan’s National Election Studies survey affirmed that they trusted the federal government to do what was right ‘‘most of the time.’’ That number dropped to 34 percent in the year of Nixon’s resignation, and bottomed out at 23 percent at the tail end of the 1970s.43 Asked to pick their poison as to who should take the lead on policy matters, in the 1970s Americans answered, Congress. Where 61 percent had agreed in 1959 that ‘‘the president is in the best position to see what the country needs,’’ by 1977 the numbers had nearly reversed: 58 percent of Americans agreed that ‘‘it is up to Congress to decide what is to be done’’ and only 26 percent stubbornly remained presidentialists. By 1975, even children had begun to display a grown-up attitude toward presidential power. Grade-schoolers of the 1970s no longer viewed the president as an unambiguously benevolent leader.44

In popular entertainment, distrust often manifested itself as ridicule. Previously, in the age of the Heroic Presidency, even comedians had felt obliged to portray the president positively. Comedian Eddie Cantor asked FDR’s approval for a woefully tame 1934 radio bit where ‘‘Dr. Roosevelt’’ heals ‘‘Mrs. America.’’45 Vietnam and Watergate put an end to that sort of deference. ‘‘Final Days,’’ a skit from Saturday Night Live’s first season, portrayed Richard Nixon as a raving loon. A drunk Nixon, played by Dan Aykroyd, wanders the White House, calling Kissinger a ‘‘Christ-killer,’’ praying for a heart attack, and shouting to JFK’s portrait: ‘‘They’re gonna find out about you, too. The president! Having sex with women within these very walls. That never happened when Dick Nixon was in the White House! Never! Never! Never!’’ 

Now presidents would seek entertainers’ favor, instead of the other way around. In April 1976, Gerald Ford’s press secretary, Ron Nessen, hosted SNL, with the president himself contributing the opening line: ‘‘Live from New York, it’s Saturday Night!’’ To embarrass the administration, SNL’s writers kicked the vulgarity up a notch, including parody commercials featuring a douche called ‘‘Autumn Fizz’’ and a jam called ‘‘Painful Rectal Itch.’’46 

Pop culture increasingly reflected an anti-government sensibility that at times verged on the paranoid. Thrillers like The Parallax View (1974), Three Days of the Condor (1975), and Capricorn One (1978)—in which federal officials fake a Mars landing and then attempt to cover it up by killing the astronauts—all portrayed a common enemy: the U.S. government. In a 2000 study called ‘‘Government Goes Down the Tube,’’ researchers at the Center for Media and Public Affairs looked at portrayals of public officials over four decades of American television. ‘‘Television increasingly focused on the dark side of political life after the mid-1970s,’’ they wrote; in the years after 1975, ‘‘not a single show presented the political system as functioning to uphold the public good rather than private interests.’’47

A Newly Empowered Press 

Vietnam and Watergate wrought equally significant changes in American journalism. Bob Woodward and Carl Bernstein, the reporters who helped break the Watergate story, showed a rising generation of journalists that exposing abuses of power could turn reporters into movie stars (if you were lucky, Robert Redford; if not so lucky, Dustin Hoffman). The inflated self-regard some journalists displayed was irritating, to be sure—as when Washington Post Executive Editor Ben Bradlee boasted that ‘‘the press won on Watergate.’’48 Yet, it was, in a way, Madison’s theory of ambition counteracting ambition applied to the so-called fourth estate. By serving their own interests, fame-hungry reporters would serve the public’s as well.

Recall Bradlee’s behavior some 10 years earlier. Chummy with the Kennedys, intoxicated by Camelot, Bradlee, then a Newsweek correspondent, refrained from writing up the information he had on illegal wiretaps of steel executives. The brothers viewed ‘‘Benjy’’ as reliable enough to feel safe joking about it in front of him. Reporters in the White House press corps were also willing to hush up Kennedy’s womanizing, which reflected a sexual appetite that rivaled Mötley Crüe’s on a world tour. One could argue that Kennedy’s affairs were his own affair, but given that the president shared a mistress with Chicago mob boss Sam Giancana, it’s hard to maintain that no issues of public concern were involved.49

After Vietnam and Watergate, few reporters would follow Kennedy pal Bradlee and sit on a scoop, regardless of which party it would hurt. Aided by the FOIA, post-Watergate investigative reporters would make it harder for presidents to hide corruption, incompetence, and abuses of power. The press’s changed attitude could be seen in the sorts of questions that the White House press corps put to the president. A 2006 study sampling presidential press conferences from Eisenhower through Clinton finds that ‘‘the Nixon era marks the beginning of an extended period of increasingly vigorous questioning,’’ with deference declining and reporters growing more assertive and adversarial.50 

Like the declining trust numbers, the newly adversarial journalism gave rise to much hand-wringing on the part of those earnest souls who saw muckraking as an impediment to government doing great works. In books with titles like Feeding Frenzy: How Attack Journalism Has Transformed American Politics and Spiral of Cynicism: The Press and the Public Good, we continue to hear complaints that the cynicism stoked by scandal-driven journalism has made it ‘‘impossible to govern.’’

The governing class tends to agree. When Bob Woodward requested an interview with George H. W. Bush in 1998, Bush declined, writing, ‘‘I think Watergate and the Vietnam War are the two things that moved beltway journalism into this aggressive, intrusive, ‘take no prisoners’ kind of reporting that I can now say I find offensive.’’51 No doubt that kind of reporting was offensive to people in power; but it helped expose and deter presidential abuses. 

How Conservatives Learned to Stop Worrying and Love the Imperial Presidency 

In a 1984 speech, looking back on his experience as a top Ford administration official, Dick Cheney complained that during the 1970s legislators no longer wanted ‘‘to help presidents accrue power in the White House—so that they could achieve good works in the society.’’ Instead, Congress sought ‘‘to limit future presidents so that they would not abuse power the way it was alleged some had abused power in the past.’’52 ‘‘Alleged’’ was a nice touch. 

Cheney’s remarks reflected the enormous ideological shift that had occurred in the Nixon years. In the 1970s, while liberals were having second thoughts about the need for a powerful, activist presidency, conservatives were warming up to the idea. Nixon had hardly governed as a conservative, but in some ways—serving as ‘‘tribune’’ of the ‘‘silent majority,’’ aggressively impounding funds and asserting control over administrative agencies—he showed conservatives how the office could be used to serve their political ends. 

Still, the Right’s growing affinity for presidential power was at odds with the movement’s political heritage. It was conservatives, after all, who, troubled by the growth of presidential power during FDR’s 12-year reign, had led the fight for the Twenty-Second Amendment, limiting presidential terms.53 And it was conservatives who had the best claim to be heirs to the Founders’ views on human nature and concentrated power. Russell Kirk, whose 1953 book The Conservative Mind helped galvanize the postwar Right, insisted that ‘‘the need for prudent restraints upon power and upon human passions’’ was a core conservative principle: 

The conservative endeavors to so limit and balance political power that anarchy or tyranny may not arise. In every age, nevertheless, men and women are tempted to overthrow the limitations upon power, for the sake of some fancied temporary advantage. It is characteristic of the radical that he thinks of power as a force for good—so long as the power falls into his hands. . . .

Knowing human nature for a mixture of good and evil, the conservative does not put his trust in mere benevolence. Constitutional restrictions, political checks and balances, adequate enforcement of the laws, the old intricate web of restraints upon will and appetite—these the conservative approves as instruments of freedom and order.54 

Almost to a man, the intellectuals who had coalesced around William F. Buckley’s National Review associated presidential power with liberal activism and saw Congress as the ‘‘conservative’’ branch. In 1960 NR senior editor Willmoore Kendall, who had been one of Buckley’s professors at Yale, made the case for Congress in an article titled ‘‘The Two Majorities.’’ Kendall viewed Congress’s deliberative and incrementalist character as ‘‘a highly necessary corrective against the bias toward quixotism inherent in our presidential elections.’’55 In 1967, Russell Kirk and coauthor James McClellan praised the late Robert A. Taft, ‘‘Mr. Conservative,’’ for insisting that war had to be a last resort, threatening as it did to ‘‘make the American President a virtual dictator, diminish the constitutional powers of Congress, contract civil liberties, injure the habitual self-reliance and self-government of the American people, distort the economy, sink the federal government in debt, [and] break in upon private and public morality.’’56 Even so ardent a cold warrior as NR’s James Burnham wrote a book, Congress and the American Tradition, warning that the erosion of congressional power risked bringing about ‘‘plebiscitary despotism for the United States in place of constitutional government, and thus the end of political liberty.’’57 

Sen. Barry Goldwater, who represented postwar conservatives’ highest hopes for political success, could sound as extremist in opposition to presidential power as he did on other matters involving the defense of liberty. In his 1964 campaign manifesto ‘‘My Case for the Republican Party,’’ Goldwater wrote: 

We hear praise of a power-wielding, arm-twisting President who ‘‘gets his program through Congress’’ by knowing the use of power. Throughout the course of history, there have been many other such wielders of power. There have even been dictators who regularly held plebiscites, in which their dictatorships were approved by an Ivory-soap-like percentage of the electorate. But their countries were not free, nor can any country remain free under such despotic power. Some of the current worship of powerful executives may come from those who admire strength and accomplishment of any sort. Others hail the display of Presidential strength . . . simply because they approve of the result reached by the use of power. This is nothing less than the totalitarian philosophy that the end justifies the means. . . . If ever there was a philosophy of government totally at war with that of the Founding Fathers, it is this one.58 

Goldwater’s 1964 bid for the presidency failed disastrously, but out of the wreckage emerged a new conservative hero. In Ronald Reagan’s famous televised speech supporting Goldwater, Reagan identified a number of political figures who would ‘‘trade freedom for security’’ and whose philosophy threatened to take America ‘‘down to the antheap of totalitarianism.’’ Among them was ‘‘Senator Fulbright [who] has said at Stanford University that the Constitution is outmoded. He referred to the president as our moral teacher and our leader, and he said he is hobbled in his task by the restrictions in power imposed on him by this antiquated document. He must be freed so that he can do for us what he knows is best.’’59 

Of course, Reagan and Goldwater also advocated a hyperaggressive posture in the struggle against the Soviet Union, a position that sat uneasily with their distrust of presidential power.60 Rollback of communist gains demanded presidential activism abroad, and those demands began to weaken conservative opposition to powerful presidents. In an article examining congressional voting patterns on presidential power, political scientist J. Richard Piper found that ‘‘what erosion occurred in conservative support for a congressionally-centered federal system [from 1937 to 1968] occurred most frequently on foreign policy matters and among interventionist anti-Communists.’’61 Even so, Piper noted, congressional conservatives of the period ‘‘were more likely to favor curbing presidential powers than were moderates or liberals.’’62 

In 1966, conservative opposition to the activist presidency remained strong enough that Willmoore Kendall and George W. Carey could write that ‘‘the two camps [i.e., conservatives and liberals] appear to have made permanent and well-nigh irreversible commitments on the President-versus-Congress issue.’’ What would happen, Kendall and Carey wondered, if the future brought a changed political alignment: conservative presidents and liberal Congresses? ‘‘Would liberal and conservative spokesmen . . . be able to switch sides? That, we may content ourselves with saying, would now take some doing!’’63 

In fact, the two camps did switch sides not long after Kendall and Carey wrote those words. The 1970s brought increasing tension over foreign policy and, perhaps more importantly, the emergence of what political analyst Kevin Phillips called ‘‘the Emerging Republican Majority’’ in the Electoral College. Right-wing ressentiment over Nixon’s downfall helped drive the shift; as right-wing writer M. Stanton Evans later quipped, ‘‘I didn’t like Nixon until Watergate.’’64 By the 1970s, prominent conservatives had begun to see the executive as the conservative branch, and they set to work developing a case for the Imperial Presidency. 

Three months after Nixon resigned, National Review featured a cover story by Jeffrey Hart, ‘‘The Presidency: Shifting Conservative Perspectives?’’ Hart began by noting the ‘‘settled and received view’’ among American conservatives, who ‘‘have been all but unanimously opposed to a strong and activist presidency.’’ It was time, Hart argued, to rethink that view. Foreshadowing the conservative embrace of unitary executive theory in the 1980s, Hart suggested that the growth of the regulatory state demanded a powerful president who could hold the bureaucracy in check. Even more important, according to Hart, was the emergence of a ‘‘fourth branch of government’’ in the form of an activist, left-leaning press. Only a centrist or conservative president willing to use the bully pulpit could compete with the liberal media in the fight for American public opinion.65 

While right-wing intellectuals made the case for presidential dominance, conservatives in Congress worked to defend and enhance the president’s powers. As Piper noted, of ‘‘thirty-seven major roll call [votes] concerning presidential powers of greatest long-term significance [from 1968 to 1986] conservatives took the most pro presidential power position . . . often (as on the item veto, impoundment, and war powers) contradicting conservative positions of the past.’’66 

Another factor in the ideological shift was the growing influence of the neoconservatives, zealous cold warriors who came over from the Left and ‘‘took many of their conceptions of presidential government with them when they left the liberal fold.’’67 In 1974, the ‘‘godfather’’ of the neocons, Irving Kristol, charged (not without reason) that much of the ongoing liberal hostility toward the strong presidency should be understood as distrust of strong Republican presidents. In any event, Kristol wrote, the Imperial Presidency was ‘‘here to stay,’’ and there was ‘‘no reason why this latest version of the democratic republic shouldn’t be a reasonably decent form of government.’’68 

By the Reagan era, prominent conservatives were calling for a repeal of presidential term limits, and for scrapping various post-Watergate reforms that they believed had neutered the executive branch. The new conventional wisdom on the Right held that the real threat to separation of powers lay not in an Imperial Presidency, but in an Imperial Congress.69 In 1988, Rep. Newt Gingrich, then a mere backbencher with a gleam in his eye, contributed a foreword to the Heritage Foundation book of that name. In it, Gingrich quoted the Founders on the dangers of concentrating all powers within a single branch, and declared, ‘‘The 100th Congress approaches the despotic institution about which James Madison and Thomas Jefferson wrote.’’70 

The Post-Imperial Presidency? 

Whatever one thought of the trend toward congressional assertiveness, conservatives were right that in the immediate post-Watergate era, the presidency appeared much diminished. It was hard to maintain reverence for the office with Chevy Chase’s Gerald Ford stapling his ear, stabbing himself with a letter opener, and pratfalling all over the set of Saturday Night Live every week. Pundits and political scientists began to speak of the ‘‘post-imperial presidency.’’71 

Even after Ronald Reagan restored an air of competence and command to the office, many continued to lament the state of the American presidency—and conservatives still led the lamenters. After 1986, much of the Right’s ire focused on the separation-of-powers fight forced by the Iran-Contra affair. The Reagan administration provoked a constitutional crisis when it sold weapons to Iran in exchange for the release of hostages and then diverted some of the proceeds to the Nicaraguan Contras. In the process, the administration violated a clear statutory ban on ‘‘supporting, directly or indirectly, military or paramilitary operations in Nicaragua by any nation, group, organization, movement, or individual.’’72 However desirable it might have been to combat communist influence in the Western Hemisphere, defending the administration’s behavior was an odd stance for self-described constitutionalists to take. In Iran-Contra, the administration had attempted in secret to combine purse and sword within the executive branch, in defiance of the Framers’ insistence that those powers should never fall into the same hands. 

Conservatives also decried the War Powers Resolution as another instance of an ‘‘Imperial Congress’’ tying the president’s hands. Yet, it’s hard to understand why the WPR upset them so. By implicitly allowing the president to launch a war and prosecute it for at least 60 days, the resolution cedes more power to the president than the Constitution allows. Nor has any president felt much constrained by the law. Since its passage, the WPR has run aground on presidential intransigence and judicial unwillingness to enforce it.73 

Indeed, throughout the 1980s and 1990s, presidents made war more or less at will. On October 25, 1983, 48 hours after the truck bombing that killed 241 marines stationed in Lebanon, President Reagan ordered some 2,000 U.S. troops into the tiny island nation of Grenada, to overthrow a communist-aligned military government.74 In December 1989, his successor, George H. W. Bush, overthrew the Noriega government in Panama without congressional authorization, in the rather defensively titled ‘‘Operation Just Cause.’’ 

President Bush did secure congressional authorization for the 1991 Gulf War, yet for all intents and purposes, that authorization merely ratified the president’s unilateral decision. The president alone had made the decision to send U.S. troops into Saudi Arabia after Saddam Hussein’s invasion of Kuwait. He alone decided that Iraqi aggression would not stand, and insisted that no authorization was needed to send half a million Americans into battle. Dick Cheney, who had returned to the executive branch to serve as Bush’s secretary of defense, told the Senate Armed Services Committee in December 1990 that the president had all the constitutional power he required to expel Iraqi forces from Kuwait. In private, Cheney advised Bush that even asking for support conceded too much to Congress. (Years later, Cheney confirmed that even if Congress refused to authorize the war, he would have advised the president to go ahead anyway.)75 Cheney’s Pentagon fed the war fever with disinformation, warning that a quarter of a million Iraqi troops and 1,500 tanks were massed at the Saudi border, ready to invade. Yet, contemporaneous commercial satellite photos of the region purchased by the St. Petersburg Times told a different story. They showed nothing but desert in the areas where the Iraqi buildup was supposedly taking place.76 

Given the crisis atmosphere promoted by the administration, it’s surprising that the Gulf War vote was as close as it was; Congress authorized the use of force by votes of 250 to 183 in the House and 52 to 47 in the Senate. Following the passage of the use-of-force resolution, the president declared that ‘‘as a democracy, we’ve debated this issue openly and in good faith.’’77 The extent of that good faith can be judged by the president’s behavior on the campaign trail in 1992. At one appearance, he told a Texas audience, ‘‘I didn’t have to get permission from some old goat in Congress to kick Saddam Hussein out of Kuwait.’’78 As his successor’s behavior would show, that attitude was a bipartisan one. 

The Clinton Years: Arrogance of Power Redux 

Strange as it might now seem, opponents of the Imperial Presidency had reasons for cautious optimism upon Bill Clinton’s accession to the presidency in January 1993. As the first Democrat elected to the nation’s highest office in 16 years, the new president belonged to the political party that had since Watergate and Vietnam sought to rein in the executive’s ability to conduct foreign policy without congressional authorization and oversight. Clinton had come of age during Vietnam, a war he vehemently opposed, in part because it was undeclared. He began his political life working on the Senate Foreign Relations Committee for Senator Fulbright, who by then had become one of the Imperial Presidency’s sharpest critics.79 

Grandiose Visions of Leadership 

By the time Clinton took office in 1993, the prevailing rationale for the Imperial Presidency had vanished with the collapse of Soviet communism. Conditions were ideal for a more modest approach to presidential leadership, one that recognized constitutional limits to unilateral action. 

Of course, the Clintons retained the Progressive-Era fascination with the executive branch as the catalyst of moral leadership. On the campaign trail, Governor Clinton promised a ‘‘New Covenant’’ between the government and the governed—a metaphor that had the state stepping in for Yahweh. First Lady Hillary Rodham Clinton proclaimed that America suffered from ‘‘a sleeping sickness of the soul,’’ a deep existential angst stemming from our inability to redefine ‘‘who we are as human beings in this postmodern age.’’ To heal our spiritual wounds, we’d need ‘‘a new politics of meaning’’ engendered by bold executive action.80 

After the collapse of the Clinton Health Security Act and the Republican sweep of Congress in 1994, the ‘‘politics of meaning’’ gave way to the politics of the poll-tested micro-initiative, courtesy of presidential adviser Dick Morris. Faced with a legislative majority opposed to most of his policies, President Clinton also relied on executive orders to work his will. In 1998, after tobacco control legislation failed in the Senate, Clinton laid the groundwork for successful prosecution of the industry by ordering federal agencies to gather data on teen smoking habits. Later, he nationalized millions of acres of western land by executive fiat, over the objections of Congress and the state governors.81 

Arrogance Abroad 

In foreign affairs, President Clinton was able to operate with still fewer checks on his power. And if he had learned anything from his mentor Senator Fulbright’s critique of foreign policy crusades led by ‘‘high-minded men bent on the regeneration of the human race,’’ it didn’t show.82 In mid-1994, Clinton prepared to invade Haiti to restore ousted president Jean-Bertrand Aristide to power. He did so while asserting that he was not ‘‘constitutionally mandated’’ to get congressional approval for a 20,000-troop invasion of a tiny island nation that represented no threat, imminent or otherwise, to America’s security.83 Likewise, in 1994 Clinton unilaterally ordered air strikes in Bosnia and in 1995 ordered 20,000 troops there to enforce a peacekeeping agreement. 

Though some Republicans objected to Clinton’s usurpation of congressional prerogatives, their leadership, for the most part, did not. On June 7, 1995, the House narrowly voted down a bill introduced by Rep. Henry Hyde (R-IL) that would have repealed the War Powers Resolution. In endorsing the measure, then-Speaker Newt Gingrich urged the House Republicans to ‘‘increase the power of President Clinton. . . . I want to strengthen the current Democratic President because he is President of the United States.’’84 

But President Clinton had all the power he needed to conduct presidential wars. Operation Allied Force, the air war carried out over Serbia in 1999, was the largest commitment of American fighting forces and matériel since the Gulf War. As the first war since Vietnam to continue beyond 60 days without statutory authorization, it also demonstrated that repealing the War Powers Resolution would have been entirely superfluous.85 

U.S.-led NATO air forces flew over 37,000 sorties during the conflict, an average of 486 missions per day. But, echoing Harry Truman’s ‘‘police action’’ word games, administration officials refused to characterize U.S. actions as war. 

Given that the United States was dropping bombs on Serbia and its people, how could we be said not to be at war? White House spokesman Joe Lockhart explained that, much like the president’s definition of the word ‘‘is’’ before the Starr grand jury, it all depended on what your definition of the word ‘‘war’’ was: 

Q: Is the President ready to call this a low-grade war? 

Lockhart: No. Next question. 

Q: Why not? 

Lockhart: Because we view it as a conflict. 

Q: How can you say that it’s not war? 

Lockhart: Because it doesn’t meet the definition as we define it.86 

However you defined it, this large-scale application of military force wasn’t authorized by Congress. On April 28, 1999, the House voted no on declaring war, 427 to 2; no on authorizing the use of ground troops, 249 to 180; and no on authorizing the president to continue airstrikes, 213 to 213. Because the House voted down legislation that would have authorized the air war, this was not simply another war carried out amidst congressional silence. Congress had considered and rejected authorization—but Clinton continued in defiance of congressional will.87 As National Security Council spokesman David Leavy put it: ‘‘There’s broad support for this campaign among the American people, so we sort of just blew by’’ the House votes.88 

The end of the cold war should have brought the era of crisis government to a close. Yet, it did not end the president’s incentive to gin up emergencies when he finds himself in political trouble. President Clinton’s behavior during the Starr investigation and the impeachment debates makes that clear. On the day the president’s testimony before the Starr grand jury was released to the public, Clinton gave a speech—in the midst of a booming economy—proclaiming that the United States faced the greatest economic crisis in 50 years.89 

Wagging the Dog? 

Far more troubling were what some have called the ‘‘Wag the Dog’’ bombings, after the 1997 film starring Robert De Niro and Dustin Hoffman. In the movie, the Dick Morrisesque spin doctor played by De Niro diverts attention from a presidential sex scandal by enlisting a Hollywood producer (played by Hoffman) to create a fake war. Unlike the Hollywood version, though, the Washington production used real missiles. 

The third week of August 1998 was a tumultuous one for President Clinton. On Monday, he went on national television to admit his affair with Monica Lewinsky; the president’s non-apology wasn’t well received. On Thursday, with the media reporting that independent counsel Kenneth Starr had obtained a DNA sample from the president, and Lewinsky starting her second round of testimony before the grand jury, President Clinton ordered surprise missile strikes on Sudan and Afghanistan. 

The Sudan strike soon proved to be an early case of missing WMDs. The administration refused to release the evidence it claimed to have relied on for its assertion that the Sudanese pharmaceutical plant targeted in the strike manufactured nerve gas. Independent tests conducted by the head of Boston University’s chemistry department confirmed, contrary to the administration’s claims, that no nerve gas precursors could be found in the soil surrounding the factory.90 The Clinton administration later issued an order unfreezing the plant owner’s assets, rather than coming forward with evidence supporting the owner’s purported connection to Osama bin Laden. 

Absent the dubious timing, one might, with post-9/11 hindsight, see the missile strike as a laudable attempt to do something about a gathering threat. As it was, apart from shifting the news cycle toward less prurient matters, the administration managed only to knock over some empty tents in Afghanistan and wipe out an important source of medicine in a desperately poor country. 

If the timing of the Afghanistan and Sudan strikes was suspicious, the timing of the ‘‘Desert Fox’’ airstrikes on Iraq could hardly have been more so. The Desert Fox operation began on the eve of the House impeachment debate. President Clinton asserted that ‘‘we had to act and act now [because] without a strong inspections system, Iraq would be free to retain and begin to rebuild its chemical, biological, and nuclear weapons programs—in months, not years.’’91 However, as a direct result of the president’s action, we went nearly four years without any weapons inspection system, strong or otherwise. The inspectors withdrew shortly before the bombing and did not return until November 2002. The urgent need to reestablish inspections seemed to have vanished as soon as the threat of impeachment did.

The timing of President Clinton’s actions inevitably gave rise to suspicion about his motives. Some pundits found those suspicions distressingly cynical. Washington Post columnist David Broder professed to be shocked that then-Senate Majority Leader Trent Lott would question the timing of President Clinton’s attack on Iraq,92 and former Nixon speechwriter William Safire could not ‘‘bring [him]self to think’’ that a U.S. president would ‘‘stoop to risking lives to cling to power.’’93 

Is it really so cynical to suppose that embattled presidents might be tempted to distract the public by waging war abroad? Perhaps so, but only in the sense offered by Ambrose Bierce in his Devil’s Dictionary: ‘‘Cynic, n.: a blackguard whose faulty vision sees things as they are, not as they ought to be.’’94 

In 1995, the American Economic Review published an article examining the relationship between military conflict, national economic health, and the presidential election cycle from Eisenhower through Reagan. The authors postulated that conflict initiation or escalation would be more likely in the case of a first-term president up for reelection in the midst of a weak economy, then tested that prediction using data on military conflict and the business cycle. Their results were robust, to say the least; based on the data from 1953 to 1988, ‘‘the probability of conflict initiation or escalation exceeds 60 percent in years in which a president is up for reelection and the economy is doing poorly. By contrast, the probability is only about 30 percent in years in which either the economy is healthy or a president is not up for reelection.’’95 Beleaguered first-term presidents are about twice as likely to resort to the sword as second termers or boomtime leaders. The erosion of Congress’s power ‘‘to declare War’’ means that nothing stands in their way. 

The Framers, too, were cynics in the Biercean sense. They saw human nature for what it is, and rejected unchecked war power for that reason. As Madison put it, the power to start a war had been lodged in Congress because otherwise ‘‘the trust and the temptation would be too great for any one man.’’96 

Our Modern Dilemma 

By the last years of the 20th century, Americans were not a particularly trusting bunch. The long decline in confidence in government that started during the Vietnam era and bottomed out during Watergate had become a permanent feature of the political landscape. The numbers on the University of Michigan’s Trust in Government Index never came close to recovering their Camelot-era vigor; the trust-the-feds ‘‘most of the time’’ answer hit a new low of 19 percent in 1994 before the Republican takeover of Congress.97 Trust in the presidency saw a similar decline; those Americans investing ‘‘a great deal’’ of confidence in the executive branch fell from 42 percent in 1966 to 12 percent in 1997.98 

Some among the cognoscenti watched the trust indicators as if they were fading vital signs on a body politic in critical condition. Concern over low levels of faith in government periodically gave rise to solemn conferences at places like the Brookings Institution, the think tank that the Nixon administration had considered firebombing and burglarizing in an attempt to recover classified documents related to Vietnam.99 

Yet, it’s never been clear why a healthy—and, by the 1970s, manifestly justified—distrust of unchecked power should be cause for so much angst. That sort of distrust, after all, is the core of our political heritage. If their fellow citizens’ lack of faith in their leaders troubled late-20th-century bien pensants, one wonders what they would have made of the Founding Generation’s killjoy attitude. As Bernard Bailyn explains in The Ideological Origins of the American Revolution, ‘‘Federalists and antifederalists both agreed that man in his deepest nature was selfish and corrupt; that blind ambition most often overcomes even the most clear-eyed rationality; and that the lust for power was so overwhelming that no one should ever be trusted with unqualified authority.’’100 

Americans’ drift away from that perspective in the early postwar era served as a presidential enabler. Unwarranted trust had allowed unrestrained spying at home and disastrous presidential adventurism abroad. The recovery of our native skepticism helped restrain the former, even if it has not, as yet, had much effect on the latter. 

What’s problematic is that this resurgent skepticism exists side by side with inordinately high expectations for the office. None of Rossiter’s roles has passed to any other institutional actor or been abandoned as beyond the proper scope or competence of government. 

Father-Protector and National Nursemaid 

The post-Watergate president remained Rossiter’s Protector of the Peace, America’s guardian against everything from natural disasters to ordinary street crime. Despite the collapse of public trust, the president’s authority over disaster relief and crime continued to grow throughout the last three decades of the 20th century. 

In 1979, President Carter further centralized authority for responding to natural disasters, creating the Federal Emergency Management Agency by executive order, combining the responsibilities of various federal agencies under one heading on the bureaucratic chart. President Clinton bumped FEMA up to cabinet status in 1993, but the most significant change in presidential responsibility for natural disasters occurred in 1988 with the passage of the Stafford Act, which gave the president enormous discretion to issue disaster declarations and award federal aid as he pleases.101 

Demand for such aid is virtually limitless—‘‘In Texas they want a declaration every time a cow pisses on a flat rock,’’ one FEMA official groused in the mid-1990s.102 So it’s not surprising that unfettered presidential discretion has led to some dubious expenditures, as in 1996, when President Clinton funneled federal funds to 16 states affected by unseasonably heavy snow.103 Presidents have made liberal use of their Stafford Act powers to bolster their political support in electorally significant states. Political scientist Andrew Reeves studied disaster declarations from 1981 to 2004 and found that ‘‘a highly competitive state can expect to receive over 60% more presidential disaster declarations than an uncompetitive state, holding all else constant including the damage caused by the disaster.’’104 

The FEMA pork barrel allowed presidents to use the public purse as their personal campaign war chest; but it did not represent a threat to civil liberties. In contrast, the burgeoning war on crime, stoked by presidential promises to keep America’s streets safe, had by the last decades of the 20th century seriously undermined the rule of law. As the American Bar Association’s Task Force on the Federalization of Criminal Law put it in 1998, ‘‘So large is the present body of federal criminal law that there is no conveniently accessible, complete list of federal crimes.’’105 By the turn of the 21st century, there were over 4,000 federal crimes, an increase of one-third since 1980.106 As a result, even teams of legal researchers—let alone ordinary citizens—cannot reliably ascertain what federal law prohibits. Though the Constitution mentions only three federal crimes, in the 1970s, 1980s, and 1990s, presidential races increasingly focused on ‘‘law and order,’’ and presidential candidates promised new federal initiatives to keep America safe. 

And even after the shame of Watergate, presidents continued to view themselves as the Voice of the People, using the bully pulpit to stimulate demand for executive action on all matters of public concern. Anyone searching for limits to presidential power or responsibility would be hard pressed to find them in the speeches of post-Watergate presidents. As Elvin T. Lim noted in his 2002 study of presidential rhetoric, by the late 20th century, it was ‘‘all about the children,’’ with ‘‘Presidents Carter, Reagan, Bush, and Clinton [making] 260 of the 508 references to children in the entire speech database, invoking the government’s responsibility to and concern for children in practically every public policy area.’’ Granted, George Washington had mentioned children in his seventh annual message, protesting ‘‘the frequent destruction of innocent women and children’’ by Indian marauders.107 But in the modern State of the Union address, references to children have a different tenor, as when George H. W. Bush told the country in 1992 that ‘‘when Barbara [Bush] holds an AIDS baby in her arms and reads to children, she’s saying to every person in this country, ‘Family Matters,’ ’’ or when Bill Clinton used his 1997 State of the Union to declare, ‘‘We must also protect our children by standing firm in our determination to ban the advertising and marketing of cigarettes that endanger their lives.’’108 

I Hate You; Don’t Leave Me 

Vietnam, Watergate, and the revelations of the Church Committee had reminded Americans about power’s corrupting tendencies. Yet, as the 20th century drew to a close, Americans still seemed to want a president who promised all things to all people. Declining trust had not caused the public to demand less from government as a whole. As the Pew Research Center noted in a 1998 survey, ‘‘Public desire for government services and activism has remained nearly steady over the past 30 years.’’109 The Pew study featured intensive polling carried out between the Republican takeover of Congress and the Clinton impeachment, and it revealed some puzzling tensions in Americans’ attitudes toward government. Sixty-four percent of respondents agreed that ‘‘government controls too much of our daily lives.’’110 Yet, 65 percent also said the government did not pay enough attention to poor people, and 54 percent complained that even the middle class got less attention than it deserved.111 Overwhelming majorities also said that government did not place a high enough priority on ‘‘ensuring access to affordable health care,’’ ‘‘providing the elderly a decent standard of living,’’ ‘‘reducing poverty,’’ or ‘‘reducing juvenile delinquency.’’112 How such responsibilities could be fulfilled without further extension of government controls is a mystery beyond the ken of any pollster.

Wail to the Chief 

The demand for presidential salvation hit its rhetorical nadir in the 1992 presidential debates, when a ponytailed social worker named Denton Walthall rose to ask Ross Perot, Bill Clinton, and President Bush the following question: 

The focus of my work as a domestic mediator is meeting the needs of the children that I work with, by way of their parents, and not the wants of their parents. And I ask the three of you, how can we, as symbolically the children of the future president, expect the two of you, the three of you to meet our needs, the needs in housing and in crime and you name it. . . . 

‘‘You name it,’’ indeed. Walthall followed up by asking, 

Could we cross our hearts; it sounds silly here, but could we make a commitment? You know, we’re not under oath at this point, but could you make a commitment to the citizens of the United States to meet our needs, and we have many, and not yours. Again, I have to repeat that, it’s a real need, I think, that we all have.113 

Denton Walthall came in for a fair amount of criticism on the op-ed pages and talk-radio airwaves.114 Yet, under the hot lights, none of the candidates risked chastising him, however gently, for having an overly capacious view of presidential responsibility. Instead, they accepted his premise. Ross Perot said he’d take Walthall’s pledge, ‘‘no hedges, no ifs, ands and buts.’’ Governor Clinton argued with Perot about who was more authentic and less dependent on ‘‘spin doctors,’’ and noted that as governor, he’d ‘‘worked 12 years very hard . . . on the real problems of real people.’’ ‘‘It depends on how you define it,’’ President George H. W. Bush stammered in reply to Walthall, . . .

I mean I—I think, in general, let’s talk about these—let’s talk about these issues; let’s talk about the programs, but in the Presidency a lot goes into it. Caring is—goes into it; that’s not particularly specific; strength goes into it, that’s not specific; standing up against aggression, that’s not specific in terms of a program. So I, in principle, I’ll take your point and think we ought to discuss child care, or whatever else it is.115 

Indeed, Walthall’s formulation of the American people as ‘‘symbolically the children of the future president’’ is not far off from how presidents and presidential aspirants—whether of the ‘‘mommy party’’ or ‘‘daddy party’’ variety—in their franker moments describe the relationship between the government and the governed. ‘‘The average American is just like the child in the family,’’ Richard Nixon told an interviewer in 1972, ‘‘you give him some responsibility and he is going to amount to something.’’116 In 1997, then Vice President Al Gore told an audience at George Washington University that the federal government should act ‘‘like grandparents in the sense that grandparents perform a nurturing role.’’117 

One has difficulty imagining a Grover Cleveland or a Calvin Coolidge in a late-20th-century town hall–style debate, perched awkwardly on a stool, trying to look relaxed and amiable. But forced into such an undignified posture, if they restrained themselves from insulting the ponytailed fellow burbling about national needs and likening Americans to children, one can picture a Cleveland or a Coolidge giving a far more modest description of the president’s constitutional responsibilities: execute the laws, defend the Constitution, protect the country from foreign attack and domestic insurrection—and little else.

In the context of the modern presidency, though, such an answer would make little sense. President Bush’s halting reply to Denton Walthall can’t be blamed merely on the pressure of the moment or on the Bush family’s notorious difficulty with words. Presidential responsibility in the modern era really is that diffuse and unconfined. ‘‘Caring,’’ ‘‘standing up against aggression,’’ ‘‘child care and whatever else it is,’’ are a decent approximation of the modern president’s job description. The president, as Clinton Rossiter put it in The American Presidency, is expected ‘‘to watch like a mother hen over all the eggs in all our baskets,’’ and perhaps, as presidential responsibility has expanded over the four decades since Rossiter’s observation, to provide us with still more eggs.118 Chief Legislator, Manager of Prosperity, shield against disaster, defender of the free world, living embodiment of the general will—the burden of these expanded functions, Rossiter noted, ‘‘is monstrous.’’119 

With Great Responsibility Comes Great Power 

Monstrous, yes—and dangerous. No one man, however powerful, can meet responsibilities so vast. Thus, we should not be surprised that presidential approval ratings have been in a steady 40-year decline.120 The office, as it has evolved, is set up to fail. Worse, the incentives for the officeholder are to seek still more power as a result of the failure. 

Surveying the pedagogical materials of the late 1960s, political scientist Thomas Cronin announced that the president described in America’s textbooks was ‘‘Superman.’’ Nearly 40 years later, Americans no longer fully believe in the heroic president. Yet, the president’s job description still requires a superhero. And to reverse the credo of another comic book hero: with great responsibility comes great power. If the president is charged with righting all the country’s and the world’s wrongs, he’s going to seek the vast power needed to discharge those responsibilities. In peacetime, he’ll ask for that power; faced with an emergency, he may seize it. 

‘‘War is the health of the state,’’ wrote Randolph Bourne as the Great War raged across Europe, and America slipped toward entanglement in that vast continental tragedy. Throughout the 20th century, real wars and ersatz wars on various social maladies— crime, domestic subversion, poverty, drugs—validated Bourne’s dictum, delivering enormous power to government in general and the presidency in particular. In time of crisis, real or imagined, presidential responsibility has relentlessly expanded, as Americans have turned to the president for deliverance. 

Crisis was far from the public mind in the bright early fall of 2001. Americans followed the hunt for Chandra Levy, wondered about Rep. Gary Condit, and watched their new president’s difficulties with a Senate that had recently lost its Republican majority. Politics had rarely seemed so pleasantly inconsequential. 

In one terrible morning, all that would change.


source and footnotes for chapter 4

https://www.cato.org/sites/cato.org/files/documents/cult-of-the-presidency-pb.pdf

