A companion to the ten-part documentary series outlines provocative arguments against official American historical records to reveal the origins of conservatism and the obstacles to progressive change.
Too many errors. High school history plus some bad bits - a REAL limited hangout.
Many incorrect claims, without discussion, reference, or proof, such as the Japanese attack on Pearl Harbor being a surprise, despite the intercepted cables (all but one) being in the public domain.
And repeats the state's moronic propaganda line on 9/11.
Academy Award winner Oliver Stone and historian Peter Kuznick reveal the other side of the history of the United States, analyzing the great events that, from the Civil War to the present day, have shaped the "American Century" through a critical and constructive lens. The result is a book that questions the official narrative transmitted inside and outside the superpower's borders, focusing on the mistakes, because the great successes have already been glorified, that have marked the history of the United States and, therefore, of the world.
World War I, the New Deal, the atomic bomb, the Kennedy assassination, Reagan's arms race, 9/11, Obama's rise to power... these are just some of the major milestones the authors revisit and examine. Because, as they themselves state in the introduction: "We are slaves to our conception of the past and rarely realize the extent to which that way of understanding history determines our behavior here and now. Our understanding of history defines our sense of what is conceivable, of what is achievable."
Even if this book is only 50% true it is still horrifying.
How a nation that has set itself up as the keeper of the peace and the saviour of smaller nations can have caused such death and hardship to all these people. All the while we were reading the news, but did not know the news behind the news.
The biggest tragedy is the last chapter, about Barack Obama, who was elected with the people's hope that the US would cease sending its young men to die in war, but who is powerless against the moneymakers who really run the country.
This book and the documentary film series it is based on challenge the basic narrative of U.S. history that most Americans have been taught. That popular and somewhat mythic view, carefully filtered through the prism of American altruism, benevolence, magnanimity, exceptionalism, and devotion to liberty and justice, is introduced in early childhood, reinforced throughout primary and secondary education, and retold so often that it becomes part of the air that Americans breathe. It is consoling; it is comforting. But it only tells a small part of the story. It may convince those who don’t probe too deeply, but like the real air Americans breathe, it is ultimately harmful, noxious, polluted. It not only renders Americans incapable of understanding the way much of the rest of the world looks at the United States, it leaves them unable to act effectively to change the world for the better. For Americans, like people everywhere, are in thrall to their visions of the past, rarely realizing the extent to which their understanding of history shapes behavior in the here and now. Historical understanding defines people’s very sense of what is thinkable and achievable. As a result, many have lost the ability to imagine a world that is substantially different from and better than what exists today.
Quotations
That Obama refuses to trumpet the notion that the United States is history’s gift to humanity has become an article of faith among Republican leaders who, knowing that 58 percent of Americans believe that “God has granted America a special role in human history,” have opportunistically used Obama’s less-than-full-throated assent to bludgeon him. Former Arkansas Governor Mike Huckabee charged that Obama's “worldview is dramatically different than any president, Republican or Democrat, we’ve had. . . . He grew up more as a globalist than an American. To deny American exceptionalism is in essence to deny the heart and soul of this nation.”
Historians have long since discredited the myth that revulsion caused by the war and European entanglements plunged the United States into isolationism in the 1920s. In fact, World War I marked the end of European dominance and the ascendancy of the United States and Japan, the war’s two real victors. The twenties saw a rapid expansion of American business and finance around the globe. New York replaced London as the center of world finance. The era of U.S. domination of the world economy had now begun. Among the leaders in this effort were the oil companies.
In fact, Harding and his Republican successors made more friends among U.S. bankers than among the inhabitants of those little republics. In May 1922, The Nation reported, revolutionaries sparked an uprising against “Brown Bros.’ extremely unpopular President of Nicaragua.” When the revolutionaries captured a fort overlooking the capital, the U.S. marine commander simply alerted them that he would use artillery if they didn't relinquish control. The Nation saw this as typical of what was happening throughout Latin America, where U.S. bankers ruled through puppet governments backed up by U.S. troops. The magazine inveighed against this deplorable situation:
There are, or were, twenty independent republics to the south of us. Five at least—Cuba, Panama, Haiti, Santo Domingo, and Nicaragua—have already been reduced to the status of colonies with at most a degree of rather fictitious self-government. Four more—Guatemala, Honduras, Costa Rica, and Peru—appear to be in process of reduction to the same status. Mr. Hughes is not treating Mexico as a sovereign, independent state. How far is this to go?... Is the United States to create a great empire in this hemisphere—an empire over which Congress and the American people exercise no authority, an empire ruled by a group of Wall Street bankers at whose disposal the State and Navy Departments graciously place their resources? These are the questions which the people, the plain people whose sons die of tropic fever or of a patriot’s bullet, have a right to ask.
By the early 1920s, the America of Jefferson, Lincoln, Whitman, and the young William Jennings Bryan had ceased to exist. It had been replaced by the world of McKinley, Teddy Roosevelt, J. Edgar Hoover, and Woodrow Wilson. Wilson’s failures, in many ways, provide a fitting capstone to a period in which the United States' unique mixture of idealism, militarism, avarice, and realpolitik propelled the nation toward becoming a world power. Wilson proclaimed, “America is the only idealistic nation in the world” and acted as if he believed it were true. He hoped to spread democracy, end colonialism, and transform the world. His record is much less positive. While supporting self-determination and opposing formal empire, he intervened repeatedly in other nations’ internal affairs, including Russia, Mexico, and throughout Central America. While encouraging reform, he maintained a deep mistrust of the kind of fundamental, and at times revolutionary, change that would actually improve people’s lives. While championing social justice, he believed that property rights were sacrosanct and must never be infringed upon. Though endorsing human brotherhood, he believed that nonwhites were inferior and resegregated the federal government. While extolling democracy and the rule of law, he oversaw egregious abuses of civil liberties. While condemning imperialism, he sanctioned the maintenance of the global imperial order. And while proclaiming a just, nonpunitive peace, he acquiesced in a harsh, retributive peace that inadvertently helped create the preconditions for the rise of Hitler and the Nazis. Wilson’s stunningly inept performance at Versailles and his combative intransigence upon his return home contributed to Senate defeat of the treaty and the League.
Thus the war would have consequences that went far beyond the horrors on the battlefield. The United States never joined the League of Nations, rendering that body impotent in the face of Fascist aggression in the 1930s. Revelations that the United States had entered the First World War on false pretenses, while bankers and munitions manufacturers—later labeled “merchants of death”—had raked in huge profits, created widespread skepticism about foreign involvements at a time when the United States needed to contend with a real “axis of evil”: Germany, Italy, and Japan. By the time the United States acted, it was much too late. The necessity of finally combating fascism would, however, afford the United States an opportunity to reclaim some of that democratic, egalitarian heritage on which its earlier greatness and moral leadership had rested. And, though late in entering World War II, the United States provided crucial assistance in defeating Europe’s fascists and played the decisive role in defeating Japan’s militarists. But by setting off the atomic bombs in Hiroshima and Nagasaki at the end of the war, the United States, once again, proved itself unready to provide the kind of leadership a desperate world cried out for.
The results seemed to justify that description. While the United States and the rest of the capitalist world plunged deeper into depression, the Soviet economy appeared to be booming. In early 1931, the Christian Science Monitor reported that not only was the Soviet Union the only country to have escaped the Depression, its industrial production had jumped an astronomical 25 percent the previous year. In late 1931, The Nation’s Moscow correspondent described the Soviet frontier as “a charmed circle which the world economic crisis cannot cross. . . . While banks crash . . . abroad, the Soviet Union continues in an orgy of construction and national development.” The Nation could be dismissed as a liberal publication, but similar reports in Barron’s, Business Week, and the New York Times were harder to disregard. As the U.S. unemployment rate approached 25 percent, a Times report that the Soviet Union intended to hire foreign workers caused desperate jobless Americans to stampede Soviet offices in the United States. Despite official Soviet disclaimers, Business Week reported that the Soviets planned to import 6,000 Americans and that 100,000 had applied. Soviet society seemed to be undergoing an incredible transformation from agrarian backwardness to industrial modernization before people’s eyes.
The first investigation was short-lived but shocking in its implications. In November 1934, highly decorated retired Marine General Smedley Butler told the House Special Committee on Un-American Activities that William Doyle, commander of the American Legion’s Massachusetts branch, and bond salesman Gerald MacGuire had tried to recruit him to organize a military coup against the Roosevelt administration. Paul Comly French, a reporter for the New York Evening Post and the Philadelphia Record, corroborated Butler’s account, testifying that he overheard MacGuire say at one point, “We need a Fascist government in this country to save the Nation from the Communists who want to tear it down and wreck all that we have built in America. The only men who have patriotism to do it are the soldiers and Smedley Butler is the ideal leader. He could organize one million overnight.” MacGuire had gone to France to study fascist veterans’ movements, which he envisioned as a model for the force Butler could organize in the United States.
The Nye Committee investigations showed that Wilson had, in effect, lied the country into war. He had undermined neutrality by allowing loans and other support to the Allies, deliberately exaggerated claims of German atrocities, and covered up the fact of his knowledge of the secret treaties. Far from being a war to further democracy, it had been a war to redivide the spoils of empire.
Ford and [head of IBM, Thomas] Watson should both have known better. In 1937, Ford’s German subsidiary was manufacturing heavy trucks and troop carriers for the German Wehrmacht. In July 1939, the subsidiary changed its name to Ford-Werke. Farben, which was later convicted of crimes against humanity for operating the Buna rubber plant at Auschwitz and supplying the notorious Zyklon-B tablets used to exterminate Jews, owned 15 percent of the company. When the war started in 1939, Ford and GM still controlled their German subsidiaries, which dominated the German auto industry. Despite their subsequent disclaimers, they refused to divest themselves of their German holdings and even complied with German government orders to retool for war production, while resisting similar demands from the U.S. government to retool their factories at home. Sloan justified such behavior in March 1939, following the Nazi occupation of Czechoslovakia, based on the fact that the German operations were “highly profitable.” Germany's internal politics, he insisted, “should not be considered the business of the management of General Motors.” Opel converted the 432-acre complex in Rüsselsheim to production of Luftwaffe warplanes, providing fully 50 percent of the propulsion systems for Germany's Ju-88 medium-range bombers while also helping to develop the world's first jet fighter, the Me-262, which was capable of speeds a hundred miles per hour faster than the United States’ P-51 Mustangs. In appreciation of their efforts, the Nazi government decorated Henry Ford with the Grand Cross of the German Eagle in 1938, four months after Germany had annexed Austria, and similarly honored James D. Mooney, GM's chief overseas executive, one month later. Ford’s parent company lost effective control of Ford-Werke during the war years, when the subsidiary supplied the regime with arms, employing prisoners from the nearby Buchenwald concentration camp as slave labor.
When a former prisoner, Elsa Iwanowa, brought suit against the company in 1998, the Ford Motor Company hired a small army of researchers and lawyers to whitewash its unscrupulous behavior and promote its preferred image as part of the “arsenal of democracy.” Just after the war, however, a report by U.S. Army investigator Henry Schneider called Ford-Werke an “arsenal of Nazism.” And, as Bradford Snell discovered during his congressional investigation into the monopolistic practices of the auto industry, through “their multinational dominance of motor vehicle production, GM and Ford became principal suppliers for the forces of fascism as well as for the forces of democracy.”
Prominent among the American capitalists with ties to Nazi counterparts was Prescott Bush, the father of one president and grandfather of another. Researchers have been trying for years to determine the precise nature of Bush’s ties to Fritz Thyssen, the wealthy German industrialist who played a crucial role in bankrolling Hitler, as revealed in his 1941 memoirs I Paid Hitler. Thyssen ultimately repudiated the Nazi dictator and was himself imprisoned.
While Thyssen was incarcerated, his vast wealth was protected overseas, much of it by the investment firm of Brown Brothers Harriman, through the holding company Union Banking Corporation. The account was managed by senior partner Prescott Bush. In 1942, the U.S. government seized Union Banking Corporation under the Trading with the Enemy Act for its association with the Thyssen-owned Bank voor Handel en Scheepvaart NV of Rotterdam. The government also seized four other Thyssen-linked companies whose accounts Bush handled: the Holland-American Trading Company, the Seamless Steel Equipment Corporation, the Silesian-American Corporation, and the Hamburg-Amerika Line shipping company.
The international situation deteriorated further in 1938 when the Germans annexed Austria and the Allies capitulated to Hitler at Munich, giving Germany the Sudetenland in Czechoslovakia. British Prime Minister Neville Chamberlain infamously proclaimed that the settlement had brought “peace in our time.” Roosevelt knew better. The British and French, he insisted, had abandoned the helpless Czechs and would “wash the blood from their Judas Iscariot hands.” But Roosevelt also knew that the United States itself was offering little support to those who wanted to stand up to the Nazi dictator. Nor did the United States do enough to help Germany and Austria’s desperate Jewish communities. In 1939 the United States admitted its full quota of 27,300 German and Austrian immigrants—the only year in which it did so. But with hundreds of thousands of Jews seeking refuge, U.S. assistance proved woefully inadequate. And Roosevelt made no effort to raise the low quota established by discriminatory immigration legislation in 1924.
Hitler struck again in March 1939, invading Czechoslovakia. Stalin understood that the Soviet Union’s turn was coming soon. For years the Soviet dictator had implored the West to unite against Hitler and Mussolini. The Soviet Union even joined the League of Nations in 1934. But Soviet pleas for collective security against the fascist aggressors were repeatedly ignored. Following Hitler’s assault on Czechoslovakia, Stalin again urged England and France to join in defense of Eastern Europe. His entreaties fell on deaf ears.
Few people believed that the Soviets could withstand the Nazi onslaught. The U.S. Army calculated that they could hold on for no more than three months and might even fold in four weeks. Roosevelt and Churchill desperately sought to keep the Soviets in the war, knowing that Great Britain’s survival might depend on it. Swallowing his long-standing hatred of communism, Churchill pledged support for the Soviet Union and urged his allies to do the same. He promised “to destroy Hitler and every vestige of the Nazi regime.” Acting Secretary of State Sumner Welles issued a statement on behalf of the president indicating that material assistance to the Soviet Union might be forthcoming but left the question of lend-lease up in the air for the time being. Some tried to nip that idea in the bud. Missouri Senator Harry Truman fanned the flames of mistrust toward the Soviet Union, recommending, “If we see that Germany is winning we ought to help Russia, and if Russia is winning we ought to help Germany, and that way let them kill as many as possible.”
That afternoon Truman met with his old Senate mentor James F. “Jimmy” Byrnes. Admitting his abject ignorance, Truman implored Byrnes to tell him about everything “from Tehran to Yalta” and “everything under the sun.” Because Byrnes had been part of the U.S. delegation at Yalta, Truman assumed he had accurate knowledge about what transpired. It would be many months before Truman discovered that that was not the case. In this and subsequent meetings, Byrnes reinforced Stettinius’s message that the Soviets were breaking the Yalta Agreement and that Truman needed to be resolute and uncompromising with them. He also gave Truman his first real briefing about the atomic bomb, which, he conjectured, “might well put us in a position to dictate our own terms at the end of the war.” He did not specify exactly to whom the United States would be dictating terms. Truman so trusted Byrnes that he made clear his intention to appoint him secretary of state as soon as Stettinius had gotten the United Nations off the ground. Truman’s close friend and appointments secretary Matthew Connelly later wrote, “Mr. Byrnes came from South Carolina and talked to Mr. Truman and immediately decided that he would take over. Mr. Truman to Mr. Byrnes, I’m afraid, was a nonentity, as Mr. Byrnes thought he had superior intelligence.” Superior intelligence, perhaps, but, between this unlikely pair, who would do so much to shape the postwar world, Truman had more formal education, having at least graduated from high school, whereas Byrnes had dropped out at age fourteen.
The Potsdam meeting, though amicable on the surface, would prove a setback to long-term cooperation. News of the successful atomic bomb test convinced Truman that the United States could get along just fine without catering to Soviet concerns, and his behavior toward Stalin conveyed that message. On his way back from Potsdam on the USS Augusta, he told a group of officers that it didn’t matter if the Soviets were obstinate “because the United States now had developed an entirely new weapon of such force and nature that we did not need the Russians—or any other nation.”
William Stephenson, head of British intelligence in New York, even deployed Roald Dahl to spy on Wallace when the RAF lieutenant and future writer was posted to Washington, D.C. In 1944, Dahl got hold of a draft of [Vice President Henry] Wallace’s forthcoming pamphlet “Our Job in the Pacific.” What he read, he said, “made my hair stand on end.” Wallace called for the “emancipation of . . . colonial subjects” in British India, Malaya, and Burma, French Indochina, the Dutch East Indies, and many small Pacific islands. Dahl secreted the manuscript out of Wallace’s friend’s home and rushed it to British officials to copy and transmit to British intelligence and Churchill. “I was later told,” Dahl reminisced, “that Churchill could hardly believe what he was reading.” Wallace wrote in his diary that “the entire British Secret Service was shaking with indignation as well as the British Foreign Office.” British leaders pressured Roosevelt to censure and part ways with his vice president. Stephenson remarked, “I came to regard Wallace as a menace and I took action to ensure that the White House was aware that the British Government would view with concern Wallace’s appearance on the ticket at the 1944 presidential elections.” Dahl, whose main job in Washington was monitoring Wallace’s activities—the two regularly walked and played tennis together—said his “friend” was “a lovely man, but too innocent and idealistic for this world.”
[Vice President Henry] Wallace’s acclaim made it even more urgent for his detractors to make their move. Knowing that Roosevelt’s failing health meant he would not survive a fourth term, the party bosses decided to oust Wallace from the ticket and replace him with someone more amenable to the party’s conservative factions. In 1944, they staged what was known among insiders as “Pauley’s coup,” named after Democratic Party Treasurer and oil millionaire Edwin Pauley. Pauley had once quipped that he went into politics when he realized that it was cheaper to elect a new Congress than to buy up the old one. Pauley’s co-conspirators included Edward Flynn of the Bronx, Mayor Edward Kelly of Chicago, Mayor Frank Hague of Jersey City, Postmaster General and former Party Chairman Frank Walker, Party Secretary George Allen, and national Democratic Party Chairman Robert Hannegan.
After going through the list of potential candidates, the party bosses chose undistinguished Missouri Senator Harry Truman to replace Wallace. They picked Truman not because he had any substantial qualifications for the position but because he had been sufficiently innocuous as a senator that he had made few enemies and he could be counted on not to rock the boat. They gave little, if any, thought to the attributes that would be necessary to lead the United States and the world in the challenging times ahead, when decisions would be made that would shape the course of history. Thus Truman’s ascent to the presidency, like much of his career, was a product of backroom deal-making by corrupt party bosses.
Party leaders made sure they had an iron grip on the convention. Yet the rank-and-file Democrats would not go quietly, staging a rebellion on the convention floor. The groundswell of support for [Vice President Henry] Wallace among the delegates and attendees was so great that despite the bosses’ stranglehold over the proceedings and strong-arm tactics, Wallace’s supporters almost carried the day as an uproarious demonstration for Wallace broke out on the convention floor. In the midst of the demonstration, Florida Senator Claude Pepper realized that if he got Wallace’s name into nomination that night, Wallace would sweep the convention. Pepper fought his way through the crowd to get within five feet of the microphone when the nearly hysterical Mayor Kelly, purporting that there was a fire hazard, got the chairman, Senator Samuel Jackson, to adjourn the proceedings. Had Pepper made it five more feet and nominated Wallace before the bosses forced adjournment against the will of the delegates, Wallace would have become president in 1945 and the course of history would have been dramatically altered. In fact, had that happened, there might have been no atomic bombings, no nuclear arms race, and no Cold War. Wallace was far ahead in the initial balloting. But the bosses further restricted admission and made the requisite backroom deals. Truman finally prevailed on the third ballot. Ambassadorships, postmaster jobs, and other positions were offered. Cash payoffs were made. Bosses called every state chairman, telling them that the fix was in and Roosevelt wanted the Missouri senator as his running mate. On Roosevelt’s urging, Wallace agreed to remain in the cabinet as secretary of commerce.
By February 25, 1942, the FBI had incarcerated all adult males of Japanese ancestry on Terminal Island, California. The U.S. Navy gave all other residents of Japanese ancestry forty-eight hours to clear out. Between March and October 1942, the Wartime Civil Control Administration (WCCA) opened temporary camps, known as assembly centers, to hold Japanese inmates, who were registered and given numbers. In Santa Anita and Tanforan, California, families were housed in horse stables, a single stall accommodating five or six people. They were later moved to more permanent relocation centers, referred to at the time as “concentration camps”. Conditions in the camps were deplorable; they often lacked running water, bathroom facilities, decent schools, insulated cabins, and proper roofs. The camps did, however, have adequate barbed-wire fencing, machine-gun installations, and guard towers. Appalled by the treatment of the prisoners, Milton Eisenhower resigned as director of the War Relocation Authority (WRA).
Some westerners were motivated by greed in supporting the evacuations. Because evacuees were allowed to take away only what they could carry, their former neighbors eagerly bought their property at a fraction of its real value or seized what was left behind, including abandoned crops. A leader of the Grower-Shipper Vegetable Association of Central California admitted, “We’re charged with wanting to get rid of the Japs for selfish reasons. We might as well be honest. We do. It’s a question of whether the white man lives on the Pacific Coast or the brown man.” The Japanese lost an estimated $400 million in personal property—worth perhaps $5.4 billion today.
Truman was dining on board the USS Augusta on his way back from Potsdam when he learned of Hiroshima. He jumped up and exclaimed, “This is the greatest thing in history.” He shortly thereafter said that announcing the news of Hiroshima was the “happiest” announcement he had ever made.
Truman’s reported jubilation made some uncomfortable. One Democratic committeeman admonished him by telegram two days later: “no president of the United States could ever be jubilant over any device that could kill innocent human beings. Please make it clear that it is not destruction but the end of destruction that is the cause of jubilation.”
After the war, Japanese leaders attributed the surrender to both the atom bombs and the Soviet invasion. Although the interviews were carried out by U.S. occupation authorities, several still gave primacy to the Soviet invasion—not the atomic bomb or other U.S. actions. Deputy Chief of Staff General Torashiro Kawabe explained:
It was only in a gradual manner that the horrible wreckage which had been made of Hiroshima became known. . . . In comparison, the Soviet entry into the war was a great shock when it actually came. Reports reaching Tokyo described Russian forces as “invading in swarms.” It gave us all the more severe shock and alarm because we had been in constant fear of it with a vivid imagination that “the vast Red Army forces in Europe were now being turned against us.”
Admiral Toyoda agreed: “I believe the Russian participation in the war against Japan rather than the atom bombs did more to hasten the surrender.” Lieutenant General Sumihisa Ikeda, the director of Japan's General Planning Agency, said that “upon hearing of the Soviet entry into the war, I felt that our chances were gone.” The Army Ministry responded similarly to a direct question from General Headquarters, stating, “The Soviet participation in the war had the most direct impact on Japan's decision to surrender.” A study conducted by the U.S. War Department in January 1946 came to the same conclusion, finding “little mention ... of the use of the atomic bomb by the United States in the discussions leading up to the . . . decision . . . it [is] almost a certainty that the Japanese would have capitulated upon the entry of Russia into the war.”
Nor did dropping atomic bombs on Hiroshima and Nagasaki make the Soviet Union more pliable. It merely convinced Stalin that the United States would stop at nothing to impose its will and that the Soviets must speed the development of their own atomic bomb as a deterrent to the bloodthirsty Americans.
And in what many consider a cruel irony, the United States allowed Japan to keep the emperor, whose retention, most experts believed, was essential to postwar social stability. Contrary to Byrnes’s admonitions, Truman suffered no political repercussions from that decision.
The nuclear arms race that Szilard and others feared was now under way. Truman had helped make real his nightmarish vision of a world poised on the brink of annihilation. Stimson made the same point in his 1947 defense of the bombing, writing, “In this last great action of the Second World War we were given final proof that war is death. War in the twentieth century has grown steadily more barbarous, more destructive, more debased in all its aspects. Now, with the release of atomic energy, man’s ability to destroy himself is very nearly complete.”
The early Cold War would be animated by the clash between two fundamentally different visions of the U.S. role in the world—Henry Luce’s hegemonic vision of the twentieth century as the “American Century” and Henry Wallace’s utopian vision of the “Century of the Common Man.” The stakes would be enormous.
Having just returned from a conference on atomic energy at the University of Chicago, [Henry A.] Wallace understood the real stakes better than Truman and other administration officials. The experts agreed that whatever secret there might have been to the atomic bomb had vanished when the United States dropped the first one on Hiroshima. They also knew, as the Franck Committee had warned in June, that the Soviet Union would soon develop its own atomic arsenal. The scientists in attendance drove home the fact that the current generation of atomic weapons paled by comparison to what would soon be available. Therefore, they concluded, steps to curb an arms race were essential and urgent. Wallace had told the gathering that “any nation that violates the international moral law, sooner or later gets into trouble—the British have done that in relation to colonial peoples and the United States [is] in danger of doing it with the atomic bomb.” He conveyed that same message to his fellow cabinet members.
On December 9, 1950, MacArthur requested authorization to use atomic bombs at his discretion. On December 24, he submitted a list of twenty-six targets. He also requested four bombs to drop on invading forces and four more for “critical concentrations of enemy air power.” He calculated that dropping thirty to fifty atomic bombs “across the neck of Manchuria” could produce “a belt of radioactive cobalt” that would win the war in ten days. But that was just the short-term effect. The belt of radioactive cobalt would spread “from the Sea of Japan to the Yellow Sea.” Therefore, he figured, “For at least 60 years there could have been no land invasion of Korea from the North.”
During his time in office, Eisenhower would be confronted with repeated opportunities to roll back the Cold War and arms race. Presiding over the world’s most powerful nation during perhaps the tensest extended period in history, he could have taken bold action to put the world on a different path. Signs emanating from Moscow indicated that the Kremlin might be ready to change course. But because of ideology, political calculations, the exigencies of a militarized state, and limited imagination, he repeatedly failed to seize the opportunities that emerged. And although he deserves credit for avoiding war with the Soviet Union at a time when such a war seemed quite possible, he left the world a far more dangerous place than when he first took office.
Eisenhower prepared for their use by transferring control of the atomic stockpile from the AEC [United States Atomic Energy Commission] to the military. Truman had transferred nine weapons to Guam in 1951 but had otherwise insisted on retaining civilian control. He said he did not want “to have some dashing lieutenant colonel decide when would be the proper time to drop one.” Eisenhower had no such compunctions. In June 1953, he began transferring atomic bombs from the AEC to the Defense Department to enhance operational readiness and protect them from surprise Soviet attack. In December 1954, he ordered 42 percent of atomic bombs and 36 percent of hydrogen bombs deployed overseas, many menacingly close to the Soviet Union. By 1959, the military had custody of more than 80 percent of U.S. nuclear weapons.
To counter this pervasive anti-nuclear sentiment, the NSC’s [United States National Security Council] Operations Coordinating Board proposed that the United States launch a “vigorous offensive on the non-war uses of atomic energy” and offer to build Japan an experimental nuclear reactor. AEC Commissioner Thomas Murray applauded this “dramatic and Christian gesture,” believing it “could lift all of us far above the recollection of the carnage” of Hiroshima and Nagasaki. The Washington Post offered its own hearty endorsement, seeing the project as a way to “divert the mind of man from his present obsession with the armaments race” and added, in an extraordinary admission, “Many Americans are now aware . . . that the dropping of the atomic bombs on Japan was not necessary. ... How better to make a contribution to amends than by offering Japan the means for the peaceful utilization of atomic energy. How better, indeed, to dispel the impression in Asia that the United States regards Orientals merely as cannon fodder!”
In what would seem the cruelest irony yet, Murray and Illinois Representative Sidney Yates proposed building the first nuclear power plant in Hiroshima. In early 1955, Yates introduced legislation to build a 60,000 kilowatt generating plant in the city that less than a decade earlier had been the first target of the atomic bomb.
Later in the decade, the air force devised even more grandiose schemes. Testifying before the House Armed Services Committee in February 1958, Lieutenant General Donald Putt disclosed plans for missile bases on the moon. Putt explained, “Warheads could be catapulted from shafts sunk deep into the moon’s surface,” providing “a retaliation base of considerable advantage over earthbound nations” if the United States were militarily destroyed. An enemy wanting to take out those bases prior to attacking on earth would “have to launch an overwhelming nuclear attack against those bases one to two days prior to attacking the continental United States,” clearly signaling that such an attack was coming. Air Force Assistant Secretary Richard Horner later testified that such bases could break a nuclear stalemate on earth and restore the United States’ first-strike capability. Putt added that if the Soviets established their own moon bases to neutralize the United States' advantage, the United States could erect bases on more distant planets from which it could retaliate against both the Soviet Union and its moon bases. In assessing those plans, the independent journalist I. F. Stone astutely noted that the Latin word for “moon” is luna and suggested that the military establish a fourth branch for space warfare and call it the Department of Lunacy.
Yet under Eisenhower the United States went from having a little more than 1,000 nuclear weapons to approximately 22,000, aimed at 2,500 targets in the Soviet Union. But even the 22,000 figure is misleading. Procurements authorized by Eisenhower continued into the 1960s, making Eisenhower responsible for more than 30,000 nuclear weapons during the Kennedy administration. Between 1959 and 1961, the United States added 19,500 nuclear weapons to its arsenal. The United States was producing new weapons at the rate of 75 per day and doing so at bargain-basement prices. As Pulitzer Prize-winning author Richard Rhodes notes, “Nuclear warheads cost the United States about $250,000 each: less than a fighter-bomber, less than a missile, less than a patrol boat, less than a tank.” Total megatonnage increased sixty-five-fold in five years, reaching 20,491 megatons in 1960. In pure megatonnage, that was the equivalent of 1,360,000 Hiroshima bombs. Although the total megatonnage began to drop in 1961, as 950 10-megaton B36 bombs were retired, the bombs’ destructive capability actually increased as the introduction of ballistic missiles made targeting more accurate. Doubling the accuracy of delivery allows for an eightfold reduction in yield without sacrificing the bombs’ destructive capability.
To an outside observer, it might have seemed that Americans had taken leave of their senses in the summer and fall of 1961 as the nation conducted an extended conversation about the ethics of killing friends and neighbors in order to protect the sanctity, security, and limited resources in one’s home fallout shelter. In August, Time magazine published an article titled “Gun Thy Neighbor,” which quoted one Chicago suburbanite as saying, “When I get my shelter finished, I’m going to mount a machine gun at the hatch to keep the neighbors out if the bomb falls. I’m deadly serious about this. If the stupid American public will not do what they have to do to save themselves, I’m not going to run the risk of not being able to use the shelter I’ve taken the trouble to provide to save my own family.”
At public meetings, neighbors with shelters told next-door neighbors and best friends that they would shoot them if necessary. Clergy weighed in on both sides of the issue. Rev. L. C. McHugh, a former professor of ethics at Georgetown, fueled the controversy when he wrote in the Jesuit magazine America: “Think twice before you rashly give your family shelter space to friends and neighbors or to the passing stranger ... others try[ing] to break in ... may be ... repelled with whatever means will effectively deter their assault. ... Does prudence also dictate that you have some ‘protective devices’ in your survival kit, e.g. a revolver for breaking up traffic jams at your shelter door? That’s for you to decide, in the light of your personal circumstances.”
Still, the chilling specter of nuclear war hung over the first two years of the Kennedy presidency. Having won the election in part by exploiting the fear of a missile gap, once in office Kennedy asked McNamara to quickly ascertain just how big the gap was. It took only three weeks to confirm that the much publicized missile gap did not exist.
Kennedy wanted to keep that information from the public. He intended to exploit the apocryphal missile gap to justify a robust increase in defense spending. But on February 6, his politically inexperienced secretary of defense shocked reporters by announcing “There’s no missile gap.” McNamara offered to resign over this faux pas. Kennedy explained that all such judgments were “premature,” and the issue faded quickly.
Kennedy did approve [Maxwell] Taylor’s other recommendations and expanded U.S. involvement. The number of U.S. military personnel in Vietnam jumped from 800 when Kennedy took office to over 16,000 in 1963. The United States began resettling villagers at gunpoint behind barbed-wire-enclosed compounds guarded by government troops and using herbicides to defoliate areas where guerrillas operated. The long-term environmental and health effects would prove disastrous for Vietnamese and Americans alike.
On October 16, Kennedy pondered Soviet motives. “What is the advantage of” putting ballistic missiles in Cuba, he asked his advisors. “It’s just as if we suddenly began to put a major number of MRBMs [medium-range ballistic missiles] in Turkey. Now that'd be god damn dangerous, I would think.” The room fell silent until [McGeorge] Bundy replied, “Well, we did it, Mr. President.”
The United States had come within a hairsbreadth of invading Cuba. U.S. officials, it turned out, had little idea of what they were about to encounter had they done so. Reconnaissance flights had succeeded in photographing only thirty-three of the forty-two SS-4 medium-range ballistic missiles and never found the nuclear warheads that were also present. SS-5 intermediate-range ballistic missiles, which could travel 2,200 miles and hit most of the continental United States, had also been shipped. The United States remained completely ignorant of the fact that the Soviets had also placed approximately a hundred battlefield nuclear weapons in Cuba to repel a U.S. invading force. They included eighty FKR cruise missiles armed with 12-kiloton warheads, twelve Luna ground-to-ground rockets with 2-kiloton warheads, and six 12-kiloton bombs for Il-28 bombers with a range of 750 miles. Anticipating that U.S. forces would confront 10,000 Soviet military personnel and 100,000 armed Cubans, the United States expected to suffer 18,000 total casualties and 4,500 dead in an invasion. When McNamara later learned that there were actually 43,000 Soviet military personnel and 270,000 armed Cubans, he raised the estimate of U.S. deaths to 25,000. Thirty years after the crisis, in 1992, McNamara discovered that the battlefield nuclear weapons were in place and would likely have been used against U.S. invaders. He blanched and responded that in that case, 100,000 Americans would have died and the United States would have responded by wiping out Cuba with a “high risk” of nuclear war between the United States and Soviet Union. Hundreds of millions of people might have perished—possibly all mankind. It has also recently been discovered that on the island of Okinawa, a large force of Mace missiles with 1.1 megaton nuclear warheads and F-100 fighter bombers armed with hydrogen bombs was preparing for action. Their likely target was not the Soviet Union but China.
As Daniel Ellsberg has astutely pointed out, Khrushchev made a blunder of epic proportions by not divulging the fact that the warheads had arrived before the blockade went into effect and then, even more bafflingly, not announcing that he had delivered tactical cruise and ballistic missiles along with their nuclear warheads. By keeping these facts secret, he had undercut the missiles’ deterrent effect. Had U.S. policymakers known for sure of the warheads' arrival for the MRBMs, they would have hesitated to strike and risk a retaliatory launching. Similarly, had they known that tactical missiles with nuclear warheads might be fired at U.S. troops, they would likely have forsworn an invasion. In fact, the Kremlin had initially given local Soviet commanders authority to launch the tactical missiles at their own discretion if the U.S. invaded. Such authorization was later withdrawn, but that did not preclude the possibility of an unauthorized launching. Although the details were different, this frightening scenario of deterrence gone awry with cataclysmic consequences was hauntingly similar to the one that Stanley Kubrick presented little more than a year later in his satirical masterpiece Dr. Strangelove.
Shaken by how close the world had come to a nuclear holocaust, Khrushchev wrote Kennedy another long letter on October 30. “Evil has brought some good,” he reflected. “The good is that now people have felt more tangibly the breathing of the burning flames of thermonuclear war and have a more clear realization of the threat looming over them if the arms race is not stopped.” He guessed that Americans “felt as much anxiety as all other peoples expecting that thermonuclear war would break out any moment.” In light of this, he made a series of bold proposals for eliminating “everything in our relations capable of generating a new crisis.” He suggested a nonaggression treaty between NATO and the Warsaw Pact nations. Even better, he said, why not “disband all military blocs?” He wanted to move quickly to finalize a treaty for cessation of all nuclear weapons testing—in the atmosphere, in outer space, under water, and also underground, seeing it as transitional to complete disarmament. He proposed a formula for resolving the ever-dangerous German question: formal acceptance of two Germanys based on the existing borders. He urged the United States to recognize China and let it assume its legitimate place in the United Nations. He encouraged Kennedy to offer his own counterproposals so that together they could move toward peaceful resolution of the problems threatening mankind. But Kennedy’s tepid response and insistence on additional on-site inspections before signing a comprehensive test ban treaty frustrated Khrushchev.
In 1968, the CIA acknowledged that “in terms of the numbers killed, the anti-PKI massacres in Indonesia rank as one of the worst mass murders of the 20th century.” Ambassador Green told a secret session of the Senate Foreign Relations Committee that nobody knew the actual death toll: “We merely judge it by whole villages that have been depopulated.”
Suharto and other military dictators remained in power for decades. Despite the country’s tremendous natural wealth, the average Indonesian stayed mired in poverty. As the New York Times, which had been effusive in its praise for Suharto over the years, reported in 1993, “the average Indonesian earns the equivalent only of $2 or $3 a day and thinks of regular electricity or indoor plumbing as unimaginable luxuries.” U.S. corporations, however, thrived in the post-1965 business-friendly climate that was shaped with the help of U.S. economic advisors and safeguarded by a brutal military that violently repressed the least signs of opposition.
Johnson, stubborn, vain, coarse, and narrow-sighted, sacrificed his dreams of being a great domestic reformer in order to pursue his anti-Communist obsessions in Vietnam, Indonesia, and elsewhere around the globe. Looking back in 1970, he told historian Doris Kearns that he had faced an impossible choice and ended up sacrificing “the woman I really loved—the Great Society—in order to get involved with that bitch of a war on the other side of the world.” But, if he hadn't done so, he explained, he would have been seen as a “coward” and the United States as an “appeaser.” Johnson claimed that he made the choice knowing full well what it meant for him and understanding clearly how previous wars had destroyed the hopes and dreams of prior generations:
Oh, I could see it coming all right. History provided too many cases where the sound of the bugle put an immediate end to the hopes and dreams of the best reformers: the Spanish-American War drowned the populist spirit; World War I ended Woodrow Wilson’s New Freedom; World War II brought the New Deal to a close. Once the war began, then all those conservatives in Congress would use it as a weapon against the Great Society . . . they'd use it to say they were against my programs, not because they were against the poor . . . but because the war had to come first. First, we had to beat those Godless Communists and then we could worry about the homeless Americans. And the generals. Oh, they’d love the war, too. It’s hard to be a military hero without a war. Heroes need battles and bombs and bullets in order to be heroic. That’s why I am suspicious of the military. They’re always so narrow in their appraisal of everything. They see everything in military terms.
When it finally came down to it, Johnson made his choice—a choice whose consequences will always define his legacy and besmirch that of the nation whose forces he commanded. “Losing the Great Society,” he lamented, “was a terrible thought, but not so terrible as the thought of being responsible for America's losing a war to the Communists. Nothing could possibly be worse than that.”
In April 1967, the AAAS [American Association for the Advancement of Science] magazine, Science, reported that Defense Department officials were having trouble recruiting scientists to perform military research. Former Stanford defense researcher Harold Adams explained, “There is a fundamental revulsion on Vietnam in the egghead community. Academics would rather support the forces of life than those of death.” Over the next few years, scientists would increasingly employ the metaphor of choosing “forces of life” over “forces of death” to explain their antipathy toward military research.
[North Vietnamese foreign minister Nguyen Co] Thach understood a basic truth that U.S. leaders never grasped: the Vietnam War was about time, not territory or body counts. The United States wreaked unconscionable destruction; it won every major battle. But it could not win the war. Time was on the side of the Vietnamese, who didn’t have to defeat the Americans but simply to outlast them. They would pay a terrible price for independence and freedom. But they would ultimately triumph. North Vietnamese military leader Vo Nguyen Giap explained, looking back:
We won the war because we would rather die than live in slavery. Our history proves this. Our deepest aspiration has always been self-determination. That spirit provided us with stamina, courage, and creativity in the face of a powerful enemy. Militarily, the Americans were much more powerful than we were. But they made the same mistake as the French—they underestimated Vietnamese forces of resistance. When the Americans started their air raids, Uncle Ho said, “The Americans can send hundreds of thousands, even millions of soldiers; the war can last ten years, twenty years, maybe more, but our people will keep fighting until they win. Houses, villages, cities may be destroyed, but we won't be intimidated. And after we’ve regained our independence, we will rebuild our country from the ground up even more beautifully.”
Policy makers arrogantly assumed that the United States’ superior wealth, technology, and firepower would prevail by inflicting such suffering that the Vietnamese would rationally calculate that the price of victory exceeded the benefits. Nixon, in fact, bore some responsibility for Americans’ ignorance of Vietnamese history and culture. As a charter member of Washington's China lobby—anti-Communist zealots in the Congress, military, media, and business who blamed the State Department for the “loss” of China in 1949—Nixon had hounded the most knowledgeable China and East Asia experts out of the State Department in the 1950s. In explaining the U.S. blunders in Vietnam, McNamara later admitted:
I had never visited Indochina, nor did I understand or appreciate its history, language, culture, or values. The same must be said, to varying degrees, about . . . Kennedy . . . Rusk, . . . Bundy, . . . Taylor, and many others. . . . When it came to Vietnam, we found ourselves setting policy for a region that was terra incognita. Worse, our government lacked experts for us to consult to compensate for our ignorance. . . . The irony of this gap was that it existed largely because the top East Asia and China experts in the State Department—John Paton Davies, Jr., John Stewart Service, and John Carter Vincent—had been purged during the McCarthy hysteria of the 1950s . . . we—certainly I—badly misread China’s objectives and mistook its bellicose rhetoric to imply a drive for regional hegemony. We also totally underestimated the nationalist aspect of Ho Chi Minh's movement.
The antiwar movement continued to grow. As many as three quarters of a million protesters flocked to Washington, D.C., for the November 1969 march; 150,000 more demonstrated in San Francisco. Despite the size of the protests, the war’s dehumanizing effects spread beyond the battlefield, hardening the hearts of the populace as a whole. Sixty-five percent of Americans told pollsters that they weren’t bothered by the My Lai massacre. The steady inuring against human sympathy that Dwight Macdonald had so eloquently described as resulting from the terror bombing of Japanese cities had again infected much of the nation.
News of My Lai opened the door to a steady spate of horror stories. The public learned of “free-fire zones,” where anything that moved would be shot. It learned of the tens of thousands killed by the CIA as part of the “Phoenix Program” and the “tiger cages” in which political prisoners were incarcerated and brutalized. It learned of the displacement of more than 5 million Vietnamese peasants, who were relocated to wire-enclosed refugee camps. It learned of widespread and wanton torture and many other crimes that outraged the sensibilities of at least some Americans and brought forth calls for war-crimes trials.
Exploding antiwar sentiment may have forced Nixon to cancel Duck Hook, but on April 30, 1970, he announced a joint U.S.-South Vietnamese ground invasion of Cambodia to destroy North Vietnamese bases along the border, insisting that the United States would not act “like a pitiful, helpless giant.”
Clearly, [Chilean president Salvador] Allende posed no “mortal threat” to the American people. A National Security Study Memo commissioned by Kissinger concluded that “the U.S. has no vital national interests within Chile” and an Allende government would not significantly change the balance of power. Kissinger himself had earlier disparaged Chile as “a dagger pointed at the heart of Antarctica.” But he now feared that a successful democratic socialist government in Chile could inspire similar uprisings elsewhere. “What happens in Chile,” he figured, would have an effect “on what happens in the rest of Latin America and the developing world . . . and on the larger world picture, including . . . relations with the USSR.”
For Kissinger, Chile’s democratic traditions and the freely expressed will of the Chilean people were of little, if any, concern. While chairing a meeting of the “40 Committee,” Kissinger remarked, “I don’t see why we need to stand by and watch a country go Communist due to the irresponsibility of its own people.”
In delivering his courageous speech to the United Nations, Allende may have been signing his own death warrant. In early 1973, the CIA urged its Chilean agents to “induce as much of the military as possible, if not all, to take over and displace the Allende govt.” Strikes and antigovernment protests escalated. Chilean military leaders, directed by General Augusto Pinochet, the army commander, set the coup for September 11, 1973. When Allende heard that military uprisings had begun across the country, he made a final radio address from the presidential palace: “I will not resign. . . . Foreign capital—imperialism united with reaction—created the climate for the army to break with their tradition. ... Long live Chile! Long live the people! These are my last words. I am sure that my sacrifice will not be in vain. I am sure it will be at least a moral lesson, and a rebuke to crime, cowardice and treason.” Allende took his own life with a rifle he had been given as a gift. A gold-medal plate embedded in the stock was inscribed, “To my good friend Salvador Allende from Fidel Castro.”
Pinochet seized power. After the coup, Nixon and Kissinger assessed the possible damage. Speaking by phone, Kissinger, who was getting ready to attend the Redskins’ season opener, complained that the newspapers were “bleeding because a pro-Communist government has been overthrown.” Nixon muttered, “Isn't that something. Isn’t that something.” Kissinger replied, “I mean instead of celebrating—in the Eisenhower period we would have been heroes.” Nixon said, “Well we didn’t—as you know—our hand doesn’t show on this one though.” Kissinger amended that statement: “We didn’t do it. I mean we helped them. ________created the conditions as great as possible.” To which Nixon responded, “That is right . . . as far as people are concerned . . . they aren't going to buy this crap from the Liberals on this one. . . . it is a pro-Communist government and that is the way it is.” “Exactly. And pro-Castro,” Kissinger agreed. “Well the main thing was. Let's forget the pro-Communist. It was an anti-American government all the way,” Nixon added. “Oh wildly,” Kissinger concurred. He assured Nixon that he was just reporting the criticism. But it wasn't bothering him. Nixon reflected, “Yes, you are reporting it because it is just typical of the crap we are up against.” “And the unbelievable filthy hypocrisy,” Kissinger averred.
Pinochet murdered more than 3,200 of his opponents and jailed and tortured tens of thousands more in a reign of terror that included the actions of the Chilean Army death squad known as the Caravan of Death. Kissinger saw to it that the United States quickly recognized and provided aid to the murderous regime. In June 1976, he visited the Chilean dictator and assured him, “We are sympathetic to what you are trying to do here.”
The war dragged on for two more years. On April 30, 1975, the North Vietnamese seized Saigon. The war was finally over. By its end, the United States had dropped more bombs on tiny Vietnam than had been dropped by all sides in all previous wars throughout history—three times as many explosives as were dropped by all sides in World War II. Unexploded ordnance blanketed the countryside. Nineteen million gallons of herbicide poisoned the environment. In the South, the United States had destroyed 9,000 of the 15,000 hamlets. In the North, it had rained destruction on all six industrial cities, leveling 28 of 30 provincial towns and 96 of 116 district towns. Le Duan, who took over the leadership of North Vietnam when Ho died in 1969, told a visiting journalist that the United States had threatened to use nuclear weapons on thirteen different occasions. The war’s human toll was staggering. More than 58,000 Americans had died in the fighting. But that paled in comparison to the number of Vietnamese killed and wounded. Robert McNamara would later tell students at American University that 3.8 million Vietnamese had died.
The horrors of Cambodia exceeded those of Vietnam. In December 1972, Nixon instructed Kissinger, “I want everything that can fly to go in there and crack the hell out of them. There is no limitation on mileage and there is no limitation on budget. Is that clear?”
The Khmer Rouge grew exponentially. Terrifying reports circulated of the fanaticism of its young cadre. In 1975, it seized power. It wasted little time in unleashing new horrors against its own people, leading to a genocide in which more than 1.5 million people perished on top of the half million or so who had been killed in the U.S. phase of the war. The United States, given its new alliance with China, Cambodia’s principal ally, maintained friendly relations with the brutal Pol Pot regime. In late 1975, Kissinger told the Thai foreign minister, “You should . . . tell the Cambodians that we will be friends with them. They are murderous thugs, but we won’t let that stand in our way.”
Gerald Ford announced: “Our long national nightmare is over” and later gave the “madman” Nixon a controversial pardon. But forty government officials and members of Nixon’s reelection committee were convicted of felonies. Among those sentenced to prison terms were Dean, Mitchell, Haldeman, Ehrlichman, political assistants Charles Colson, Egil Krogh, and Jeb Stuart Magruder, and the president’s lawyer, Herbert Kalmbach. Nixon impersonator David Frye quipped, “There's a bright side to Watergate. My administration has taken crime out of the streets and put it in the White House where I can keep an eye on it.”
“Psychopathic” Kissinger came through unscathed. In October 1973, he and North Vietnam’s Le Duc Tho were awarded the Nobel Peace Prize. Tom Lehrer, America’s most brilliant political satirist, announced that Kissinger’s winning the Nobel Peace Prize made political satire obsolete and refused ever to perform again. Unlike Kissinger, Le Duc Tho, knowing that peace had not yet been achieved, had the decency to turn the prize down.
Historian Carolyn Eisenberg aptly pointed out, “Richard Nixon was the only President in American history to engage in sustained military action against three nations without a mandate from the public, the press, the government bureaucracies or the foreign elite.”
Yet Carter, who has performed in such exemplary fashion out of office, was inept in office, disappointing his supporters, betraying his convictions, and leaving with an approval rating of 34 percent. Carter’s most enduring legacy as president was not his hypocrisy-stained campaign for human rights; it was his opening the door to the dark side, legitimizing the often brutal policies of his successor, Ronald Reagan—policies that rekindled the Cold War and left a trail of innocent victims stretching from Guatemala to Afghanistan and back again to the World Trade Center. How did that happen? Were the same forces at work during the Carter years that had undermined the administrations of other Democratic presidents, including Wilson, Truman, Johnson, Bill Clinton, and Barack Obama?
The Vietnamese, who had suffered so deeply during the U.S. invasion, would be left to rebuild their war-ravaged land on their own. Nearly 4 million of their citizens had been killed. The beautiful triple-canopy forests were largely gone. In 2009, land mines and unexploded bombs still contaminated over a third of the land in six central Vietnamese provinces. Efforts by the Vietnamese government, the Vietnam Veterans of America Foundation, and the Vietnam Veterans of America, sometimes led by dedicated U.S. veterans like Chuck Searcy in Quang Tri Province, had cleared over 3,000 acres. But over 16 million acres remained to be cleared. Beyond the terrible toll of the war itself, 42,000 more Vietnamese, including many children, were killed by leftover explosives in the years after the war ended. U.S. veterans would suffer too. By some estimates, the number of Vietnam vets who have committed suicide has exceeded the 58,000-plus who died in combat.
Instead of helping the American people learn from this execrable episode in U.S. history, Ford encouraged Americans to “regain the sense of pride that existed before Vietnam.” The fact that the United States had not learned the lesson that it should never again support a corrupt dictatorship determined to silence the cries for justice from an oppressed people would come back to haunt it repeatedly in future years.
Reeling from defeat in Vietnam, the United States went out of its way to cultivate anti-Communist allies in the region. Ford and Kissinger visited General Suharto, Indonesia’s right-wing dictator, in early December. The day they left, Suharto’s military invaded the newly independent nation of East Timor, a former Portuguese colony. Suharto had asked his guests for “understanding if we deem it necessary to take rapid or drastic action” in toppling East Timor’s left-wing government. Ford assured him, “We will understand and not press you on the issue.” Kissinger urged Suharto to postpone the invasion until he and Ford had returned to the United States and to finish the job quickly. The invasion proved to be bloody and the occupation prolonged. The estimated death toll from the invasion plus starvation and disease ranges from 100,000 to 200,000 and more. Three hundred thousand people, over half the population, were relocated to camps run by the Indonesian military. The United States continued providing military aid to Indonesia until 1999. East Timor did not regain full independence until 2002.
Carter came to office committed to promoting human rights, but he used human rights as a vehicle for attacking the Soviet Union, causing relations between the two countries to chill. The Soviets, proud of the fact that they had expanded civil liberties and decreased the number of political prisoners in recent years, countered that Soviet citizens had rights that Americans didn’t enjoy. The Kremlin instructed Ambassador Anatoly Dobrynin to ask [Secretary of State Cyrus] Vance how the Americans would feel if the Soviets tied détente to ending U.S. racial discrimination or unemployment.
[National security advisor Zbigniew] Brzezinski understood the Soviets’ fear that the Afghan insurgency would spark an uprising by the 40 million Muslims in Soviet Central Asia. Afghan leaders had been pressing Moscow to send troops to quell the uprising, but the Russians rebuffed their requests. Brezhnev instead urged them to ease repression of political opponents. Soviet leaders concluded correctly that the Americans were instigating the insurgency in cooperation with extremist elements in Iran and Pakistan. They figured that China might also be playing a role. But they still hesitated to intervene. [Soviet Foreign Minister Andrei] Gromyko summed up their concerns: “We would be largely throwing away everything we achieved with such difficulty, particularly detente, the SALT-II negotiations would fly by the wayside, there would be no signing of an agreement (and however you look at it that is for us the greatest political priority), there would be no meeting of [Brezhnev] with Carter, ... and our relations with Western countries, particularly the FRG [Federal Republic of Germany], would be spoiled.”
From the Soviet vantage point, U.S. behavior was quite alarming. As future CIA director Robert Gates later admitted, “the Soviets saw a very different Jimmy Carter than did most Americans by 1980, different and more hostile and threatening.” At that point, Soviet leaders didn’t know what to expect from Carter. In late 1979 and early 1980, the U.S. early-warning system malfunctioned on four occasions, triggering combat alerts of U.S. strategic forces. The KGB believed that they were not malfunctions but deliberate Pentagon ploys to lower Soviet anxiety and response time during future alerts by lulling them into a false sense of complacency, thereby making them vulnerable to a surprise attack. The Soviets weren’t the only ones frightened by the episodes. Gates reported Brzezinski’s account to him of the November 9, 1979, incident in his memoirs:
Brzezinski was awakened at three in the morning by [his military assistant William] Odom, who told him that some 220 Soviet missiles had been launched against the United States. Brzezinski knew that the President’s decision time to order retaliation was from three to seven minutes after a Soviet launch. Thus he told Odom he would stand by for a further call to confirm a Soviet launch and the intended targets before calling the President. Brzezinski was convinced we had to hit back and told Odom to confirm that the Strategic Air Command was launching its planes. When Odom called back, he reported that he had further confirmation, but that 2,200 missiles had been launched—it was an all-out attack. One minute before Brzezinski intended to telephone the President, Odom called a third time to say that other warning systems were not reporting Soviet launches. Sitting alone in the middle of the night, Brzezinski had not awakened his wife, reckoning that everyone would be dead in half an hour. It had been a false alarm. Someone had mistakenly put military exercise tapes into the computer system. When it was over, Zbig just went back to bed. I doubt he slept much, though.
The dangerous incident, which was leaked to the press, caused alarm in the Kremlin. Ambassador Dobrynin conveyed Brezhnev’s “extreme anxiety” over what happened. Brzezinski and the Defense Department drafted the response, which senior State Department advisor Marshall Shulman characterized as “gratuitously insulting and inappropriate for the Carter/Brezhnev channel.” Shulman considered it “kindergarten stuff—not worthy of the United States” and wondered, “Why do we have to be so gratuitously snotty?”
Following the Iranian Revolution, U.S. officials cozied up to Iraqi dictator Saddam Hussein, whom they saw as a regional counterweight to the hostile Iranian regime. They feared that Iranian-style Islamic fundamentalism could threaten pro-American regimes in Kuwait, Saudi Arabia, and Jordan. Brzezinski strategized ways to sever Iraq from the Soviet orbit. In September 1980, Saddam, with at least tacit U.S. approval, invaded neighboring Iran, attacking across the Shatt al-Arab waterway leading to the Persian Gulf. Iraq, however, did not secure the easy victory that U.S. intelligence sources had predicted. Within a week, the United Nations called for a cease-fire. In late October, Carter, playing both sides, announced that if the Iranians released the U.S. hostages, the United States would send the $300 million to $500 million in arms that had been purchased by the prior regime. Reaganites smelled an “October surprise” that would hand Carter the election. In what Carter White House Iran aide and Columbia University political scientist Gary Sick called “a political coup,” a group of Reagan supporters were alleged to have cut a deal with the Iranian government. At the time, the presidential race was still tight. Some mid-October polls even had Carter in the lead. The details are murky and impossible to confirm, but it appears that Reagan campaign officials met with Iranian leaders and promised to allow Israel to ship arms to Iran if Iran would hold the hostages until Reagan won the election. In response to a 1992 query from Indiana Congressman Lee Hamilton, the Supreme Soviet’s Committee on Defense and Security Issues reported that a series of secret meetings had taken place in Europe between top Reagan campaign officials and Iranian officials. 
The Soviet report identified Reagan campaign manager and future CIA Director William Casey, vice presidential candidate and former CIA Director George Bush, and NSC staffer and future CIA Director Robert Gates as attending and offering substantially more military supplies than the Carter team was offering. Iran released the embassy personnel on January 21, 1981, Reagan’s first day in office. The United States continued arms sales to Iran via Israel, often channeled through private dealers, for several years. An early chance to end the war, which Saddam offered to do in return for Iraqi control of the Shatt al-Arab waterway and an Iranian promise not to interfere in Iraq, was also squandered. With the United States helping fuel the conflict, the Iran-Iraq War would continue for eight years, leaving, some estimate, over a million dead and costing over a trillion dollars.
Reagan’s simplistic worldview seemed to be a pastiche stitched together from Hallmark greeting cards, Currier and Ives lithographs, Benjamin Franklin aphorisms, Hollywood epics, and Chinese fortune cookies. He wrote, “I’d always felt that from our deeds it must be clear to anyone that Americans were a moral people who . . . had always used our power only as a force for good in the world.” He often displayed a striking inability to differentiate between reality and fantasy.
In a late 1983 Oval Office meeting, he told Israeli Prime Minister Yitzhak Shamir that as a photographer during the Second World War he had filmed the Allies liberating the Nazi death camps and had been so moved by the suffering he witnessed that he had decided to keep a copy of the film in case he ever encountered a Holocaust skeptic. Shamir was so impressed with Reagan’s story that he repeated it to his cabinet and it was printed in the Israeli paper Ma’ariv. Reagan later repeated a variant of the story to Simon Wiesenthal and Rabbi Marvin Hier, telling them he had been with the Signal Corps filming the camps and had shown the film to someone just a year after the war. Hearing the story, Washington Post reporter Lou Cannon noted that Reagan had never left the United States during or immediately after the war. The story was entirely fanciful.
Reporters then had a field day revealing other Reagan whoppers. Chicago Tribune columnist Mike Royko, perhaps to dispel the notion that the president’s flights of fancy were a product of old age or related to his diminishing mental powers, wrote that he first became aware of Reagan’s habit of altering the truth in 1968 when, to highlight how lawless society was becoming, Reagan asserted that eight Chicago police officers had been killed in one recent month alone. Royko, curious, discovered that no cops had been killed in Chicago in months and only one or two in the entire year. Reagan often repeated his story about the Chicago “welfare queen” with eighty names, thirty addresses, and twelve Social Security cards who had a tax-free income of over $150,000. The numbers would change—she sometimes had 127 names and received over one hundred different checks—but the point—an attack on greedy, dishonest blacks who stole from hardworking white Americans—remained the same.
In March 1981, the CIA informed Vice President Bush that D’Aubuisson, the “principal henchman for wealthy landlords,” was running “the right-wing death squads that have murdered several thousand suspected leftists and leftist sympathizers during the past year.” Three American Maryknoll nuns and a Catholic layperson who had been involved in humanitarian relief work had been raped and slaughtered shortly before Reagan’s inauguration. UN ambassador-designate Jeane Kirkpatrick insisted, “the nuns were not just nuns” but FMLN [Farabundo Martí National Liberation Front] “political activists.” Secretary of State Alexander Haig called them “pistol-packing nuns” and suggested to a congressional committee that “perhaps the vehicle the nuns were riding in may have tried to run a roadblock.”
One atrocity particularly stands out. U.S.-trained and armed Salvadoran troops slaughtered the 767 inhabitants of the village of El Mozote in late 1981. The victims, including 358 children under age thirteen, were stabbed, decapitated, and machine-gunned. Girls and women were raped. When New York Times correspondent Raymond Bonner tried to expose what had occurred, the Wall Street Journal and other pro-Reagan newspapers assaulted Bonner’s credibility. The Times buckled under pressure and pulled Bonner out of El Salvador. Administration officials helped cover up the massacre. Conditions worsened. In late 1982, the Council on Hemispheric Affairs reported that El Salvador, along with Guatemala, had the worst record of human rights abuses in Latin America: “Decapitation, torture, disemboweling, disappearances and other forms of cruel punishment were reported to be norms of paramilitary behavior sanctioned by the Salvadoran government.” However, Elliott Abrams, assistant secretary of state for human rights, testified that the reports of death-squad involvement were “not credible.”
George Bush had trouble sympathizing with the suffering of the people in the United States’ backyard. Before Pope John Paul II visited Central America, Bush said he couldn’t understand how Catholic clergy could reconcile their religious beliefs with Marxist philosophy and tactics and support the insurgents. Reverend Theodore Hesburgh, president of Notre Dame, tried to explain that poverty and social injustice could easily lead priests to support Marxists or anyone else challenging the status quo. “Maybe it makes me a right-wing extremist,” Bush replied, “but I’m puzzled. I just don’t understand it.”
In 1980, Commentary magazine, the United States’ leading neoconservative journal, published a series of essays decrying what conservatives called the “Vietnam syndrome”—the revulsion against the Vietnam War that made Americans squeamish about using force to resolve international conflicts. Reagan agreed: “For too long, we have lived with the ‘Vietnam Syndrome.’ ... Over and over they told us for nearly 10 years that we were the aggressors bent on imperialistic conquests. . . . It is time we recognized that ours was, in truth, a noble cause. . . . We dishonor the memory of 50,000 young Americans who died in that cause when we give way to feelings of guilt.”
Bogged down in protracted proxy wars in Nicaragua and El Salvador, Reagan hungered for an easy military victory that would restore Americans’ self-confidence and get the Vietnam monkey off America’s back. His opportunity came in 1983 when a radical faction overthrew the revolutionary government of Maurice Bishop in Grenada, a tiny Caribbean island with 100,000 inhabitants, murdering its leaders. Before his death, Bishop had alleged that a campaign was under way to destabilize his nation by “the vicious beasts of imperialism”—the United States. Using the resulting instability as a pretext for action, U.S. officials decided to invade and topple the new government, despite clear opposition from the United Nations, the Organization of American States (OAS), and even British Prime Minister Margaret Thatcher. They pressured reluctant Caribbean nations to call for U.S. intervention.
Reagan’s scare tactics worked. By 1985, he had increased defense spending by a staggering 51 percent over 1980 expenditures. To finance this, he slashed federal support for discretionary domestic programs by 30 percent, effectively transferring $70 billion from domestic programs to the military.
Senator Howard M. Metzenbaum praised Budget Director David Stockman’s adroitness at cutting the budget, “but,” he added, “I also think you’ve been cruel, inhumane and unfair.” Four hundred eight thousand people lost their eligibility for Aid to Families with Dependent Children (AFDC) by 1983, and 299,000 saw their benefits cut. Reagan prodded Congress into cutting $2 billion out of the $12 billion food stamp budget and $1 billion from the $3.5 billion budget for school lunches. The budgets for Medicaid, child nutrition, housing and energy assistance were also pared. Federal funds for cities were cut almost in half. While waging war on the poor, Reagan cut the highest income tax rate, which was 70 percent when he took office, to 28 percent by the time he left.
New and upgraded weapons systems rolled off the assembly lines, including the long-delayed and very costly MX missile program, which moved missiles around loops that hid their precise location, making them largely invulnerable to a Soviet first strike. Reagan knew that the Soviets, whose economy was stagnant, would be hard pressed to keep pace.
Iran asked for a UN Security Council investigation. Although U.S. intelligence reports confirmed Iran’s charges, the United States remained silent for several more months, before finally criticizing the Iraqi use of chemical warfare in early March. But when Iran proposed a UN resolution condemning Iraq’s use of chemical weapons, U.S. Ambassador Kirkpatrick lobbied other countries to render “no decision.” At the Iraqi ambassador’s suggestion, the United States pre-empted the Iranian measure by getting a Security Council presidential statement in late March opposing the use of chemical weapons but not mentioning Iraq as the guilty party. In November 1984, the United States restored diplomatic relations with Iraq. Not only did the use of chemical warfare persist until the end of the war with Iran, but in late 1987, the Iraqi air force began dropping chemical weapons on Iraq’s own Kurdish citizens, whom the government accused of supporting Iran. The attacks against rebel-controlled villages peaked with the chemical warfare assault on the village of Halabjah in March 1988. Despite widespread outrage in the United States, including from many inside the administration, U.S. intelligence aid to Iraq actually increased in 1988 and, in December 1988, the government authorized a sale to Iraq of $1.5 million in insecticides by Dow Chemical, the manufacturer of the napalm used in Vietnam.
Reagan also understood how easily a crisis could be provoked. In September 1983, when Soviet military personnel mistakenly took a Korean Air Lines passenger jet that had crossed into Soviet airspace for a spy plane and, after unheeded warnings, shot it down, killing all 269 people on board, including 61 Americans, Reagan railed against “the Korean Air Lines massacre” as an “act of barbarism” and a “crime against humanity.” But in his memoirs he drew a different lesson: “If anything, the KAL incident demonstrated how close the world had come to the precipice and how much we needed nuclear arms control: If, as some people speculated, the Soviet pilots simply mistook the airliner for a military plane, what kind of imagination did it take to think of a Soviet military man with his finger close to a nuclear push button making an even more tragic mistake?”
Despite his abhorrence of nuclear war, Reagan possessed a dark side that fantasized about using those weapons to defeat his enemies. Such thinking slipped out in shocking fashion when Reagan quipped during a sound check for a radio broadcast, “My fellow Americans, I am pleased to tell you today that I’ve signed legislation that will outlaw Russia forever. The bombing begins in five minutes.” Reagan was unaware that the tapes were rolling as he spoke. The reaction at home and abroad was quick and unsparing. Colorado Senator Gary Hart thought that Reagan’s “poor judgment” might have been caused by the stress of his reelection campaign but worried that “more frighteningly, it’s in moments of that sort that his real feelings come out, which is the most dismaying and distressing possibility.” The New York Times reported that the story was front-page news across Europe. Paris’s Le Monde figured that psychologists would have to determine whether the comments were “an expression of repressed desire or the exorcism of a dreaded phantom.” West Germany’s Social Democrats dismissed Reagan—“the lord of life and death of Western Europe”—as “an irresponsible old man . . . who probably can no longer distinguish whether he is making a horror movie or commanding a superpower,” while the Greens exclaimed that the “perverse joke makes the blood of every reasonable person run cold.” TASS, the Soviet news agency, quoted a Western leader who described Reagan as a man “who smiles at the possibility of the mass extermination of people” and decried “the hypocrisy of his peace rhetoric.” Izvestia called it a “monstrous statement.”
Gorbachev and his supporters were sincere in their desire for disarmament, detente, and democratic reform. Anatoly Chernyaev, who was one of Gorbachev’s most trusted foreign policy advisors, later insisted that “detente was a sincere policy. We wanted detente, we wanted peace, we craved it. . . . Look at Central Committee Secretary Yegor Ligachev, he was a conservative, right? A reactionary, even, and yet he ... would stand up, right in front of Gorbachev, and he would scream, ‘How long will our military-industrial complex keep devouring our economy, our agriculture and our consumer goods? How long are we going to take this ogre, how long are we going to throw into its mouth the food of our children?’”
Gorbachev decided to push his “peace offensive” even more aggressively. In January 1986, he wrote to Reagan, boldly offering “a concrete program . . . for the complete liquidation of nuclear weapons throughout the world ... before the end of the present century.” In the interim, he proposed removing all U.S. and Soviet intermediate-range ballistic missiles from Europe, ending nuclear testing, sharply reducing strategic weapons, and changing the ABM treaty to allow the United States to continue research on SDI but banning deployment for fifteen years. He had already, the previous August, announced a unilateral nuclear testing moratorium.
The U.S. response reinforced Soviet doubts about Reagan’s real intentions. The United States announced plans for a new series of nuclear tests. It also increased its support for the Afghan mujahideen and undertook provocative actions on other fronts.
Soviet Foreign Minister Shevardnadze then interjected “very emotionally” that future generations, reading the minutes of meetings and seeing how close the participants had come to eliminating nuclear weapons, would never forgive them if they didn't come to an agreement. Reagan said that adding the word “laboratory” would cause him great political damage at home. Gorbachev said that if he allowed the United States to take the arms race to space and deploy SDI after ten years, he would be viewed as foolish and irresponsible. Each asked the other to bend. Neither would.
The meeting ended. The United States and the Soviet Union had come within a hairsbreadth—one word—of eliminating nuclear weapons. But the scourge of nuclear weaponry would continue to haunt the world. Reagan, egged on by arch-neocon Perle, sacrificed the hopes of humanity for an illusion—a Star Wars fantasy that, as Richard Rhodes wrote, represented little more than “a specious concern for testing outside the ‘laboratory’ systems that had hardly yet even entered the laboratory in 1986.”
Reagan and Gorbachev left the building. Gorbachev described the scene:
It was already dusk. The mood was downcast. Reagan reproached me: “You planned from the beginning to come here and put me in this position!” “No, Mr. President,” I replied. “I am prepared to go back inside right now and sign the document concerning the issues we already agreed upon if you will refrain from plans to militarize space.” “I’m extremely sorry,” Reagan answered.
Meanwhile, Soviet operations were finally winding down in Afghanistan. Reagan and Casey had transformed Carter’s tentative support for the Afghan insurgents into the CIA’s largest covert operation to date, totaling more than $3 billion. The CIA channeled aid through Pakistan’s President Zia, who funneled U.S. arms and dollars to the most extreme Afghan Islamist faction under Gulbuddin Hekmatyar, a man of legendary cruelty. According to James Forest, director of terrorism studies at West Point, Hekmatyar “was known . . . to patrol the bazaars of Kabul with vials of acid, which he would throw in the face of any woman who dared to walk outdoors without a full burka covering her face.” He was also known for skinning prisoners alive. Senior State Department official Stephen Cohen admitted, “The people we did support were the nastier, more fanatic types of mujahideen.” The CIA station chief in Islamabad, Pakistan, Howard Hart, recalled, “I was the first chief of station ever sent abroad with this wonderful order: ‘Go kill Soviet soldiers.’ Imagine! I loved it.” The CIA even provided between 2,000 and 2,500 U.S.-made Stinger missiles, some of which WikiLeaks revealed were used to down NATO helicopters three decades later.
Tens of thousands of Arabs flooded into Pakistan to join the jihad against the infidels, including a wealthy Saudi named Osama bin Laden and Egyptian doctor Ayman al-Zawahiri. They and thousands of other future Islamist terrorists received military training in the Pakistani camps, learning such valuable skills as how to perform assassinations and detonate car bombs. Thousands more flocked to Pakistan's madrassas, where they were indoctrinated in radical Islam and recruited for jihad. The madrassas were one product of the $75 billion the Saudis spent during the 1980s to spread Wahhabi extremism. Casey ignored repeated warnings that the religious fanaticism he was helping unleash would eventually pose a threat to U.S. interests. He instead persisted in his view that the unholy partnership between Christianity and Islam would endure and could be used to bludgeon the Soviets throughout the region. In fact, in mid-decade, Casey unleashed mujahideen raids across the border into the Soviet Union in the hope of inciting Islamist uprisings by Soviet Muslims.
Upon withdrawing from Afghanistan, the Soviets sounded out U.S. willingness to collaborate on curbing Islamic extremism in Afghanistan, but the Americans could not be bothered. The die-hard Islamists now in control of Afghanistan worked closely with Pakistani intelligence. Having achieved its goals, the United States continued to provide covert aid but eventually washed its hands of the mess it had helped create. Former U.S. Ambassador to Saudi Arabia Charles Freeman complained, “We start wars without figuring out how we would end them. Afghanistan was lurching into civil war, and we basically didn’t care anymore.” He said that he and U.S. Ambassador to Pakistan Robert Oakley had tried to get CIA officials from Directors Robert Gates and William Webster on down to think seriously about ending the U.S., Saudi, and Pakistani involvement, but they were dealing with people who reasoned, “Why should we go out there and talk to people with towels on their heads?” According to RAND expert Cheryl Benard, whose husband, Zalmay Khalilzad, served as U.S. ambassador to Afghanistan:
We made a deliberate choice. At first, everyone thought, there’s no way to beat the Soviets. So what we have to do is to throw the worst crazies against them that we can find, and there was a lot of collateral damage. We knew exactly who these people were, and what their organizations were like, and we didn’t care. Then, we allowed them to get rid of, just kill all the moderate leaders. The reason we don’t have moderate leaders in Afghanistan today is because we let the nuts kill them all. They killed the leftists, the moderates, the middle-of-the-roaders. They were just eliminated, during the 1980s and afterwards.
Reagan left office a befuddled old man who claimed little knowledge of things going on under his nose, yet many people lionize him and credit him with having restored the United States’ faith in itself after the failed presidencies of Johnson, Nixon, Ford, and Carter. Even before Reagan’s second term, conservatives had begun anointing him one of the nation's great presidents. A 1984 Republican campaign memo read, “Paint Reagan as the personification of all that is right with or heroized by America. Leave Mondale in a position where an attack on Reagan is tantamount to an attack on America's idealized image of itself.”
But what is Reagan’s real legacy? One of the most poorly informed and least engaged chief executives in U.S. history, he empowered a right-wing resurgence of hard-line anti-Communists who militarized U.S. foreign policy and rekindled the Cold War. He paid lip service to democracy while arming and supporting repressive dictators. He turned local and regional conflicts in the Middle East and Latin America into Cold War battlegrounds, unleashing a reign of terror to suppress popular movements. He spent enormous sums on the military while cutting social programs for the poor. He sharply reduced taxes on the wealthy, tripling the national debt and transforming the United States from the world’s leading creditor in 1981 to its biggest debtor by 1985. In October 1987, he oversaw the worst stock market collapse since the Great Depression. He let the chance to rid the world of offensive nuclear weapons slip through his fingers because he wouldn’t let go of a childish fantasy. And as for his much-vaunted role in ending the Cold War, as we will see, the lion’s share of credit goes instead to his Soviet counterpart, Mikhail Gorbachev.
“Suddenly, a season of peace seems to be warming the world,” the New York Times exulted on the last day of July 1988. Protracted, bloody wars were ending in Afghanistan, Angola, Cambodia, and Nicaragua, and between Iran and Iraq. Later that year, Palestine Liberation Organization leader Yasir Arafat, under pressure from Moscow, renounced terrorism and implicitly recognized Israel’s right to exist. But the most dramatic development was still to come. In December 1988, Soviet leader Mikhail Gorbachev declared the Cold War over:
the use or threat of force no longer can . . . be an instrument of foreign policy. This applies above all to nuclear arms. . . . let me turn to the main issue—disarmament, without which none of the problems of the coming century can be solved. . . . the Soviet Union has taken a decision to reduce its armed forces . . . by 500,000 men. . . . we have decided to withdraw by 1991 six tank divisions from East Germany, Czechoslovakia and Hungary, and to disband them. . . . Soviet forces stationed in those countries will be reduced by 50,000 men and their armaments, by 5,000 tanks. All Soviet divisions remaining . . . will become clearly defensive.
He promised to reveal Soviet plans for the “transition from the economy of armaments to an economy of disarmament” and called upon other military powers to do likewise through the United Nations. He proposed a 50 percent reduction in offensive strategic arms, asked for joint action to eliminate “the threat to the world’s environment,” urged banning weapons in outer space, and demanded an end to exploitation of the third world, including a “moratorium of up to 100 years on debt servicing by the least developed countries.”
Still, he was not finished. He called for a UN-brokered cease-fire in Afghanistan as of January 1. In nine years of war, the Soviets had failed to defeat the Afghan insurgents despite deploying 100,000 troops, working closely with local Afghans, and building up the Afghan army and police. He proposed an international conference on Afghan neutrality and demilitarization and held out an olive branch to the incoming administration of George H. W. Bush, offering a “joint effort to put an end to an era of wars, confrontation and regional conflicts, to aggressions against nature, to the terror of hunger and poverty as well as to political terrorism. This is our common goal and we can only reach it together.”
The New York Times characterized Gorbachev’s riveting hourlong speech as the greatest act of statesmanship since Wilson’s Fourteen Points in 1918 or Roosevelt and Churchill’s Atlantic Charter in 1941—“the basic restructuring of international politics.” And, the Times proclaimed, “he promised to lead the way unilaterally. Breathtaking. Risky. Bold. Naive. Diversionary. Heroic. . . . his ideas merit—indeed, compel—the most serious response from President-elect Bush and other leaders.” The Washington Post called it “a speech as remarkable as any ever delivered at the United Nations.”
Gorbachev saw this as a new beginning, but many U.S. policy makers hailed it as the ultimate vindication—the triumph of the capitalist West after decades of Cold War. It was “the end of history,” State Department policy planner Francis Fukuyama declared, anointing Western liberal democracy “the final form of human government.” In September 1990, Michael Mandelbaum, director of East-West studies at the Council on Foreign Relations, exulted, “the Soviets . . . have made it possible to end the cold war, which means that for the first time in 40 years we can conduct military operations in the Middle East without worrying about triggering World War III.” The United States would soon test that hypothesis.
When Bush traveled to Poland and Hungary in July, he deliberately avoided saying or doing anything that might provoke a Soviet response. Having previously derided “the vision thing,” even the tearing down of the Berlin Wall failed to elicit a jubilant response on his part. He explained, “I am not an emotional kind of guy.” He told Gorbachev, “I have conducted myself in ways not to complicate your life. That’s why I have not jumped up and down on the Berlin Wall.” “Yes, we have seen that” and “appreciate” it, Gorbachev replied.
Latin Americans’ bitterness about the invasion, which violated the charter of the OAS [Organization of American States], would persist for years. Shortly after the 9/11 attacks by Al-Qaeda, the editors of the Nicaragua-based magazine Envio wrote that in December 1989 “the government of George Bush Sr. ordered the invasion of Panama, a military operation that bombed civilian neighborhoods and killed thousands of Panamanians just to flush out a single man, Manuel Noriega. ...” “Was that not state terrorism?” they asked.
Soviet U.S. expert Georgi Arbatov warned that the invasion would strengthen Soviet hard-liners, who would see through the hypocrisy of the United States’ praising Soviet nonintervention while it was itself overthrowing governments. They had good reason to feel that way. The invasion was indeed a signal that Soviet inaction would not curb U.S. bellicosity; it might, in fact, embolden the United States to act more recklessly. The Washington Post’s Bob Woodward pointed to Colin Powell’s support for the invasion as critical to Bush’s decision making. Powell declared, “We have to put a shingle outside our door saying ‘Superpower Lives Here’, no matter what the Soviets do, even if they evacuate from Eastern Europe.” Neocon Elliott Abrams concluded that the United States should have invaded sooner and speculated that “the reduced danger of escalation makes limited military action more rather than less likely.”
U.S. Ambassador April Glaspie met with Saddam in Baghdad on July 25, 1990, assuring him that Bush “wanted better and deeper relations” and had “no opinion” on Iraq’s border dispute with Kuwait, which had been no friend of the United States. Senator and former UN Ambassador Daniel Patrick Moynihan described Kuwait to fellow senators as “a particularly poisonous enemy of the United States” whose “anti-Semitism was at the level of the personally loathsome.” Saddam took Glaspie’s remarks as a signal that the United States would acquiesce in his Kuwaiti takeover. The following week, three Iraqi divisions entered Kuwait, giving Iraq control of one-fifth of the world’s oil supply. In September, Glaspie effectively confirmed that she had led Saddam on, telling the New York Times, “I didn't think—and nobody else did—that the Iraqis were going to take all of Kuwait.”
U.S. officials urged the Iraqis to rise up and topple Saddam. Shiites and Kurds responded en masse. But the United States stood idly by while the Iraqi government crushed the uprising, using poison gas and helicopter gunships. Still, the war showcased U.S. military power. Bush proclaimed a new world order and gushed, “The ghosts of Vietnam have been laid to rest beneath the sands of the Arabian desert.” One White House speechwriter programmed his word processor so he could write “New World Order” by hitting a single command key. Among those who dismissed this empty “burst of triumphalism” was conservative columnist George Will, who wrote, “If that war, in which the United States and a largely rented and Potemkin coalition of allies smashed a nation with the GNP of Kentucky, could . . . make America ‘feel good about itself’, then America should not feel good about itself.” He noted “how close Bush came to unilaterally amending the Constitution by stripping from Congress all right to involvement in the making of war. Bush only grudgingly . . . sought constitutional approval for launching the biggest military operation in U.S. history, an attack on a nation with which we were not at war.” In two months of bombing, the United States destroyed much of Iraq’s infrastructure, including roads, bridges, sanitation facilities, waterways, railroads, communications systems, factories, and the electrical grid, and caused immense suffering. In March, the United Nations described the bombing as “near apocalyptic”, driving Iraq back into the “pre-industrial age.” A Harvard team reported a “public health catastrophe.” The continuing UN sanctions exacerbated a miserable situation, reducing real wages by over 90 percent. Although estimates vary widely, credible sources report that over 200,000 Iraqis died in the war and its aftermath, approximately half of them civilians. The U.S. death toll stood at 158.
“By God, we've kicked the Vietnam syndrome once and for all!” Bush rejoiced. But privately he was more circumspect. As the war was coming to an end, he wrote in his diary that he was experiencing “no feeling of euphoria.” “It hasn’t been a clean end,” he regretted. “There is no battleship Missouri surrender. This is what’s missing to make this akin to WWII, to separate Kuwait from Korea and Vietnam.” And with Saddam Hussein remaining safely ensconced in power, victory seemed hollow and incomplete.
On Christmas Day, having lost his base of support, Gorbachev resigned. The Soviet Union was no more. The Cold War had ended. The most visionary and transformative leader of the twentieth century had yielded power. Even some people in the United States had come to appreciate the immensity of his contribution. James Baker had said to him in September 1990, “Mr. President . . . nobody in the world has ever tried what you and your supporters are trying today. . . . I’ve seen a lot, but I’ve never met a politician with as much bravery and courage as you have.”
The Democrats’ euphoria over capturing the White House proved short-lived. Republicans weakened Clinton out of the gate by blocking his attempt to secure the open admission of gays into the military, but the much more telling blow would be struck in defeating his plan to overhaul the health care system. Among advanced industrial countries, only the United States and apartheid South Africa lacked a national health care system. The Republicans and their business allies spent $50 million to frighten the American public and deny health care coverage to tens of millions of citizens. Richard Armey, chair of the House Republican Conference, prepared for what he called “the most important domestic policy debate of the past half century . . . the Battle of the Bulge of big government liberalism.” Armey believed, “The failure of the Clinton plan will . . . leave the President’s agenda weakened, his . . . supporters demoralized, and the opposition emboldened. Our market-oriented ideas will suddenly become thinkable, not just on health care, but on a host of issues. . . . Historians may mark it as . . . the start of the Republican renaissance.”
The 1994 midterm elections gave Republicans control of both houses of Congress for the first time in forty years. Both parties lurched further to the right. Succumbing to conservative pressure, Clinton ended Aid to Families with Dependent Children, which had helped poor families since the Great Depression, and supported a war on drugs and tough-on-crime legislation. The U.S. prison population exploded from a half million in 1980 to 2 million twenty years later. Forty-five percent of those incarcerated were African American, and 15 percent were Hispanic.
Once Afghanistan’s Russian-backed government fell in 1992, the United States lost interest in that distant, barren land, where life expectancy stood at forty-six years. A bloody civil war erupted between various Islamist factions and ethnic groups. One consisted largely of Afghan refugees recruited from madrassas—Saudi-sponsored religious schools in Pakistan. These fanatical religious students, or talibs, formed the Taliban—with help from the Pakistani intelligence service. Many had already received military training in CIA-financed camps. Most had studied textbooks developed by the University of Nebraska at Omaha’s (UNO) Center for Afghanistan Studies in a program funded by USAID to the tune of $51 million between 1984 and 1994. Published in Dari and Pashtu, the dominant Afghan languages, the books were designed to stoke Islamic fanaticism and spur resistance to the Soviet invaders. Page after page was filled with militant Islamic teachings and violent images. Children learned to count using pictures of missiles, tanks, land mines, Kalashnikovs, and dead Soviet soldiers. One leading Afghan educator said, “The pictures . . . are horrendous to school students, but the texts are even much worse.” One, for example, shows a soldier adorned with a bandolier and a Kalashnikov. Above him is a verse from the Koran. Below is a statement about the mujahideen, who, in obedience to Allah, willingly sacrifice their lives and fortunes to impose Sharia law on the government. Students learned to read by studying stories about jihad. When the Taliban seized Kabul in 1996, they continued using the same violent jihadist texts, simply removing the human images, which they considered blasphemous. Girls would be spared the indignity of seeing such texts, though; they were banned from school entirely. The Taliban subjected all Afghans to the most extreme Sharia law, banning visual images and instituting public amputations, beatings, and executions. 
Women lost all rights, including the rights to work and to go out in public without a male escort.
Experts would debate the precise number of Iraqi children who died as a result of the sanctions. In December 1995, two UN-affiliated researchers, writing in the British medical journal The Lancet, placed the number at 567,000 but later lowered that estimate. In 2003, British Prime Minister Tony Blair, speaking at a joint press conference with George Bush, said, “Over the past five years, 400,000 Iraqi children under the age of five died of malnutrition and disease,” using that as an excuse to justify an invasion that would add tens of thousands more to that total.
Though Clinton resisted the pressure to invade, he and his secretary of state laid the rhetorical groundwork for Bush and Cheney. Albright warned, “Iraq is a long way from [the United States], but what happens there matters a great deal here. For the risk that the leaders of a rogue state will use nuclear, chemical or biological weapons against us or our allies is the greatest security threat we face.” On another occasion, Albright had the audacity to declare, “If we have to use force, it is because we are America; we are the indispensable nation. We stand tall and we see further than other countries into the future.”
The election was very close indeed. Nationally, Gore won the popular vote by 544,000, and winning Florida would also have given him victory in the electoral college. The majority of Florida voters clearly intended to vote for Gore. But confusing “butterfly ballots” in West Palm Beach caused many elderly Jewish voters to vote inadvertently for Buchanan, a man sometimes accused of anti-Semitism whom those voters, in particular, despised. Antiquated punch card machines in poor, heavily Democratic districts caused state officials to invalidate 180,000 ballots, either for not clearly identifying a candidate or for voting more than once. Most troubling of all, tens of thousands of pro-Gore African-American voters had been purged from the voting lists and denied the right to vote by Republican election officials, who had been directed to do so by the Bush Florida campaign cochair, Secretary of State Katherine Harris, on the often incorrect pretext that they were convicted felons. In the end, more than 10 percent of African Americans were disqualified, compared to only 2 percent of Republican-leaning whites. Had the rates been equal, more than 50,000 additional African Americans would have voted in Florida, giving Gore an overwhelming lead and ensuring his election. But because of the irregularities and the 97,000 votes that went to Nader, Bush clung to a minuscule lead of less than 1,000 votes out of 6 million cast. If that lead were certified, Bush would win the electoral college 271-266.
The deck was stacked against Gore. Bush’s younger brother Jeb was governor. Harris, a fierce partisan, was in charge of certifying the results. Partial recounts cut Bush’s lead below 600 votes. Fearing that the full state recount Gore demanded would sink him, Bush deployed family consigliere James Baker, his father’s campaign manager and secretary of state, to use every available court challenge to block the recount. The Bush campaign also flew down a small army of members of Congress, congressional staffers, and lawyers to run the operation on the ground, many arriving in corporate jets leased to the campaign by Bush’s friend “Kenny Boy” Lay of Enron and Cheney’s friends at Halliburton.
Cheney and Bush spent their first eight months in office aggressively pursuing the PNAC [Project for the New American Century] agenda. They paid little if any heed to the terrorist threat. The attacks on September 11, 2001, could have and should have been prevented. NSC Counterterrorism Chief Richard Clarke tried to alert top administration officials, including Cheney, Rice, and Powell, to the Al-Qaeda threat from their very first days on the job. He warned that an attack was imminent. On January 25, he requested that [Condoleezza] Rice call an urgent cabinet-level “principals” meeting to discuss the threat. He finally got his meeting on September 4.
Warning signs abounded in the summer of 2001. Intercepted Al-Qaeda messages stated that “something spectacular” was about to occur. FBI agents reported suspicious behavior by individuals who wanted to know how to fly passenger airplanes but were not interested in learning how to land. CIA Director George Tenet received an August briefing paper titled “Islamic Extremist Learns to Fly” about the arrest in Minnesota of Zacarias Moussaoui after officials at the flight school he was attending reported his strange behavior. Clarke testified that Tenet was running around Washington with his “hair on fire,” trying to get Bush's attention. In late June, Tenet told Clarke, “I feel it coming. This is going to be the big one.” Intelligence agencies issued threat reports with headlines such as “Bin Laden Threats Are Real,” “Bin Laden Planning High Profile Attacks,” “Bin Laden Planning Multiple Operations,” “Bin Laden Public Profile May Presage Attack,” and “Bin Laden’s Network’s Plans Advancing.” Alerts warned of a high probability of near-term “spectacular” attacks resulting in numerous casualties and causing turmoil in the world. According to writer Thomas Powers, in the nine months before September 11, intelligence personnel “had warned the administration as many as forty times of the threat posed by Osama bin Laden, but that is not what the administration wanted to hear, and it did not hear it.”
George W. Bush was legendary for his misstatements and malapropisms. But sometimes, through the mangled syntax, a bit of truth would slip out. Such was the occasion in 2004 when he declared, “Our enemies are innovative and resourceful, and so are we. They never stop thinking about new ways to harm our country and our people, and neither do we.”
When Bush’s party was unceremoniously booted out of office in 2008, he was rated by historians as among the very worst presidents in U.S. history if not the absolute worst. His popularity and approval ratings set new lows for the modern era but were actually higher than those of his even less popular vice president, Dick Cheney. Bush and Cheney left the country in shambles, its economy collapsing and its international reputation at an all-time low. After invading two countries, threatening many others, and undermining the rule of law at home and abroad, the once-admired United States was now universally feared and widely condemned. People wondered whether the wrongheaded policies of the Bush-Cheney administration had resulted from ineptitude, hubris, and blind ambition or if perhaps there was something more sinister about its plans for the United States and the world.
Although the ever-cautious Barack Obama chose not to investigate the crimes of his predecessor, others would adhere more closely to the strictures of international law. In February 2011, George W. Bush was forced to cancel a speaking engagement in Switzerland for fear of massive protests against his torture policies. Activists were also planning to file a criminal complaint with Swiss prosecutors. Katherine Gallagher of the Center for Constitutional Rights explained, “Waterboarding is torture, and Bush has admitted, without any sign of remorse, that he approved its use. . . . Torturers—even if they are former presidents of the United States—must be held to account and prosecuted. Impunity for Bush must end.” Protest organizers urged demonstrators to bring a shoe in honor of the Iraqi journalist who was jailed for throwing his shoes at Bush in 2008. Referencing the 1998 London arrest of the late Chilean dictator Augusto Pinochet, Gavin Sullivan of the European Center for Constitutional and Human Rights said, “What we have in Switzerland is a Pinochet opportunity.” Amnesty International announced that similar steps would be taken if Bush traveled to any of the 147 nations that were party to the UN Convention Against Torture.
Bush subsequently ignored demands to investigate how such a colossal intelligence and leadership failure could have occurred. When the pressure finally became too great, Bush turned to Henry Kissinger to produce an official whitewash. Even the New York Times wondered whether the choice of Kissinger, the “consummate Washington insider,” with his “old friendships and business relationships,” to chair the commission was anything more than “a clever maneuver by the White House to contain an investigation it long opposed.”
Kissinger received a visit from a group of New Jersey women who had been widowed in the attack. One asked him if he had any clients named bin Laden. Kissinger spilled his coffee and nearly fell off his office couch. While his visitors rushed to clean up his mess, Kissinger tried to blame his clumsiness on his “bad eye.” The next morning, he resigned from the commission.
Having been attacked by Al-Qaeda in Afghanistan, the United States was preparing to retaliate against Iraq, whose leader, Saddam Hussein, was an avowed enemy of both Al-Qaeda and the anti-U.S. regime in Iran. Clarke admitted, “At first I was incredulous that we were talking about something other than getting al Qaeda. Then I realized with almost a sharp physical pain that Rumsfeld and Wolfowitz were going to try to take advantage of this national tragedy to promote their agenda about Iraq.”
Clarke underestimated Bush, Cheney, Rumsfeld, and Wolfowitz. Their agenda went far beyond Iraq. From atop the rubble at the World Trade Center, Bush proclaimed, “Our responsibility to history is already clear. To answer these attacks and rid the world of evil.”
Bush made it clear that this was a new kind of war—a war fought not against a nation or even an ideology but against a tactic: terrorism. As retired Ambassador Ronald Spiers pointed out, framing it that way was deliberate and pernicious. Choosing the “war” metaphor, he wrote in 2004, “is neither accurate nor innocuous, implying as it does that there is an end point of either victory or defeat. . . . A ‘war on terrorism’ is a war without an end in sight, without an exit strategy, with enemies specified not by their aims but by their tactics. . . . The President has found this ‘war’ useful as an all-purpose justification for almost anything he wants or doesn’t want to do. . . . It brings to mind Big Brother’s vague and never-ending war in Orwell’s 1984.”
On October 7, 2001, less than a month after the terrorist attacks, the United States and its allies launched Operation Enduring Freedom. The Taliban leaders quickly got the message and scrambled to negotiate. On October 15, Taliban foreign minister Wakil Ahmed Muttawakil, whom the U.S. Embassy in Islamabad considered to be very close to Taliban leader Mullah Muhammad Omar, offered to turn bin Laden over to the Organization of the Islamic Conference (OIC) for trial. Evidence suggests that Omar had been trying to rein in bin Laden for some time and that relations between the Afghans and Al-Qaeda had been frayed. U.S. representatives had actually had more than twenty meetings with Taliban officials during the previous three years to discuss their turning bin Laden over for trial. U.S. officials concluded that the Taliban were stalling. Milton Bearden, the former CIA station chief who oversaw the 1980s covert war in Afghanistan from his Pakistani base, disagreed, blaming U.S. obtuseness and inflexibility. “We never heard what they were trying to say,” he told the Washington Post. “We had no common language. Ours was ‘give up bin Laden.’ They were saying ‘do something to help us give him up.’” U.S. State Department and Embassy officials met with Taliban security chief Hameed Rasoli as late as August 2001. “I have no doubts they wanted to get rid of him,” Bearden said in October 2001. But the United States never offered the face-saving measures the Taliban needed.
Rumsfeld’s high-tech warfare succeeded in sharply limiting U.S. casualties, but the lack of U.S. boots on the ground allowed bin Laden, Omar, and many of their supporters to slip away when the United States had them trapped at Tora Bora in December 2001. Afghan civilians didn't fare as well, suffering approximately four thousand deaths, according to University of New Hampshire Professor Marc Herold—more than the number killed at the World Trade Center and Pentagon combined. Perhaps five times that number would die from disease and starvation in the months to follow.
Arthur Schlesinger, Jr., told journalist Jane Mayer that he considered the new torture policy, as Mayer put it, “the most dramatic, sustained, and radical challenge to the rule of law in American history.” The CIA outlined the procedures in detail. Upon arrest, the suspect would be “deprived of sight and sound” with blindfold and earmuffs. If the detainee proved uncooperative, he would be stripped naked, flooded with constant bright light and high-volume noise up to 79 decibels, and kept awake for up to 180 hours. Once the prisoner was convinced that he had no control, serious interrogation would begin. After guards shackled the prisoner’s arms and legs, placed a collar around his neck, and removed the hood covering his head, interrogators would slap him across the face, sometimes repeatedly, and, using the collar as a handle, slam his head into the wall up to thirty times. Subsequent methods included dousing the prisoner with water, denying him the use of toilet facilities, forcing him to wear dirty diapers, chaining him to ceilings, and requiring him to stand or kneel in painful positions for prolonged periods of time. The International Committee of the Red Cross reported that prisoners at Guantanamo were told that they were being taken “to the verge of death and back.”
Waterboarding was employed in special cases—and sometimes repeatedly, despite the fact that the United States had prosecuted Japanese military interrogators for use of waterboarding against U.S. prisoners during World War II. The process was described by Malcolm Nance, an interrogation expert who had been an instructor with the U.S. military's Survival, Evasion, Resistance, and Escape (SERE) program to train U.S. soldiers to withstand interrogation:
Unless you have been strapped down to the board, have endured the agonizing feeling of the water overpowering your gag reflex, and then feel your throat open and allow pint after pint of water to involuntarily fill your lungs, you will not know the meaning of the word. Waterboarding is a controlled drowning that, in the American model, occurs under the watch of a doctor, a psychologist, an interrogator and a trained strap-in/strap-out team. It does not simulate drowning, as the lungs are actually filling with water. There is no way to simulate that. The victim is drowning. How much the victim is to drown depends on the desired result (in the form of answers to questions shouted into the victim’s face) and the obstinacy of the subject.
Abu Zubaydah was waterboarded in Bangkok at least eighty-three times over a four- or five-day period in August 2002, even though interrogators were convinced he was telling the truth. Still, CIA officials at the Counterterrorism Center at Langley demanded that the procedure be continued for a month, backing down only when the interrogators threatened to quit. Upon Zubaydah’s capture, Bush identified him as “al-Qaeda's chief of operations.” In reality, though, Zubaydah turned out to be a minor operative—not even an official member of Al-Qaeda—who may very well have been mentally ill. The Washington Post reported in 2009, “The methods succeeded in breaking him, and the stories he told of al-Qaeda terrorism plots sent CIA officers around the globe chasing leads. In the end, though, not a single significant plot was foiled as a result of Abu Zubaida’s tortured confessions, according to former senior government officials who closely followed the interrogations.” And, the Post acknowledged, whatever information investigators extracted that might have had any marginal utility had come out before the waterboarding began. What the waterboarding did yield was an abundance of useless information, according to the Post: “Abu Zubaida began unspooling the details of various al-Qaeda plots, including plans to unleash weapons of mass destruction. Abu Zubaida’s revelations triggered a series of alerts and sent hundreds of CIA and FBI investigators scurrying in pursuit of phantoms.” One former intelligence official admitted, “We spent millions of dollars chasing false alarms.”
Purported 9/11 mastermind Khalid Sheikh Mohammed was waterboarded 183 times, as if he were going to disclose something on the 183rd time that he hadn't divulged in the previous 182. Psychologists helped refine techniques, exploiting prisoners’ phobias. Interrogators also exploited Arabs’ cultural sensitivities by subjecting prisoners to public nudity and snarling dogs.
In February 2004, Major General Antonio Taguba reported that his investigation had turned up numerous instances of “sadistic, blatant, and wanton criminal abuses” at Abu Ghraib detention center, including rape of both male and female prisoners. Only four months earlier, Bush had announced, a bit prematurely perhaps, that “Iraq is free of rape rooms and torture chambers.”
Journalist Patrick Cockburn interviewed the senior U.S. interrogator in Iraq who had elicited the information that led to the capture of Iraqi Al-Qaeda leader Abu Musab al-Zarqawi. He told Cockburn that torture not only produces no useful information, but its use in Iraq “has proved so counterproductive that it may have led to the death of as many U.S. soldiers as civilians killed in 9/11.”
Although officials tried to pin the blame on a few "bad apples"—sadistic rogue interrogators who took matters into their own hands—torture was approved by top administration officials. Members of the National Security Council’s Principals Committee—Cheney, Rice, Rumsfeld, Powell, Tenet, and Ashcroft—met repeatedly to specify which methods would be used on which prisoners. Ashcroft interrupted one NSC discussion and asked, “Why are we talking about this in the White House? History will not judge this kindly.” General Barry McCaffrey agreed: “We tortured people unmercifully. We probably murdered dozens of them during the course of that, both the armed forces and the C.I.A.” For years, the 770-plus prisoners at Guantanamo and thousands more in Iraq and Afghanistan were denied legal counsel and the right to call witnesses to defend themselves. As of late 2008, charges had been brought against only twenty-three. Over five hundred had been released without being charged, often after years of harsh and humiliating treatment. One FBI counterterrorism expert testified that of the Guantanamo detainees, fifty at most were worth holding. Major General Taguba said, “There is no longer any doubt as to whether the current administration has committed war crimes. The only question that remains to be answered is whether those who ordered the use of torture will be held to account.”
Administration neocons were on board from the start. When the NSC principals met again two days later, Rumsfeld interrupted Powell’s discussion of “targeted sanctions” against Iran. “Sanctions are fine,” he blurted out. “But what we really want to think about is going after Saddam.” He later added, “Imagine what the region would look like without Saddam and with a regime that’s aligned with U.S. interests. It would change everything in the region and beyond it. It would demonstrate what U.S. policy is all about.” Looking back, Treasury Secretary Paul O’Neill recognized that the die had been cast from the very beginning: “From the start, we were building the case against Hussein and looking at how we could take him out and change Iraq into a new country. And, if we did that, it would solve everything. It was all about finding a way to do it. That was the tone of it. The President saying, ‘Fine. Go find me a way to do this.’”
O’Neill told Ron Suskind that as early as March 2001, administration officials were discussing concrete plans for invading and occupying Iraq. Cheney’s Energy Task Force played an important part. Other invasion backers included Wolfowitz protégés I. Lewis “Scooter” Libby, who was Cheney’s national security advisor; Stephen Hadley, who was Rice’s deputy; and Richard Perle, now head of the Pentagon’s Defense Policy Board. On September 19 and 20, Defense Policy Board members decided to put Iraq in the crosshairs as soon as they disposed of Afghanistan. The New York Times reported that the insiders promoting the invasion were called the “Wolfowitz cabal.”
In fact, Iraq posed no threat. It had destroyed so many weapons between 1991 and 1998 that it had become one of the weaker states in the region. Its military expenditures were just a fraction of that of some of its neighbors. In 2002 Iraq spent approximately $1.4 billion on its military. The United States spent more than three hundred times that amount.
Nevertheless, the scare tactics worked. To make sure they would, the administration deliberately timed the congressional vote to occur before the 2002 midterm elections and threatened to brand all who opposed the rush to war as unpatriotic and cowardly at a time of grave national crisis. Many caved in to the pressure, including Hillary Clinton and John Kerry. On October 2, 2002, the Senate voted 77-23 to authorize the use of force. The House did likewise, 296-133. The resolution directly connected Iraq to Al-Qaeda and alleged that Iraq posed a threat to the United States.
Only one Republican—Lincoln Chafee of Rhode Island—voted against the resolution in the Senate. He later condemned the spinelessness of top Democrats who had succumbed to Bush’s blackmail: “They were afraid that Republicans would label them soft in the post-September 11 world, and when they acted in political self-interest, they helped the president send thousands of Americans and uncounted innocent Iraqis to their doom.” Chafee watched as cowering Democrats repeatedly “went down to the meetings at the White House and the Pentagon and came back to the chamber ready to salute. With wrinkled brows they gravely intoned that Saddam Hussein must be stopped. Stopped from what? They had no conviction or evidence of their own. They were just parroting the administration’s nonsense.”
Members of the intelligence community were outraged over Pentagon neocons’ hijacking, distorting, and fabricating intelligence. When the nonexistent WMD later failed to materialize, New York Times columnist Nicholas Kristof described them as “spitting mad” and eager to have their say. One lashed out, “As an employee of the Defense Intelligence Agency, I know how this administration has lied to the public to get support for its attacks on Iraq.”
Whereas Powell’s speech largely fell flat overseas, it had the desired impact on U.S. public opinion. The Washington Post described the evidence as “irrefutable.” Prowar sentiment jumped from one-third of the public to one-half. When Powell visited the Senate Foreign Relations Committee the next day, Joseph Biden gushed, “I’d like to move the nomination of Secretary of State Powell for President of the United States.”
For the neocons, Iraq was just the appetizer. After devouring Iraq, they planned to return for the main course. In August 2002, a senior British official told Newsweek, “Everyone wants to go to Baghdad. Real men want to go to Tehran.” Undersecretary of State John Bolton voted for Syria and North Korea. PNACer Norman Podhoretz urged Bush to think bigger still. “The regimes that richly deserve to be overthrown and replaced are not confined to the three singled-out members of the axis of evil,” he wrote in his journal, Commentary. “At a minimum, the axis should extend to Syria and Lebanon and Libya, as well as ‘friends’ of America like the Saudi royal family and Egypt’s Hosni Mubarak, along with the Palestinian Authority, whether headed by Arafat or one of his henchmen.” Michael Ledeen, a former U.S. national security official and neocon strategist, mused, “I think we’re going to be obliged to fight a regional war, whether we want to or not. It may turn out to be a war to remake the world.”
When retired General Wesley Clark visited the Pentagon in November 2001, he discovered that this was more than a pipe dream. A senior military staff officer told him, “we were still on track for going against Iraq. . . . But there was more. This was being discussed as part of a five year campaign plan, he said, and there were a total of seven countries, beginning with Iraq, then Syria, Lebanon, Libya, Iran, Somalia, and Sudan. So, I thought, this is what they mean when they talk about ‘draining the swamp.’”
With war approaching, some noticed how few of the war enthusiasts had served their country during the Cold War or in Vietnam, earning them the label of “chickenhawks.” Despite heartily supporting the Vietnam War, most went out of their way to avoid combat. Now they were blithely sending other young men and women off to Afghanistan and Iraq to kill and be killed. Republican Senator Chuck Hagel of Nebraska, a Vietnam veteran who opposed the administration’s warmongering, remarked, “It is interesting to me that many of those who want to rush this country into war and think it would be so quick and easy don’t know anything about war. They come at it from an intellectual perspective versus having sat in jungles or foxholes and watched their friends get their heads blown off.” Highly decorated Marine General Anthony Zinni found it “interesting to wonder why all the generals see it the same way, and all those that never fired a shot in anger and really hell-bent to go to war see it a different way. That’s usually the way it is in history.”
It was even more so now. Dick Cheney called Vietnam a “noble cause,” but after leaving Yale for Casper Community College in Wyoming, he applied for and received four student deferments and then another one for being married. “I had other priorities in the 60s than military service,” he explained. Some think it not accidental that the Cheneys had their first child in July 1966, nine months after the Johnson administration announced it would begin drafting married men without children. George W. Bush used family connections to get into the National Guard, which was only 1 percent African American. Bush failed to complete his six-year commitment and got himself assigned to Alabama, where he engaged in politics. Four-star General Colin Powell, the former chairman of the Joint Chiefs of Staff, wrote in his 1995 autobiography, “I am angry that so many of the sons of the powerful and well placed ... managed to wangle slots in Reserve and National Guard units. Of the many tragedies of Vietnam, this raw class discrimination strikes me as the most damaging to the ideal that all Americans are created equal and owe equal allegiance to their country.” Future House Speaker Newt Gingrich got a student deferment. He told a reporter that Vietnam was “the right battlefield at the right time.” When asked why it wasn’t right for him, he replied, “What difference would I have made? There was a bigger battle in Congress than in Vietnam.” But he wasn’t elected to Congress until four years after the United States pulled out all its troops. John Bolton supported the Vietnam War while attending Yale but enlisted in the Maryland National Guard to avoid combat. 
He later wrote in his Yale twenty-fifth reunion book, “I confess I had no desire to die in a Southeast Asian rice paddy.” Paul Wolfowitz, Scooter Libby, Peter Rodman, Richard Perle, former White House Chief of Staff Andrew Card, John Ashcroft, George Will, former New York City Mayor Rudolph Giuliani, Phil Gramm, former Speaker of the House Dennis Hastert, Joe Lieberman, Senator Mitch McConnell, Supreme Court Justice Clarence Thomas, Trent Lott, Richard Armey, and former Senator Don Nickles got deferments. John Ashcroft got seven of them. Elliott Abrams had a bad back, former Solicitor General Kenneth Starr psoriasis, Kenneth Adelman a skin rash, Jack Kemp a knee injury—though he managed to play quarterback in the NFL for another eight years. Superhawk Tom DeLay, the future Republican majority leader, had worked as a pest exterminator. He assured critics that he would have served but that minorities had already taken the best positions. Rush Limbaugh missed Vietnam because he had a pilonidal or anal cyst.
Al-Qaeda leaders thanked Allah for the colossal blunders, both tactical and strategic, of the United States’ neocon strategists. In September 2003, on the second anniversary of 9/11, Al-Qaeda leader Ayman al-Zawahiri exulted, “We thank God for appeasing us with the dilemmas in Iraq and Afghanistan. The Americans are facing a delicate situation in both countries. If they withdraw they will lose everything and if they stay, they will continue to bleed to death.” The following year, bin Laden drew on the same “bleeding” metaphor to explain his strategy, taking credit for having “bled Russia for 10 years until it went bankrupt and was forced to withdraw in defeat.” He was, he claimed, “continuing this policy in bleeding America to the point of bankruptcy,” noting that the half-million dollars Al-Qaeda spent on the 9/11 terrorist acts had resulted in a U.S. “economic deficit” of over a trillion dollars.
Nor was Bush making many friends in Russia. Although he said he had looked into Russian President Vladimir Putin’s soul and liked what he saw, Bush, as had Clinton before him, treated Russia with contempt. Shortly after taking office, Bush, ignoring strong Russian opposition, withdrew from the 1972 ABM Treaty to pursue his missile defense initiative. But he and Putin had a surprisingly friendly meeting in June 2001. After the September 11 attacks, Putin was one of the first foreign leaders to phone Bush and express condolences. On September 24, he announced a five-point plan to support the U.S. war on terrorism. Not only would he share intelligence and open Russian airspace to the United States, he said, but he would acquiesce in and even facilitate the stationing of U.S. troops in the Middle East, which many in Russia's military and intelligence community strongly opposed.
Bush repaid Putin’s largesse by breaking his father’s promise to Gorbachev and expanding NATO ever closer to Russia’s borders, effectively encircling Russia with U.S. and NATO military bases, some in former Soviet territories. This second wave of expansion began in late 2002 and concluded with the admission of Bulgaria, Romania, Slovakia, Slovenia, Lithuania, Latvia, and Estonia in March 2004. The Russians objected vehemently. Extending NATO to former Warsaw Pact nations like Bulgaria and Romania was objectionable enough, but extending NATO to former Soviet republics like Lithuania, Latvia, and Estonia was adding insult to injury.
Openly contemptuous of Russian opinion, Bush pressed NATO to expand even farther. Croatia and Albania joined in 2008. And he made it clear that he also wanted to add Georgia and Ukraine, despite protests by Russia and warnings from other NATO members that this would seriously damage relations between Russia and the West. Russians were convinced that U.S. democracy programs in Ukraine, Georgia, and Belarus were simply a ploy to further expand NATO and isolate Russia.
Russia also took sharp issue with Bush administration efforts to weaponize space. Bush appeared to be realizing the vision of the head of the U.S. Space Command, who had predicted in 1996, “We will engage terrestrial targets someday—ships, airplanes, land targets—from space. . . . We’re going to fight from space and we’re going to fight into space. . . . That’s why the U.S. has development programs in directed energy and hit-to-kill mechanisms.” The rest of the world united in opposition to U.S. plans to expand the realm of conflagration. In 2000, the United Nations, by a vote of 163-0, passed a resolution on the Prevention of an Outer Space Arms Race, with Micronesia, Israel, and the United States abstaining. Defying world opinion, in January 2001, a commission led by Rumsfeld warned that the United States could face a “Space Pearl Harbor” if it didn’t dominate space and recommended that the military “ensure that the president will have the option to deploy weapons in space.” That year, Peter Teets, undersecretary of the air force, told a space warfare symposium, “We haven’t reached the point of strafing and bombing from space—nonetheless, we are thinking about those possibilities.”
In 2006, UN members voted 166-1 in favor of the resolution, with only the United States opposed. At the UN Conference on Disarmament, the United States consistently thwarted efforts by Russia and China to ban weaponization. Among the more bizarre programs the air force was looking into was one called “Rods from God,” which would deploy solid tungsten cylinders, twenty or thirty feet long and one or two feet in diameter, that would be fired from satellites at tremendous speeds, easily destroying any target on the earth.
It fell to none other than Zbigniew Brzezinski to accurately assess the toll taken on American democracy by Bush’s disastrous war on terrorism. Brzezinski was in a good position to know, having played a similar role in stirring up Cold War fears of the Soviet Union. He wrote in March 2007 that the so-called war on terror, by deliberately creating a “culture of fear,” had had a “pernicious impact on American democracy, on America's psyche and on U.S. standing in the world.” The damage was “infinitely greater” than that inflicted on 9/11. He worried that the administration was exploiting public fear to justify war with Iran and contrasted the United States’ “five years of almost continuous national brainwashing on the subject of terror” with the “more muted reactions of” other victims of terrorism, including Britain, Spain, Italy, Germany, and Japan. He mocked Bush’s “justification for his war in Iraq” and his absurd claim “that he has to continue waging it lest al-Qaeda cross the Atlantic to launch a war of terror here in the United States.” Bush’s fearmongering was reinforced by “terror entrepreneurs . . . experts on terrorism [whose] task is to convince the public that it faces new threats. That puts a premium on the presentation of credible scenarios of ever-more-horrifying acts of violence.” As a result, “America has become insecure and more paranoid.” For proof, he pointed to Congress’s ever-growing list of potential targets across the United States for would-be terrorists. He also deplored the madness of proliferating “security checks,” “electronic billboards urging motorists to ‘Report Suspicious Activity’ (drivers in turbans?),” and television shows with “bearded ‘terrorists’ as the central villains” that “reinforce the sense of the unknown but lurking danger that . . . 
increasingly threaten[ed] the lives of all Americans.” Television and films had stereotyped Arabs, he regretted, “in a manner sadly reminiscent of the Nazi anti-Semitic campaigns,” opening Arab Americans to harassment and abuse.
Within months of taking office, Bush signed a bill cutting taxes for the wealthiest Americans. He passed additional tax cuts in 2002 and 2003. Meanwhile, federal spending rose sharply, increasing 17 percent in his first term alone. Under Clinton, federal spending had increased by 11 percent in constant dollars over two terms. By 2004, Bush had turned the $128 billion surplus he inherited into a $413 billion deficit. The New York Times reported that for Wall Street, the Bush years were the new Gilded Age. Bankers, the Times revealed, celebrated their obscene bonuses with five-figure dinners. The Government Accountability Office (GAO) reported that between 1998 and 2005, two-thirds of American corporations, at least a quarter of which had assets in excess of $250 million, paid no income taxes. These years saw the sharpest rise in income inequality in the nation’s history. The 44.3 percent of the nation’s income that went to the top 10 percent in 2005 exceeded the 43.8 percent that had gone to the top 10 percent in 1929 and was a far cry from the 32.6 percent of 1975. In 2005, the richest 3 million had as much income as the bottom 166 million, who comprised more than half of the population. The ranks of American billionaires swelled from 13 in 1985 to more than 450 in 2008. Two hundred twenty-seven thousand people joined the ranks of millionaires in 2005 alone. But workers’ wages barely kept pace with inflation, and 36 million were below the poverty line. Almost all the new wealth created went directly to the top 10 percent of the population, with most going to the top one-tenth of 1 percent. In 2006, the twenty-five top U.S. hedge fund managers earned an average of $570 million each. In 2007, their average earnings jumped to $900 million.
By late 2009, over 40 million Americans were living in poverty. In 1988, 26 percent of Americans told Gallup pollsters that the country was divided between haves and have-nots, with 59 percent identifying themselves as haves and only 17 percent as have-nots. When Pew asked that same question in the summer of 2007, 48 percent responded that the country was so divided, with 45 percent considering themselves haves and 34 percent have-nots.
The United States had become a plutocracy with almost a quarter of income going to the top 1 percent and the richest one-tenth of 1 percent earning as much as the poorest 120 million. Former Secretary of Labor Robert Reich identified the new plutocrats: “With the exception of a few entrepreneurs like Bill Gates, they’re top executives of big corporations and Wall Street, hedge-fund managers, and private equity managers.”
“We are an attractive empire, the one everyone wants to join,” crowed neocon Max Boot in the aftermath of 9/11. But now, after two long and disastrous wars, trillions of dollars in military spending, a network of more than 1,000 foreign military bases, torture and abuse of prisoners on several continents, assault on both international law and the U.S. Constitution, a near economic collapse, drone attacks killing alleged terrorists and civilians alike, disparities between rich and poor unheard of in an advanced industrial country, appallingly low test scores for students, government surveillance on an unprecedented scale, collapsing infrastructure, domestic uprisings on both the Left and the Right, and an international reputation left in tatters, the U.S. empire does not look all that attractive.
George W. Bush, who canceled his 2011 speaking engagement in Switzerland to avoid massive protests and the risk of being indicted as a war criminal, and his empire-friendly advisors bear a lot of responsibility for this sorry state of affairs. They saddled Barack Obama and the American people with an incredible mess. Obama confided to one of his closest aides: “I’m inheriting a world that could blow up any minute in a half dozen ways . . .”
The country Obama inherited was indeed in shambles, but Obama took a bad situation and, in certain ways, made it worse. Swept into office on a wave of popular euphoria, he mesmerized supporters throughout the campaign with his exhilarating rhetoric, surpassing intelligence, inspiring biography, commitment to defending civil liberties, rejection of unilateralism, and strong opposition to the Iraq War—qualities that made him seem the antithesis of Bush. The election of Barack Hussein Obama, the child of a black Kenyan father and a white Kansan mother, who was raised in Indonesia as well as Hawaii and went on to graduate from Columbia and become president of the Harvard Law Review, felt like a kind of expiation for the sins of a nation whose reputation had been sullied, as we have shown throughout this book, by racism, imperialism, militarism, nuclearism, environmental degradation, and unbridled avarice. The suffering caused by misguided U.S. policies had been immense. For many, Obama’s election offered redemption. It attested to the other side of America and its place in history, a side marked by idealism, egalitarianism, constitutionalism, republicanism, humanism, environmentalism, and the embrace of freedom and democracy as universal principles. Progressives hoped Obama would become the heir to a tradition represented by Franklin Roosevelt and Henry Wallace and by the post-Cuban Missile Crisis John F. Kennedy.
Yet rather than repudiating the policies of Bush and his predecessors, Obama has perpetuated them. Rather than diminishing the influence of Wall Street and the major corporations in U.S. life, Obama has given them latitude to continue most of their predatory practices. Rather than restoring the civil liberties that Bush had eviscerated and limiting the executive powers that Bush usurped after 9/11, Obama, with few exceptions, has tightened the grip of the domestic security/surveillance apparatus, stifling civil liberties and the right to dissent.
Having betrayed his earlier promises and become the first presidential candidate to turn down public campaign financing in the general election, Obama turned to Wall Street funders with deep pockets, like Goldman Sachs, Citigroup, JPMorgan Chase, Skadden Arps, and Morgan Stanley. Also high on the list of Obama contributors were General Electric and other defense contractors. And the pharmaceuticals industry—Big Pharma—reversed years of supporting Republicans, contributing more than three times as much to Obama as to McCain.
Obama’s grassroots supporters largely overlooked these disturbing facts. Progressives projected onto him their own hopes and expectations, conservatives their worst fears. Both were mistaken. He ran a centrist campaign, advancing safely pragmatic policy initiatives. He consistently championed the middle class. The working class and the poor—black, Hispanic, Asian, Native American and white—seemed an afterthought as Obama battled Hillary Clinton and then John McCain. Instead of seizing the opportunity to explain how the decline of manufacturing and other structural factors at the heart of a dysfunctional corporate- and Wall Street-dominated system had exacerbated the problems for all poor people and especially African Americans, he hectored poor blacks for not taking more “personal responsibility.” He positioned himself to the left of Clinton by trumpeting his opposition to the Iraq War, which she had voted to support, but to the right of George Bush on Afghanistan, a position his supporters conveniently ignored. And his Senate vote for the Foreign Intelligence Surveillance Act, which gave legal immunity to telecommunications companies complicit in Bush’s wiretapping, should have signaled that he might be unwilling to relinquish some of the powers that Bush and Cheney had appropriated.
Former Democratic strategist David Sirota aptly identified the ways Rubin’s people would mold Obama's economic strategy: “Bob Rubin, these guys, they’re classic limousine liberals. These are basically people who have made shitloads of money in the speculative economy, but they want to call themselves good Democrats because they’re willing to give a little more to the poor. That’s the model for this Democratic Party: Let the rich do their thing, but give a fraction more to everyone else.”
On November 23, 2008, the Bush administration announced a potential $306 billion bailout of Citigroup, which was facing collapse. Citigroup had recently received $25 billion under the Troubled Asset Relief Program, which provided a massive bailout to the financial sector. The Times made clear that Geithner played a “crucial role” in the negotiations and that Bush’s Treasury Secretary, Henry Paulson, had worked very closely with Obama’s transition team. Wall Street was so exuberant over the deal that the Dow posted its biggest two-day jump in over twenty years and Citigroup’s stock, which had tumbled in price from $30 to $3.77 in the past year, shot up 66 percent in one day. “If you had any doubts at all about the primacy of Wall Street over Main Street,” former Labor Secretary Robert Reich exclaimed, “your doubts should be laid to rest.” Abundant proof would be forthcoming. The Washington Post reported in early April 2009 that the Treasury Department was bending the law and defying the will of Congress to avoid limiting executive pay: “The Obama administration is engineering its new bailout initiatives in a way that it believes will allow firms benefiting from the programs to avoid restrictions imposed by Congress, including limits on lavish executive pay, according to government officials.”
University of Texas economist James Galbraith lambasted Obama for meekly submitting to the bankers’ demands as if there were no other way to solve the crisis:
... one cannot defend the actions of Team Obama on taking office. Law, policy and politics all pointed in one direction: turn the systemically dangerous banks over to Sheila Bair and the Federal Deposit Insurance Corporation. Insure the depositors, replace the management, fire the lobbyists, audit the books, prosecute the frauds, and restructure and downsize the institutions. The financial system would have been cleaned up. And the big bankers would have been beaten as a political force. Team Obama did none of these things. Instead they announced “stress tests,” plainly designed so as to obscure the banks’ true condition. They pressured the Federal Accounting Standards Board to permit the banks to ignore the market value of their toxic assets. Management stayed in place. They prosecuted no one. The Fed cut the cost of funds to zero. The President justified all this by repeating, many times, that the goal of policy was to get credit flowing again. The banks threw a party. Reported profits soared, as did bonuses. With free funds, the banks could make money with no risk, by lending back to the Treasury. They could boom the stock market. They could make a mint on proprietary trading. Their losses on mortgages were concealed. ...
A stunning measure of how far the United States had fallen came from the October 2011 Bertelsmann Stiftung Foundation report “Social Justice in the OECD—How Do the Member States Compare,” which ranked the United States twenty-seventh out of the thirty-one OECD nations, only beating Greece, Chile, Mexico, and Turkey. The report measured many factors, including poverty prevention, poverty rates for children and senior citizens, income inequality, expenditures on pre-primary education, health care, and other key metrics. The United States came in twenty-ninth in overall poverty rate and twenty-eighth in child poverty and income inequality. Columbia University's National Center for Children in Poverty reported that 42 percent of children lived in low-income families, half of them below the poverty line. The Associated Press reported in December 2011 that almost half of all Americans were either in poverty or subsisting on low incomes. The Census Bureau reported that in 2010, 46.2 million Americans were below the poverty line, which was the highest number since it began publishing those figures fifty-two years earlier.
Not only were more Americans falling into poverty, fewer and fewer were able to escape. Mobility studies shattered the myth that the United States was a society with fluid class lines and easy upward mobility. In fact, the United States, with its porous social safety net, failing schools, and low percentage of unionized workers, had considerably less social mobility than any other advanced industrial society.
Pearlstein wondered, “Whose Side Is Obama On?” The question became more poignant as the 2012 elections approached. Anger over the economy had boiled over. Occupy Wall Street and allied protesters gathered in towns and cities across the nation in a grassroots uprising of a sort not seen since the 1930s. Obama walked a fine line, trying to signal to both the anti-Wall Street protesters and the Wall Street tycoons, whom the protesters reviled, that he was with them. In June 2011, the New York Times reported that Obama had offended Wall Street’s high rollers by calling them “‘fat cats’ and criticizing their bonuses” and by having the audacity to propose any curbs at all on their rapaciousness. Now, according to the Times, Obama and his top aides, looking for Wall Street backing in his reelection bid, were trying to salve the bankers’ wounded feelings. Franklin Roosevelt had compared ungrateful capitalists to the drowning old man who berates his rescuer for not saving his hat; Obama came before them, hat in hand, and begged forgiveness. Unlike Roosevelt, who had made enemies of Wall Street financiers by implementing large-scale government job creation and sweeping regulatory reform, Obama not only privileged those Wall Street insiders over the working masses, he apologized for having hurt their feelings.
Obama also paid debts to other corporate donors. Nobel Prize-winning economist Joseph Stiglitz noted, “When pharmaceutical companies receive a trillion-dollar gift—through legislation prohibiting the government, the largest buyer of drugs, from bargaining over price—it should not come as cause for wonder. It should not make jaws drop that a tax bill cannot emerge from Congress unless big tax cuts are put in place for the wealthy. Given the power of the top 1 percent, this is the way you would expect the system to work.” Stiglitz cited the response from banker Charles Keating, who was brought low by the 1980s savings and loan crisis. When asked by a congressional committee whether the $1.5 million he had contributed to elected officials could buy influence, he answered, “I certainly hope so.” The Supreme Court decision in the 2010 Citizens United case, which removed limits on corporate campaign spending, ensured that the influence of corporate and banking interests would mushroom.
When the 2010 Congressional elections rolled around, the enthusiasm gap between Republicans and Democrats was enormous. Obama’s temporizing and lassitude had so demoralized his base that the Republicans won in a landslide, prompting him to reach out even further across the aisle. He quickly reneged on his promise to implement tougher environmental standards, announcing that he would forgo new rules regarding smog and toxic emissions from boilers, leaving in place Bush administration policies.
That is precisely what Obama has done—a far cry from his campaign promises to defend the Constitution against Bush’s trespasses. He had, for example, criticized Bush's repeatedly invoking state secrets to block lawsuits. In office, he reversed himself, impeding prosecution of Bush-era torture and other abuses and advancing what the New York Times described as “a sweeping view of executive secrecy powers.” He has invoked the “state secrets privilege” time and again to halt lawsuits involving torture, extraordinary rendition, and illegal NSA wiretapping. He has continued the CIA’s extraordinary rendition program, denied habeas corpus rights to Afghan prisoners, sanctioned military commissions, and authorized, without due process, the CIA killing of a U.S. citizen in Yemen who was accused of having ties to Al-Qaeda. His refusal to investigate and prosecute those in the Bush administration guilty of torture was itself a violation of international treaties.
Bush Justice Department official Jack Goldsmith quickly recognized that Dick Cheney’s rebuke of Obama for reversing the Bush-era terrorism policies was flat out wrong. In fact, Goldsmith wrote in the New Republic, “The truth is closer to the opposite: The new administration has copied most of the Bush program, has expanded some of it, and has narrowed only a bit. Almost all of the Obama changes have been at the level of packaging, argumentation, symbol, and rhetoric. ... The Obama strategy,” he concluded, “can thus be seen as an attempt to make the core Bush approach to terrorism politically and legally more palatable, and thus sustainable.”
Civil libertarians were appalled, expecting so much more from this former professor of constitutional law. His University of Chicago law school colleague Geoffrey Stone, chairman of the board of the American Constitutional Society, decried the gap between Obama's policies and his campaign promises, regretting Obama’s “disappointing willingness to continue in his predecessor's footsteps.” George Washington University law professor Jonathan Turley observed disappointedly that “the election of Barack Obama may stand as one of the single most devastating events in our history for civil liberties.”
Obama’s own foreign policy experience was quite limited and his views were conventional, if sometimes muddled. He told one campaign audience in Pennsylvania, “The truth is that my foreign policy is actually a return to the traditional bipartisan realistic policy of George Bush’s father, of John F. Kennedy, of, in some ways, Ronald Reagan.” While untangling precisely what Obama meant by that bizarre conflation would be a challenge, what was clear was that he was not offering a decisive break with over a century of imperial conquest. His was a centrist approach to better managing the American empire rather than advancing a positive role for the United States in a rapidly evolving world. He intended to reduce U.S. involvement in the Middle East and increase U.S. engagement with Asia, where American hegemony was being challenged by a resurgent and ever more potent China. “We’ve been on a little bit of a Middle East detour over the course of the last ten years,” Kurt Campbell, the assistant secretary of state for East Asian and Pacific Affairs, noted. “And our future will be dominated utterly and fundamentally by developments in Asia and the Pacific region.” “The project of the first two years has been to effectively deal with the legacy issues that we inherited, particularly the Iraq war, the Afghan war, and the war against Al-Qaeda, while rebalancing our resources and our posture in the world,” Benjamin Rhodes, one of Obama's deputy national security advisors, said. “If you were to boil it all down to a bumper sticker, it’s ‘Wind down these two wars, reestablish American standing and leadership in the world, and focus on a broader set of priorities, from Asia and the global economy to a nuclear-nonproliferation regime.’”
Gates was the principal guarantor of imperial continuity. A staunch Cold Warrior with close ties to the neocons, Gates was involved in several scandalous situations, including allegedly delaying the release of U.S. hostages in Iran in 1980 and facilitating arms sales to both Iraq and Iran during their disastrous war, none of which has ever been fully investigated. During the Reagan years, he was instrumental in revamping CIA intelligence gathering, purging independent-minded analysts who wouldn't go along with the view of a menacing Soviet threat that justified an enormous U.S. military buildup. He was also a key proponent of Reagan’s murderous policies in Central America, advocating illegal covert measures against the Sandinista regime in Nicaragua.
Gates teamed with [Secretary of State Hillary] Clinton to frustrate those who hoped for reassessment of America’s role in the world. “People are wondering what the future holds, at home and abroad,” Clinton told the Council on Foreign Relations (CFR). “So let me say it clearly: The United States can, must, and will lead in this new century.” “We are still essentially, as has been said before, the indispensable nation,” Gates concurred in November 2010. But in declaring a “new American moment” before the CFR, Clinton offered a version of American history stunning in its simplicity and vapidity: “After the Second World War, the nation that had built the transcontinental railroad, the assembly line, the skyscraper, turned its attention to constructing the pillars of global cooperation. The Third World War that so many feared never came. And many millions of people were lifted out of poverty and exercised their human rights for the first time. Those were the benefits of a global architecture forged over many years by American leaders from both political parties.”
In speeches in Prague, Cairo, Oslo, and elsewhere, Obama articulated a more nuanced understanding of America's role in the world. But his ultimate message was largely the same as that of Clinton and Gates. Nothing was more disappointing than his Nobel Peace Prize acceptance speech in December 2009. That a president waging two wars would receive the prize was preposterous in the first place. But the selection committee members must have been even more chagrined when they heard Obama’s defense of American militarism in an address that came just days after announcing that he was sending additional forces to Afghanistan. An at times thoughtful speech about the complex problems facing the world was sullied by a defense of war, unilateralism, and preemption.
Pakistanis particularly bristled at increased U.S. drone attacks inside Pakistan, which, the Washington Post reported, had left between 1,350 and 2,250 dead in Obama’s first three years in office. Drones, which could be used for surveillance or, when equipped with Hellfire missiles, for attack, had increasingly become the U.S. weapon of choice in both Pakistan and Afghanistan. Obama authorized as many drone attacks in his first nine months as Bush had in the preceding three years, leading to the deaths of many innocent civilians.
David Kilcullen, who had served as a counterinsurgency advisor to General David Petraeus from 2006 to 2008, and Andrew Exum, an army officer in Iraq and Afghanistan from 2002 to 2004, provided insight into Pakistani rage in May 2009. They cited Pakistani press reports indicating that over the past three years U.S. drone strikes had killed 700 civilians and only 14 terrorist leaders, which equated to 50 civilians for every militant, “a hit rate of 2 percent.” While noting that U.S. officials “vehemently” denied these figures and acknowledging that they likely exaggerated the proportion of civilian casualties, Kilcullen and Exum warned that “every one of these dead noncombatants represents an alienated family, a new desire for revenge, and more recruits for a militant movement that has grown exponentially even as drone strikes have increased” and that “visceral opposition” was being expressed in areas of Pakistan far from where the strikes were occurring.
No wonder 97 percent of Pakistanis told Pew researchers that they viewed U.S. drones negatively and the number who saw the United States as an enemy jumped from 64 percent in 2009 to 74 percent in 2012. No wonder so many were incensed at the smug indecency of President Obama’s comment at the May 2010 White House Correspondents’ Association dinner upon spotting the popular teenage Jonas Brothers band members in the audience. Referring to his daughters, Obama quipped, “Sasha and Malia are huge fans but, boys, don't get any ideas. Two words for you: predator drones. You will never see it coming.” In spring 2012, only 7 percent of Pakistanis held a positive view of Obama.
Drone usage would expand on Obama’s watch from Pakistan—the only country targeted when Bush left office—to six countries over the next three-plus years as the United States added Islamic rebels in the Philippines to the list in February 2012. Critics agreed with Tom Engelhardt’s astute observation that “drones . . . put wings on the Bush-era Guantanamo principle that Washington has an inalienable right to act as a global judge, jury, and executioner, and in doing so remain beyond the reach of any court of law.”
What was not appreciated until later was the direct, hands-on involvement of the president himself in targeting specific individuals who were put on official “kill lists.” In 2006, former vice president Al Gore expressed outrage over George Bush’s exercise of powers and wondered whether there were any limits to what presidents could do. Gore asked: “If the president has the inherent authority to eavesdrop on American citizens without a warrant, imprison American citizens on his own declaration, kidnap and torture, then what can’t he do?” Obama’s targeted assassinations provided a chilling answer. Glenn Greenwald warned that “the power to order people executed (including U.S. citizens) is far too extreme and dangerous to vest in one person without any checks, review, oversight or transparency.” After all, he reminded readers, “it was a consensus among Democrats that George Bush should be forced to obtain judicial review before merely spying on or detaining people, let alone ordering them executed by the CIA.”
For Engelhardt, drones were simply the latest in a long line of “wonder weapons guaranteed to ensure American military hegemony from atomic bombs to hydrogen bombs to the Vietnam-era electronic battlefield to Reagan’s missile defense shield to the First Gulf War’s ‘smart bombs.’” Doubt was cast upon their wonder-weapon status in late 2011, when Iranians displayed an RQ-170 Sentinel that they had brought down intact while it was spying over their territory. More than two dozen others had crashed up to that point, but none with so much fanfare or such embarrassing consequences.
Some expressed concern that the Iranians would reverse engineer the drone and learn its secrets. Dick Cheney demanded that Obama send planes to destroy the downed aircraft while it lay grounded. But it was too late. The cat was already out of the bag. More than fifty countries, some friendly and some hostile to the United States, had already purchased drones, and several had their own sophisticated drone programs in place. Most of those purchased were of the surveillance variety, but the United States had sold attack drones to close allies. In 2009, the United States punished Israel, which was second only to the United States in drone manufacture, for selling an attack drone to China. WikiLeaks revealed that Israel had angered U.S. authorities by selling advanced model drones to Russia. Among the other countries that claimed to have mastered manufacturing drones with lethal capabilities were Russia, India, and even Iran. In summer 2010, Iranian president Mahmoud Ahmadinejad displayed a model he called the “ambassador of death.”
Reports surfaced repeatedly of American soldiers who went over the line, gratuitously killing innocent civilians in Afghanistan as was earlier the case in Iraq. One twenty-year-old soldier, who went AWOL in Canada, described the process that led to the erosion of human empathy:
I swear I could not for a second view these people as anything but human. The best way to fashion a young hard dick like myself—“dick” being an acronym for “dedicated infantry combat killer”—is simple and the effect of racist indoctrination. Take an empty shell off the streets of L.A. or Brooklyn, or maybe from some Podunk town in Tennessee and these days America isn’t in short supply. I was one of those no-child-left-behind products. Anyway, you take this empty vessel and you scare the living shit out of him, break him down to nothing, cultivate a brotherhood and camaraderie with those he suffers with, and fill his head with racist nonsense like all Arabs, Iraqis, Afghans are Hajj. Hajj hates you. Hajj wants to hurt your family. Hajj children are the worst because they beg all the time. Just some of the most hurtful and ridiculous propaganda, but you’d be amazed at how effective it’s been in fostering my generation of soldiers.
One group of deranged young men formed a twelve-person “kill team” that murdered innocent Afghans and then staged evidence to make it look as if they acted in self-defense. One of the accused confessed to the murders. U.S. authorities were not pleased when photos of the soldiers posing with the corpses appeared in Der Spiegel.
The Taliban had actually done a good job keeping the drug trade under control when they were in power. But following the U.S. invasion, drugs had proliferated wildly. Opium production skyrocketed from 185 tons in 2001 to 8,200 tons in 2007, constituting 53 percent of the entire national economy and employing nearly 20 percent of the Afghan population. The drug lords lived in opulent carnival-colored mansions called “poppy palaces” that were distinguished by their non-Afghan “narcotecture” styles. But many Afghans suffered from the resulting drug abuse. In 2005, 920,000 addicts were reported. The number increased substantially after that.
Under Karzai, illicit drugs provided a steady flow of funds for the Taliban who taxed them at a rate of 10 percent and protected drug convoys for an additional fee. The Taliban also received hundreds of millions of dollars indirectly from the United States and NATO. Journalist Jean Mackenzie reported that in much of the country contractors factored at least a 20 percent cut on projects to the Taliban to let them proceed. One Afghan contractor reported, “I was building a bridge. The local Taliban commander called and said ‘don’t build a bridge there, we'll have to blow it up.’ I asked him to let me finish the bridge, collect the money—then they could blow it up whenever they wanted. We agreed, and I completed my project.”
In 2010, American officials paid $2.2 billion to U.S. and Afghan trucking companies to transport supplies to U.S. bases. These trucking companies hired security firms that were often linked to top government officials to protect the trucks for between $800 and $2,500 per truck. The security firms, in turn, often faked fights to magnify the need for their services and bribed the Taliban to let the trucks pass without attacking them, leading one NATO official in Kabul to complain, “We’re funding both sides of the war.”
Obama’s logic befuddled CNN commentator Fareed Zakaria: “If Al Qaeda is down to 100 men there at the most, why are we fighting a major war?” Citing the 100 NATO troops who had been killed the previous month and the $100 billion plus annual cost, he determined that the war was costing “more than one allied death for each living Al Qaeda member in the country in just one month” and “a billion dollars for every member of Al Qaeda thought to be living in Afghanistan in one year.” In response to those who justified the war because the Taliban were allies of Al-Qaeda, Zakaria observed, “this would be like fighting Italy in World War II after Hitler’s regime had collapsed and Berlin was in flames just because Italy had been allied with Germany.”
Jim Lacey of the Marine Corps War College made his own calculations, based on the 140,000 coalition soldiers, and determined the cost was actually $1.5 billion annually per Al-Qaeda member in Afghanistan. “Did anyone do the math?” Lacey wondered. “In what universe do we find strategists to whom this makes sense?”
Historian Andrew Bacevich pointed out the most glaring contradiction. If Afghanistan was really so critical to U.S. safety and security, which he considered “a preposterous notion,” “then why set limits on U.S. involvement there? ... Why not send 100,000 troops rather than 30,000? Why not vow to do ‘whatever it takes,’ rather than signal an early exit? Why not raise taxes and reinstate the draft . . . ? Why not promise ‘victory’—a word missing from the president’s address?”
Those who wondered what government forces were doing if not reading or fighting got some disturbing insight in January 2011, when the Afghan government signed an agreement with the UN to stop recruiting children into the police force and to ban the common and, according to the Washington Post, growing practice of using young boys as sex slaves. The New York Times reported that “as part of the Afghan tradition of bacha bazi, literally ‘boy play’, boys as young as nine are dressed as girls and trained to dance for male audiences, then prostituted in an auction to the highest bidder. Many powerful men, particularly commanders in the military and the police, keep such boys, often dressed in uniforms, as constant companions for sexual purposes.” In Afghanistan, bacha bazi had actually become rampant among the insurgent mujahideen during their U.S.-backed campaign to oust Soviet forces. It was most openly practiced around Kandahar, where, the Times noted, the “Taliban originally came to prominence . . . when they intervened in a fight between two pedophile warlords over the possession of a coveted dancing boy.” The Taliban had banned this practice when they were in power.
Joseph Stiglitz and Harvard public policy professor Linda Bilmes reported in 2010 that 600,000 of the 2.1 million who had served in Iraq and Afghanistan had sought medical treatment from the Department of Veterans Affairs and that 500,000 had applied for disability benefits, which was approximately 30 percent higher than initially estimated. With treatment of post-traumatic stress disorder (PTSD) and other health issues only increasing over time and life expectancy rising, they estimated that the real cost of both wars could top $4 trillion. Considering that 9/11 cost Al-Qaeda approximately $50,000, the multitrillion-dollar U.S. response was indeed playing into bin Laden’s goal of bankrupting the United States.
Given the fundamental illogic of fighting a decade-long and immensely costly war in Afghanistan in order to defeat a debilitated enemy that was based in Pakistan, some concluded the United States must have an ulterior motive. They found a possible answer in 2010, when the Pentagon announced that its team of geologists and other investigators had confirmed the existence of vast Afghan mineral resources. The Pentagon projected that Afghanistan could become the “Saudi Arabia of lithium,” a crucial ingredient in batteries for various electronic devices. London banker Ian Hannam, a mining expert with JPMorgan, went further, drooling over the prospect that “Afghanistan could be one of the leading producers of copper, gold, lithium, and iron ore in the world.” Petraeus, who would shortly replace McChrystal as commander of U.S. forces in Afghanistan, agreed. “There is stunning potential here,” he said. Afghan officials estimated mineral worth at $3 trillion, a truly staggering figure for a country whose gross domestic product was around $12 billion and whose economy consisted largely of narcotics and foreign aid.
One would be hard-pressed to know where to begin dissecting the distortions and debunking the myths, but, as we have shown throughout these pages, the notions of American altruism, benevolence, and self-sacrifice might be a good place to start, especially when combined with an explicit disavowal of interest in territories and resources. Obama identified America’s uniqueness among nations as its “willingness . . . to pay a great price for the progress of human freedom and dignity.” The wars, he absurdly claimed, had made the United States “stronger and the world more secure.” He compared the troops who had slaughtered hundreds of Iraqi civilians in Fallujah to the American colonists “who overthrew an empire” and to the World War II generation who “faced down fascism.” Perhaps he hadn’t seen the American-flag-burning jubilation of Fallujah’s crowds during their Day of Resistance and Freedom that commemorated the U.S. departure from Iraq. Perhaps he hadn’t read the accounts by U.S. marines of wanton and often indiscriminate killing of Iraqi civilians, including women and children, in Haditha and elsewhere throughout the country. Perhaps he hadn’t seen the explanation offered by the commander of U.S. forces in Anbar Province for why he didn’t investigate U.S. troops’ random killing of 24 Iraqi civilians in Haditha. “It happened all the time ...,” he explained, “throughout the whole country.” And in what was either the most contemptible lie since the early days of the Bush administration or unfortunately sloppy and careless phrasing, Obama congratulated the troops for having “fought for the same principles in Fallujah and Kandahar, and delivered justice to those who attacked us on 9/11,” adding credence to the Bush-Cheney fabrication that the invasion of Iraq was somehow justified by Saddam’s support for Al-Qaeda and perpetuating the dangerous illusion that the occupation of either country, as of 2011, had anything at all to do with that initial Al-Qaeda attack.
In fact, when criticizing repressive Middle Eastern regimes, Obama pointedly omitted mention of Saudi Arabia, whose reactionary monarchy the United States had propped up for six decades in return for Saudi oil. Saudi Arabia had long been the largest purchaser of advanced U.S. weaponry. The Wall Street Journal projected that the sale Obama approved in 2010 might top $60 billion. Now with the Saudis helping thwart democratic reform throughout the region, intervening politically, monetarily, and, in the case of Bahrain, even militarily, the United States proved itself an unreliable ally for those seeking progressive change.
The nation faced a quandary. The post-Cold War world refused to play by its rules. Neither its unprecedented military strength nor its overwhelming economic power translated into an ability to bend history in the ways U.S. leaders desired. The world seemed to be spinning increasingly out of U.S. control. Nothing symbolized this more than the rise of China, with its 1.3 billion people, booming economy (almost 40 percent of which remained state-owned), and authoritarian Communist Party-controlled political system. China’s economic growth, while extraordinary under any circumstances, stood out even more starkly when measured alongside U.S. economic stagnation and decline. In 2011, China’s per capita GDP, though still only 9 percent that of the United States, was double what it had been four years earlier. And Chinese leaders were projecting another doubling in the next four years. China had already replaced Japan as the world’s second largest economy, having jumped remarkably from number seven in 2003. Indicative of future prospects, the Urban Land Institute and Ernst & Young reported that China devoted 9 percent of its GDP to infrastructure—more than triple the portion invested by the United States.
China’s new economic clout in October 2011 came into sharp relief when Europe asked China for help in saving the euro, inviting it to invest tens of billions of dollars in Europe’s emergency stability fund. China was, in effect, being asked to assume the role long played by the United States as the world’s financial leader. China had already bought up key economic assets in Europe, which had become China’s largest trading partner. Although China balked at investing so heavily at a time when Europe’s economic situation remained so precarious, the significance was unmistakable, especially coming just weeks after Timothy Geithner’s advice to a gathering of European finance ministers had been so rudely dismissed. The New York Times aptly titled its front-page article “Advice on Debt? Europe Suggests U.S. Can Keep It.”
Secretary of State Hillary Clinton had thrown down the gauntlet with an article in the November 2011 issue of Foreign Policy magazine bluntly titled “America’s Pacific Century.” The article began, “As the war in Iraq winds down and America begins to withdraw its forces from Afghanistan, the United States stands at a pivot point.” The dramatic change she heralded would be “a substantially increased investment—diplomatic, economic, strategic, and otherwise—in the Asia-Pacific region,” which included the Indian Ocean as well as the Pacific.
Obama reinforced that message during his eight-day trip to the Pacific later that month. He informed the Australian Parliament, “In the Asia-Pacific century, the U.S. is all in. ... I’ve therefore made a deliberate and strategic decision—as a Pacific nation the United States will play a larger and long-term role in shaping this region and its future.” “The United States is a Pacific power, and we are here to stay,” he said, even predicting the downfall of the Chinese Communist Party. Impending cuts in U.S. defense spending, he assured the Aussies, “will not—I repeat, will not—come at the expense of the Asia-Pacific.” Proving his point, Obama announced that the United States would deploy 2,500 marines to Australia in what amounted to the first long-term U.S. troop increase in Asia since Vietnam, reversing decades of steady decline. The increase would come on top of the 85,000 troops the United States already had in the Pacific, where seven of its eleven aircraft carriers and eighteen nuclear submarines were based.
U.S. hegemonic pretensions remained lofty, but U.S. ability to police Asia and the rest of the globe was constrained by the dimensions of its budget crisis. By 2010, the U.S. was spending $1.6 trillion over revenues in its $3.8 trillion budget. The shortfall was borrowed largely from China and Japan. Debt service alone cost $250 billion. The military budget, including black operations, intelligence, foreign military aid, private contractors, and veterans’ benefits, totaled over $1 trillion. Christopher Hellman of the National Priorities Project calculated that the U.S. actually spent over $1.2 trillion of its $3 trillion annual budget on “defense” when all military- and security-related expenses were factored in.
That figure approximately equaled what the rest of the world spent. Even during the height of the Cold War, the United States spent only 26 percent of the world total. As Congressman Barney Frank observed, “We have fewer enemies and we’re spending more money.” U.S. military spending consumed approximately 44 percent of all U.S. tax revenues. Maintaining bases cost approximately $250 billion. Hiring the Pentagon’s vast army of private contractors, which, according to the Washington Post, totaled 1.2 million people, cost almost as much. New and costly high-tech weapons systems added to the burden. Did all this spending make Americans safer? Frank commented, “I don’t think any terrorist has ever been shot by a nuclear submarine.”
While cutting defense spending, pulling combat forces out of Iraq, and beginning the drawdown in Afghanistan represented a welcome retreat from the hypermilitarism of the Bush-Cheney years, they did not represent the sharp and definitive break with empire that the world needed to see from the United States and that Obama had been encouraged to pursue by the man who had engineered the end of the Soviet Empire: Mikhail Gorbachev. Gorbachev had pressed Obama to pursue the kind of bold initiatives that had allowed Gorbachev to change the course of history. “America needs perestroika right now . . . ,” he said in 2009, “because the problems he has to deal with are not easy ones.” Gorbachev called for ending the kind of unregulated free market policies that caused the global economic downturn and perpetuated the gap between the world’s rich and poor. The United States, he warned, can no longer dictate to the rest of the world. “Everyone is used to America as the shepherd that tells everyone what to do. But this period has already ended.” He condemned the Clinton and Bush administrations’ dangerous militarization of international politics and urged the United States to withdraw from Afghanistan as Russia had done over twenty years ago when Gorbachev inherited a similarly disastrous and unpopular war.
In what would presage a remarkable turnaround if it continues, even Barack Obama began showing faint signs of returning to the transformational figure he had appeared to be during the 2008 campaign. Spurred by the Occupy Wall Street movement’s success in getting out its message, continued Republican intransigence, economic stagnation, budgetary constraints, and tumbling approval ratings, by late 2011 Obama appeared to be regaining some of his old dynamism. Traces of populism crept into his speeches. He openly embraced ending the Iraq War and cutting defense spending, even though both developments had been forced upon him. Was there a chance that he might be undergoing a Kennedyesque road-to-Damascus conversion and realizing how poorly American militarism and imperialism had served the American people and the rest of the world? The prospects looked dim, and his Fort Bragg speech and willingness to sign the extremely dangerous 2012 Defense Authorization bill were not encouraging. What had become apparent was that the real hope for changing the United States—for helping it regain its democratic, egalitarian, and revolutionary soul—lay in U.S. citizens joining with the rebellious masses everywhere to deploy the lessons of history, their history, the people’s history, which is no longer untold, and demand creation of a world that represents the interests of the overwhelming majority, not those of the wealthiest, greediest, and most powerful. Building such a movement is also the only hope to save American democracy from the clutches of an ever-expanding, ever-encroaching national security state. Such tyranny was a threat that America’s revolutionary leaders understood very well. When a woman asked Benjamin Franklin in 1787, after the Constitutional Convention, “Well, Doctor, what have we got—a republic or a monarchy?” Franklin responded with words as timely today as when he uttered them, “A republic, Madam, if you can keep it.”
Krauthammer didn’t wait for the crushing of Saddam Hussein’s Baathist regime to marvel at America’s unparalleled might. He published an essay in late 2002 in which he revisited his 1990 embrace of unipolarity and admitted that he had underestimated the vastness of U.S. domination. “Nothing has ever existed like this disparity of power; nothing,” he bellowed. Kosovo had been good. But the invasion of Afghanistan was a thing of true beauty—confirmation that the U.S. was far and away the greatest hegemon in all of history. There would be no more hesitation, no more concern with international legality, no more obsession with exit strategies. In 1990, he had written that “unipolarity could last thirty or forty years,” which “seemed bold at the time.” But now that projection seemed “rather modest” because “the unipolar moment has become the unipolar era” and could go on indefinitely. “The choice is ours,” he concluded. “To impiously paraphrase Benjamin Franklin: History has given you an empire, if you can keep it.”
In 2003, neocon triumphalism was unbounded. Talk of empire filled the air. Neocon strategists debated which countries were in line for regime change. Iraq, Syria, Libya, Iran, North Korea, Lebanon, Somalia, and Sudan made the lists. Commentary editor Norman Podhoretz also threw in Saudi Arabia, Egypt, and the Palestinian Authority for good measure.
In February 2018, former NATO secretary general Javier Solana echoed the growing realization that “multipolarity is back, and with it strategic rivalry among the great powers.” He noted that “the reemergence of China and the return of Russia to the forefront of global politics are two of the most salient international dynamics of the century thus far.” Solana observed that “during Donald Trump’s first year in the White House, the tension between the United States and these two countries increased markedly. As the US domestic political environment has deteriorated, so, too, have America’s relations with those that are perceived as its principal adversaries.”
“The process of creating a polycentric world order is an objective trend,” Russian foreign minister Sergei Lavrov told the United Nations General Assembly during “leaders week” in September 2017. “This is something,” he declared pointedly, “that everyone will need to adapt to, including those who are used to lording it over others.” The sentiment was seconded by Lavrov’s Chinese counterpart, Wang Yi, who told the delegates, “We live in an era that’s defined by a deepening trend toward a multipolar world . . . that is witnessing profound changes in the international landscape and balance of power.” He called for the UN to play a central role in this transformation “so countries can be equal, so countries can run global affairs together.”
The Trump administration’s policies toward China and Russia have been erratic at best, but relations with both regional powers had already soured under Obama. The two former communist behemoths, which had been antagonists more often than allies over the past 60 years, moved closer together. China had become Russia’s leading trade partner, accounting for 15 percent of Russia’s trade in 2017, with the total expected by Russia to reach $100 billion in 2018. China recently agreed to increase oil imports from Russia by 50 percent. Ties were also manifest in joint naval exercises, ostensibly in response to U.S. military exercises with South Korea, mutual opposition to the U.S. use of punitive sanctions, and even the launching of a new kind of “red tourism” in which visitors would tour sites crucial to early communist history in both countries.
In 1997, Zbigniew Brzezinski had warned that such a “grand coalition of China, Russia, and perhaps Iran, an ‘antihegemonic’ coalition united not by ideology but by complementary grievances,” would be “the most dangerous scenario” for American security interests. It would, he continued, “be reminiscent in scale and scope of the challenge once posed by the Sino-Soviet bloc, though this time China would likely be the leader and Russia the follower.”
It is the pace of Chinese economic expansion that is truly stunning. As historian Al McCoy notes, the 2012 National Intelligence Estimate indicates that during Britain’s period of national ascent, from 1820 to 1870, it raised its share of global GDP by 1 percent. Between 1900 and 1950, the U.S. increased its share by 2 percent, and Japan, between 1950 and 1980, raised its share by 1.5 percent. China’s share of global GDP, by comparison, jumped a remarkable 5 percent between 2000 and 2010 and may do so again between 2010 and 2020. The World Bank described China’s rate of growth as “the fastest sustained expansion by a major economy in history.” PricewaterhouseCoopers released economic growth projections in February 2017 and predicted that in 2050, China would be the world’s largest economy, followed by India, the U.S., Indonesia, Brazil, and Russia.
Nowhere was this more evident than in venture capital, which the U.S. had long dominated. In 2008, China had accounted for only 5 percent of venture capital spending in global start-ups. But with what CNBC described as a “tidal wave of Chinese cash into promising new start-ups,” China’s investment in R&D was increasing at a rate of 18 percent a year, putting it on track to overtake the U.S. in R&D spending by 2019.
Unwilling to give up the dominance it had enjoyed since the end of the Second World War, the United States moved to reassert its regional hegemony and contain China. In 2011, Hillary Clinton proclaimed the twenty-first century “America’s Pacific Century,” announcing the U.S. “pivot” from the Middle East and Europe to Asia, a strategy that Obama quickly confirmed.
Trump also doubled drone strikes in Afghanistan in 2017. The U.S. invasion of that impoverished nation, which began in 2001, had earned the dubious distinction of being the longest war in American history. In a desperate attempt to project strength, the Trump administration, in April 2017, dropped the twenty-one-thousand-pound GPS-guided GBU-43—the “mother of all bombs”—on an underground ISIS compound in Nangarhar Province, igniting, in the words of historian Jeremy Kuzmarov, “a flammable fuel mist capable of obliterating the equivalent of nine city blocks while creating a Hiroshima-like mushroom cloud.” Gersh Kuntzman of the New York Daily News decried the resulting media coverage as a “bloodlust” akin to “death porn” in its “disgusting” celebratory, voyeuristic fetishizing of the $16 million super-weapon. Former Afghan president Hamid Karzai tweeted afterward that “this is not the War on Terror but the inhuman and most brutal misuse of our country as a testing ground for new and dangerous weapons.”
Trump took pains to differentiate his Afghan policy from Obama’s, declaring “You’ll see there’s a tremendous difference, tremendous difference.” In reality, the situation had only grown more desperate. RAND political scientist Laurel Miller, former acting State Department special representative for Afghanistan and Pakistan, acknowledged in summer 2017, “I don’t think there is any serious analyst of the situation in Afghanistan who believes that the war is winnable.” By early 2018, the Taliban were operating openly in 70 percent of the country. Government control had shrunk to only about 30 percent of the country, down from 72 percent in 2015.
Both countries began feeling the pinch of Trump’s trade war. Cui Tiankai, China’s ambassador to the U.S., pointed out that in 2015, trade with China had lowered prices for U.S. consumers by up to 1.5 percent, representing an average saving of $850 per family. Walmart, for example, got 80 percent of its goods from Asia, with much of that coming from China. Even before the trade war with China had begun to take effect, business leaders representing the U.S. Chamber of Commerce and fifty-one trade groups had implored legislators to block Trump’s reckless actions by passing a bill requiring Congress to approve any new tariffs the president imposed.
The Corker-Kaine fix, however, would make for little, if any, improvement. The bill would give the president the authority to use “all necessary and appropriate force” against Iraq, Afghanistan, Syria, Yemen, Libya, Somalia, Al-Qaeda, ISIS, the Taliban, Al-Shabaab, the Haqqani Network, and “associated forces.” The Center for Constitutional Rights argued that the “proposed AUMF effectively rewrites the Constitution by usurping Congress’ war authorization power and handing it to the executive branch” and “allows the president to wage war against six enumerated groups and add new groups in the future, all without any geographic or time constraints.” Congressional representatives Barbara Lee and Walter Jones (R-NC) sent a bipartisan letter, cosigned by forty-nine lawmakers, to the Senate Foreign Relations Committee condemning the Corker-Kaine war authorization. Lee wrote that she “fear[ed] that the . . . proposal would further limit congressional oversight of our perpetual wars. Replacing one blank check with another even broader one is a recipe for disaster.” The New York Times agreed that the authorization was “too broad” and that it “could bless military operations in perpetuity.” Legal scholar Marjorie Cohn also warned that the new bill would allow the president “to lock up Americans who dissent against U.S. military power.”
Understanding that the situation was rife with danger, Obama responded cautiously to the start of hostilities. But in August 2011, amid mounting condemnation of Syrian government treatment of protesters, he buckled to pressure from Senate hawks and human rights groups and announced his support for Assad’s ouster and for new sanctions on Syria, which Secretary of State Hillary Clinton said would “strike at the heart of the regime.” France, Germany, and Britain quickly climbed on board the regime change bandwagon. At the point when U.S. support for the insurgency began, the Washington Post reported that “hundreds” of Syrian civilians had been killed. The opposition, according to the New York Times, was still “barely organized.” But U.S. actions gave the insurgents hope. As one leading activist enthused, “With what Obama said, it’s going to become widespread now. I think we’re going to see lots of people taking to the streets.”
The uprising did indeed spread, but regime opponents were suffering tremendous losses at the hands of Syrian troops. Hillary Clinton, CIA director David Petraeus, and Leon Panetta pressured Obama to do more in the summer of 2012. They supported establishing a no-fly zone in Syria, a position Clinton recklessly maintained throughout the 2016 campaign despite opposition from many military leaders and explicit warnings from Dunford that establishing such a zone would “require us to go to war against Syria and Russia.” Retired navy officer John Kuehn, who had flown no-fly missions over Bosnia and Iraq, said he hoped Clinton was just “politically posturing,” because “if she is not politically posturing, it’s going to be a disaster.” Even candidate Trump understood enough to assert that Clinton’s proposed “safe zones” would “lead to World War Three.”
Whereas U.S. involvement was covert and, under international law, illegal, Russian involvement came at the invitation of the Syrian government. The rebels proved no match for the Russian bombing campaign that began in 2015 and made short work of both the Nusra Front and its U.S.-trained allies. As the Syrian government, aided by Russia, Iran, and Hezbollah, continued making gains on the battlefield, talk of regime change largely disappeared and the U.S. limited its objectives to defeating ISIS and preventing the use of chemical weapons. Turkey, which had initially joined the U.S. in demanding Assad’s ouster, also had a change of heart, now seeing the main threat as coming from the Kurds, who, with U.S. backing, had seized a large part of northern Syria. That shift created a rift between the U.S. and Turkey, two NATO allies, and deepened ties between Turkey and Russia. Trump, who explicitly disavowed regime change during the 2016 campaign, pulled the plug on Timber Sycamore in 2017, complaining that CIA-supplied weapons were ending up in Al-Qaeda hands. He tweeted that he was ending “massive, dangerous, and wasteful payments to Syrian rebels fighting Assad.” Many in Congress cheered his decision. Syrian government forces backed by Russian aircraft continued to mop up the remaining rebel strongholds, having taken the besieged city of Aleppo in December 2016.
Expressing a view rarely heard on mainstream media, Columbia professor and international development expert Jeffrey Sachs placed the blame for the Syria debacle squarely on U.S. shoulders and urged a speedy withdrawal. Appearing on Morning Joe on MSNBC on April 11, 2018, Sachs defiantly asserted:
This is a U.S. mistake that started seven years ago when President Obama said “Assad must go.” ... The CIA and Saudi Arabia together in covert operations tried to overthrow Assad. It was a disaster. Eventually, it brought in ... ISIS. ... It also brought in Russia. ... And so we have made a proxy war in Syria. It’s killed 500,000 people, displaced 10 million. ... So what I would plead to President Trump is get out like his instinct told him. That was his instinct, but then all the establishment—the New York Times, the Washington Post, the Pentagon—everybody said no, no that’s irresponsible. But his instinct is right—get out. We’ve done enough damage. Seven years. And now we really risk a confrontation with Russia that is extraordinarily dangerous. Reckless.
In September, John Bolton, doing his best Dirty Harry tough-guy impersonation, snarled menacingly at Iran, “Let my message today be clear: we are watching, and we will come after you.” Bolton added, “If you cross us, our allies, or our partners; if you harm our citizens; if you continue to lie, cheat, and deceive, yes, there will indeed be hell to pay.” He also threatened “terrible consequences” for anyone who continued doing business with Iran, warning, “We do not intend to allow our sanctions to be evaded by Europe or anybody else.”
Colonel Larry Wilkerson had seen this all before. Months earlier, he had realized that Trump was taking a page from the George W. Bush playbook on Iraq in preparing for war with Iran. Wilkerson regretted having helped make the meretricious case for the 2003 invasion as Secretary of State Colin Powell’s chief of staff. Alarmed by the parallels, he wrote a powerful op-ed in the New York Times in hopes of waking up the American people before it was too late. Wilkerson cited Nikki Haley’s mendacious presentation of “undeniable” evidence that Iran was not complying with UN resolutions on ballistic missiles and Yemen, comparing it to Powell's “astonishing[ly] ... similar” and equally contemptible presentation fifteen years earlier. He deplored the “politicization of intelligence” that was again occurring and the fact that “news organizations have largely failed to refute false narratives coming out of the Trump White House on Iran.” Wilkerson concluded:
As I look back at our lock-step march toward war with Iraq, I realize that it didn’t seem to matter to us that we used shoddy or cherry-picked intelligence; that it was unrealistic to argue that the war would “pay for itself,” rather than cost trillions of dollars; that we might be hopelessly naïve in thinking that the war would lead to democracy instead of pushing the region into a downward spiral. The sole purpose of our actions was to sell the American people on the case for war with Iraq. Polls show that we did. Mr. Trump and his team are trying to do it again. If we’re not careful, they’ll succeed.
Wilkerson did see one big difference between Bush’s war with Iraq and Trump’s looming war with Iran. Fighting Iran would make the invasion of Iraq look like a walk in the park or a weekend in Mar-a-Lago: “War with Iran, a country of almost 80 million people whose vast strategic depth and difficult terrain make it a far greater challenge than Iraq, would be 10 to 15 times worse than the Iraq war in terms of casualties and costs.”
Trump’s erratic behavior and hostile, if mixed, messaging reinforced North Korean leaders’ fears of a U.S. attack and strengthened their desire to showcase their growing nuclear capability. Memories of the Korean War had never abated in the North and were kept deliberately alive by leaders whose legitimacy and demand for extraordinary sacrifice depended largely on having protected the people from further American carnage. Reminders of the horrific wartime suffering at American hands abound. The Victorious Fatherland Liberation War Museum in Pyongyang, for example, decries the “US imperial aggressors” who perpetrated the “most brutal slaughter of people ever known in history.” General Curtis LeMay, head of the Strategic Air Command during the war, said that the U.S. “targeted everything that moved in North Korea,” causing the deaths of 20 percent of the population. This fact is conveyed to North Koreans from the time they are schoolchildren and reinforced throughout their lives. Historian Bruce Cumings told Newsweek that “most Americans are completely unaware that we destroyed more cities in the North than we did in Japan or Germany during World War II. ... Every North Korean knows about this, it’s drilled into their minds. We never hear about it.” South Koreans, on the other hand, are taught a fundamentally different version of history, one that exonerates the U.S. for imposing the dictatorial Syngman Rhee and empowering wartime Japanese collaborators, who repressed and brutalized the South Korean population for decades. The U.S. wartime bombing that also pulverized South Korea has been largely forgotten. What is remembered is South Korea’s more recent economic prosperity, which, Su-kyoung Hwang wrote in Korea’s Grievous War, “is the most commonly cited factor used to justify the U.S. role in the Korean War and the continued presence of U.S. troops.”
Further efforts to reach agreement during the Bush and Obama years failed, although North Korea repeatedly stated its willingness to halt its nuclear program if a deal could be struck guaranteeing its security. When the younger, Swiss-educated Kim replaced his father in 2011, many hoped for liberalization of the dictatorial regime. But, having learned from the fate of Saddam Hussein in Iraq and Muammar Gaddafi in Libya, Kim accelerated the pace of North Korea’s missile and nuclear programs. Obama, largely eschewing diplomacy, responded with an aggressive cyber warfare strategy to slow the North’s progress.
North Korea’s accelerated efforts yielded fruit. On July 4, 2017, the Democratic People’s Republic of Korea (DPRK) tested a Hwasong-14 ICBM that it claimed could “reach anywhere in the world,” including the mainland United States. On August 5, the UN Security Council unanimously imposed new sanctions on North Korea, banning the export of North Korean coal, lead, iron, and seafood, exports that accounted for approximately $1 billion of North Korea’s $3 billion in export earnings. North Korea accused the U.S. of trying to “strangle” it and threatened a sharp response.
The U.S. intelligence community reported that North Korea possessed as many as sixty nuclear weapons and had developed a miniaturized nuclear warhead that could fit atop a missile. When the North Koreans threatened to fire missiles at Guam, Trump erupted, warning of “fire and fury like the world has never seen.” “Military solutions are now fully in place, locked and loaded, should North Korea act unwisely,” he tweeted.
Among those demanding military action was soon-to-be national security advisor John Bolton. On Sean Hannity’s Fox News show, he called for preemptive strikes “before North Korea has dozens and dozens of nuclear warheads on ballistic missiles that can hit the United States.” He declared on Secure Freedom Radio on August 10 that “you eliminate the nuclear threat by eliminating North Korea.”
Still, neither Russia nor China, both of which border on North Korea, wants to see U.S.-backed regime change in the DPRK or war of any sort on the peninsula. They have pressured both sides to find a peaceful path forward, advocating a “freeze for freeze” agreement in which the U.S. and South Korea would halt their joint military exercises and North Korea would put an end to its nuclear and missile tests. China made clear that if the U.S. and South Korea tried “to overthrow the North Korea regime . . . China will prevent them from doing so.” But if North Korea provoked the U.S. and South Korea into taking military action, China would remain neutral. The U.S. had ignored China’s red line in 1950 with disastrous consequences.
The first break in the diplomatic impasse came in 2016, when the South Korean people rose up in the sustained peaceful protests known as the Candlelight Revolution against South Korean President Park Geun-hye’s corrupt and bitterly anticommunist regime and replaced the conservative leader with Moon Jae-in, a prodemocracy human rights lawyer who promised friendship with the North. As many as one in three South Koreans may have participated in the uprising.
Moon immediately reached out to Kim. Trump accused the South Korean president of “appeasement.” He had earlier excoriated Secretary of State Tillerson for “wasting his time” trying to negotiate with Kim. Ignoring Trump, Moon again extended an olive branch to North Korea’s leaders, insisting: “We will not give up our goal of working together with allies to seek a peaceful denuclearization of the Korean Peninsula.”
With the late November North Korean launch of a Hwasong-15 ICBM capable of reaching anywhere in the U.S., the DPRK declared itself a “complete” nuclear state. It had achieved its goal. So when Kim announced in his New Year’s Day 2018 nationally televised address that “the whole territory of the U.S. is within the range of our nuclear strike and a nuclear button is always on the desk of my office,” it had some teeth to it. Trump responded predictably that his nuclear button was “much bigger & more powerful . . . and my Button works!”
The New York Times reported on U.S. military planning for a strike. Then, on January 13, Hawaiians had a terrible scare when a state employee sent out a statewide emergency alert warning that a ballistic missile was heading for Hawaii. The alert read “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.” Under the circumstances, most assumed it was a nuclear attack by North Korea. Panic set in. Alia Wong described the terrified response: “Many assumed they would die, but sought shelter anyway. They took cover in mall bathrooms, bathtubs, drug stores—even a storm drain. Hawaii has very few shelters, and houses with basements are rare. There were reports of people speeding down highways and running red lights to reunite with family members. Others called one another to say ‘I love you’ one last time.” It took thirty-eight minutes before the state issued an official clarification.
Jack Matlock, who served as U.S. ambassador to the Soviet Union between 1987 and 1991, understood how differently the world looked from the Russian vantage point. He wrote in 2014, “President Bill Clinton supported NATO’s bombing of Serbia without U.N. Security Council approval and the expansion of NATO to include former Warsaw Pact countries. Those moves seemed to violate the understanding that the United States would not take advantage of the Soviet retreat from Eastern Europe. The effect on Russians’ trust in the United States was devastating. In 1991, polls indicated that about 80 percent of Russian citizens had a favorable view of the United States; in 1999, nearly the same percentage had an unfavorable view.”
“Vladimir Putin,” Matlock continued,
was elected in 2000 and initially followed a pro-Western orientation. When terrorists attacked the United States on Sept. 11, 2001, he was the first foreign leader to call and offer support. He cooperated with the United States when it invaded Afghanistan, and he voluntarily removed Russian bases from Cuba and Cam Ranh Bay in Vietnam. What did he get in return? Some meaningless praise from President George W. Bush, who then delivered the diplomatic equivalent of swift kicks to the groin: further expansion of NATO in the Baltics and the Balkans, and plans for American bases there; withdrawal from the Anti-Ballistic Missile Treaty; invasion of Iraq without U.N. Security Council approval; overt participation in the ‘color revolutions’ in Ukraine, Georgia and Kyrgyzstan; and then, probing some of the firmest red lines any Russian leader would draw, talk of taking Georgia and Ukraine into NATO. Americans, inheritors of the Monroe Doctrine, should have understood that Russia would be hypersensitive to foreign-dominated military alliances approaching or touching its borders.
Under Obama, the U.S. provided military equipment and training to the Ukrainian army and national guard but denied Ukraine lethal weaponry, realizing that such arms would only escalate the conflict and, even worse, might lead to war. Candidate Clinton broke with Obama on the issue, signaling that she would send such weapons. Trump not only challenged Clinton’s view; his representatives also stripped the plank supporting lethal weapons from the Republican platform.
But that caution disappeared in December 2017, when the Trump administration authorized the sale of Javelin antitank missiles and sniper rifles, which Senate Foreign Relations Committee chairman Bob Corker (R-TN) approvingly described as “defensive lethal arms.” An unnamed “senior congressional official” told the Washington Post, “We have crossed the Rubicon, this is lethal weapons and I predict more will be coming.” The following March, the State Department formally approved the $47 million sale of weapons provided by Raytheon and Lockheed Martin.
Ukraine has been in the U.S./EU orbit for more than four years, during which conditions have gone from bad to worse as increased military spending has only exacerbated Ukraine’s problems with corruption. Neo-Nazis, led by the three-thousand-plus strong Azov militia, have almost free rein, with twenty thousand marching in the streets of Kiev, wielding torches and shouting anti-Semitic slogans. In February 2018, the New York Times reported: “The problem [of corruption] has throttled the hopes raised in February 2014 by the ouster of Ukraine’s notoriously corrupt, pro-Russian former president, Viktor F. Yanukovych. It has also left the country's dispirited Western backers and many Ukrainians wondering what, after two revolutions since independence in 1991, it will take to curb the chronic corruption.” Among those on the take were President Poroshenko and his friends.
Former U.S. ambassador Matlock believed that Russia had a solid case in Ukraine. Matlock wrote:
Russians would argue, with some substance in the argument, that the U.S. is interested in territorial integrity only when its interests are served. American governments have a record of ignoring it when convenient, as when it and its NATO allies violated Serbian territorial integrity by creating and then recognizing an independent Kosovo. Also, by supporting the separation of South Sudan from Sudan, Eritrea from Ethiopia, and East Timor from Indonesia. So far as violating sovereignty is concerned, Russia would point out that the U.S. invaded Panama to arrest Noriega, invaded Grenada to prevent American citizens from being taken hostage (even though they had not been taken hostage), invaded Iraq on spurious grounds that Saddam Hussein possessed weapons of mass destruction, targets people in other countries with drones, etc., etc. In other words, for the U.S. to preach about respect for sovereignty and preservation of territorial integrity to a Russian president can seem a claim to special rights not allowed others.
Obama’s deputy national security advisor Ben Rhodes understood the Russian logic: “Putin is not entirely wrong . . . we engaged in regime change around the world. There is just enough rope for him to hang us.” Actually, there was more than enough rope to hang the United States when it came to ousting governments, including democratically elected ones, as we have shown time after sordid time throughout the book.
Award-winning journalist Robert Parry, who had worked tirelessly for years to combat false historical narratives, was deeply troubled by the media distortion of what was occurring in Ukraine. He wrote sagaciously, “If you wonder how the world could stumble into world war three—much as it did into world war one a century ago—all you need to do is look at the madness that has enveloped virtually the entire US political/media structure over Ukraine where a false narrative of white hats versus black hats took hold early and has proved impervious to facts or reason.”
“The Maidan shifted a gear,” Ben Rhodes told the Atlantic’s Julia Ioffe. “Putin went on offense after the Maidan. The gloves were off, in a way. To Putin, Ukraine was such a part of Russia that he took it as an assault on him.”
Ioffe similarly connected the dots: “Regime change in Libya and Ukraine led to Russia propping up Bashar al-Assad in Syria. ‘Not one more’ is how Jon Finer, former Secretary of State John Kerry’s chief of staff, characterizes Putin’s approach in Syria. It also led inexorably to Russian meddling in the U.S. election: Russia would show the U.S. that there was more than one regime-change racket in town.”
In The World as It Is: A Memoir of the Obama White House, and in an interview with Time, Rhodes elaborated on the connection between U.S. actions in Ukraine and Russian intervention in the 2016 elections. Rhodes explained,
Putin I think interpreted the protest that ousted the Ukrainian president as the Americans coming into Russia. He thought we were behind those protests. He drew no distinction between Ukraine and Russia. I’d sit in the Oval Office and listen to these endless conversations where Putin would steer everything back to his view that we had overthrown a democratically elected government in Yanukovych. He was gonna punch back ... that’s when we started to see fake news and we started to see the release of intercepted communications to suit Russia’s political narratives. So all of that set the stage for what happened in our election. ... You cannot understand the Russian meddling in the 2016 election without understanding the Russian response to Ukraine. It was all the same tools. It was information warfare essentially that they had developed in Ukraine, just set loose in the United States.
One senior Pentagon official dismissed the tired and oft-told tale of American military weakness: “This is the ‘Chicken-Little, sky-is-falling’ set in the Army. These guys want us to believe the Russians are 10 feet tall. There’s a simpler explanation: The Army is looking for a purpose, and a bigger chunk of the budget. And the best way to get that is to paint the Russians as being able to land in our rear and on both of our flanks at the same time. What a crock.” Military analyst Mark Perry also thought the military weakness argument absurd and offered the following breakdown:
The United States spends seven times the amount of money on defense as Russia ($598 billion vs. $84 billion), has nearly twice the number of active duty personnel (1.4 million vs. 766,000), just under six times as many helicopters (approximately 6,000 vs. 1,200), three times the number of fighters (2,300 vs. 751) and four times the total number of aircraft. We have 10 aircraft carriers, the Russians have one. And while it’s true that the Russians field nearly twice the number of tanks as the U.S. (15,000 vs. 8,800), their most recent version, the T-14 Armata, broke down during the 2015 Moscow May Day Parade. America’s M1A1 Tank, on the other hand, has never been defeated in battle. Ever.
By 2018, the United States was actually spending twelve times as much on defense as Russia. Trump’s 2018 increase alone was more than Russia’s entire military budget. And Perry’s comparison would have been far more lopsided if he had taken into account total U.S. military spending, which would also include intelligence, homeland security, veterans’ benefits, and Department of Energy expenditures.
In late March 2016, the army dignified the assertions of American weakness and announced plans to send another brigade of U.S. troops to Europe. Retired lieutenant colonel Daniel Davis scoffed at this escalation, insisting it would just give Putin an excuse “to spend more money on his own military. ... This is all very predictable. He’ll up the ante and the Army will say ‘See, we don't have enough troops.’ So here we go again.”
But there was also a part of Trump’s [presidential campaign] message that made sense. Trump called for “a new foreign policy that finally learns from the mistakes of the past” and said, “We will stop looking to topple regimes and overthrow governments.” He promised to “partner with any nation that is willing to join us in the effort to defeat ISIS and radical Islamic terrorism. ... In our dealings with other countries, we will seek shared interests wherever possible and pursue a new era of peace, understanding, and good will.” U.S. policy toward the Middle East, he rightly noted, had “done a tremendous disservice not only to the Middle East—we’ve done a tremendous disservice to humanity. The people that have been killed, the people that have been wiped away—and for what?” he asked. “It’s a mess. The Middle East is totally destabilized, a total and complete mess. I wish we had the $4 trillion or $5 trillion. I wish it were spent right here in the United States on schools, hospitals, roads, airports, and everything else that are all falling apart!”
Trump was particularly unsparing when it came to the U.S. invasion of Iraq. On February 17, 2016, he told the audience at a town hall event in Bluffton, South Carolina: “I tell the truth about Iraq. I say the war was a disaster. We spent $2 trillion. Lost thousands of lives, thousands of lives. We have wounded warriors all over the place ... we should have never been there. ... It’s one of the worst decisions in the history of the country. We have totally destabilized the Middle East.” Trump pointed out the obvious truth that Iran was the big winner: “Now Iran is taking over Iraq. Just as sure as you’re sitting there. We don’t even have anything to do with Iraq anymore. We’re gone. But think of it. We spent $2 trillion. Could have rebuilt our country. We could have done so much with that money. And instead, we’re worse in the Middle East than we were 15 years ago. Right now, it’s a disaster.”
Perhaps most troubling to the foreign policy establishment in both parties were Trump’s views on Russia. In his first major foreign policy address, in April 2016, he stated, “We desire to live peacefully and in friendship with Russia and China. Russia, for instance, has also seen the horror of Islamic terrorism. I believe an easing of tensions, and improved relations with Russia from a position of strength only is possible, absolutely possible. Common sense says this cycle, this horrible cycle of hostility must end and ideally will end soon. Good for both countries.” In June he asked, “Wouldn’t it be nice if we actually got along with Russia? Wouldn’t that be good?”
Alongside his eagerness to improve relations with Russia was Trump’s dismissal of NATO as an “obsolete” relic of the Cold War. In March 2016, he told Fox News, “We’re dealing with NATO from the days of the Soviet Union, which no longer exists.”
Putin was especially incensed over the U.S. role in instigating a street demonstration in December 2011 that cast aspersions on the integrity of Russian parliamentary elections and on Putin’s upcoming 2012 presidential bid. Decrying the use of hundreds of millions of dollars in “foreign money” to influence Russian politics, he accused Secretary of State Hillary Clinton of encouraging “mercenary” Kremlin protesters: “She set the tone for some actors in our country and gave them a signal. They heard this signal and with the support of the U.S. State Department began active work.” “Pouring foreign money into electoral processes is particularly unacceptable,” he said, anticipating what the Americans would accuse Russia of just a few years later, and stressed the “need to work out forms of protections of our sovereignty, defense against interference from outside.”
Clearly, the damage to U.S.-Russian and U.S.-Chinese relations that had begun under Obama would not be easily undone. Overall, Obama’s legacy is a mixed one. He was clearly a disappointment to the progressive backers who held such high hopes for him and the country when he defeated John McCain in 2008. While repudiating the worst aspects of the neoconservative hyper-militarism of the George W. Bush presidency, he never wavered in his support of American empire or broke as definitively as needed with Bush’s policies. On June 6, 2013, former Bush press secretary Ari Fleischer told CNN’s Anderson Cooper, “Across the board when you look at what he’s done, he’s continued so many of the Bush administration policies from drone strikes to military commissions to wiretaps to renditions to—you name it, he’s doing it. It’s like George Bush is having his fourth term. And I praise President Obama for it. Now, I think he’s a hypocrite. He . . . campaigned against President Bush, said it was a violation of the Constitution to do these things. But I think he’s learned.”
Fleischer may have felt vindicated by Obama’s policies, but American progressives felt betrayed. Perhaps the most egregious of all the betrayals came in Obama’s failure to substantially reduce the risk of nuclear war or even the size of America’s nuclear arsenal. He had thrilled the world in April 2009 with his Prague pledge to eliminate nuclear weapons, even winning the world’s greatest honor, the Nobel Peace Prize, largely in response to that promise. Given Obama’s long-held and apparently sincere opposition to nuclear weapons, dating at least back to his undergraduate days at Columbia in the early 1980s, the world anticipated some major progress in nuclear weapons reduction. Not only did people not see progress, the antinuclear movement lost ground under Obama. In 2010, the United States and Russia negotiated the New Strategic Arms Reduction Treaty, widely known as New START, which limited each nation to 1,550 deployed strategic nuclear warheads as of February 2018. But because bombers are counted as single warheads regardless of the number of bombs they carry, the actual number of warheads is around 2,000. The U.S. and Russia also have thousands of warheads in storage. In order to secure Senate approval of the New START Treaty, Obama struck a devil’s bargain with pronuclear lobbyists and their Republican allies. When Senate minority whip Jon Kyl (R-AZ) threatened to kill the treaty, Obama caved to his demands for a sweeping modernization of America’s nuclear arsenal and delivery systems, its bombers, missiles, and submarines, a massive project estimated initially to cost $1 trillion over three decades. The Congressional Budget Office raised that estimate to $1.2 trillion in October 2017. The true cost, with anticipated inflation, will certainly be significantly higher—more in the $1.7 trillion range.
But cost is the least of the problems. The modernization will make the weapons more accurate, more deadly, and, in the minds of many nuclear war planners, more usable. Reuters explained the changes succinctly: “The United States under Obama transformed its main hydrogen bomb into a guided smart weapon, made its submarine-launched nuclear missile five times more accurate, and gave its land-based missiles so many added features that the Air Force in 2012 described them as ‘basically new.’ To deliver these more lethal weapons, military contractors are building fleets of new heavy bombers and submarines.” While limiting the number of warheads and launch vehicles, the treaty lets each side determine design of delivery methods. “Thus,” Reuters noted, “both sides are increasing exponentially the killing power of these weapons, upgrading the delivery vehicles so that they are bigger, more accurate and equipped with dangerous new features.” In the authoritative Bulletin of the Atomic Scientists, three leading nuclear experts argued that improved targeting capability had tripled the killing power of U.S. ballistic missiles, laying the basis for a successful first strike.
While the hopes of those who had long dreamed of a world without nuclear weapons were crushed by Obama’s reversals, pronuclear forces were ecstatic. In 2015, Alabama congressman Mike Rogers addressed a weapons conference sponsored by Northrop Grumman, Lockheed Martin, and General Dynamics: “I think we can safely say the President’s Prague vision is dead,” he gloated. “And I’ll leave it to the Nobel committee to ask for its prize back.”
In May 2016, Hiroshima survivor Setsuko Thurlow, who delivered one of the two Nobel Peace Prize acceptance speeches in 2017 in recognition of her lifelong efforts to abolish nuclear weapons, expressed her disappointment with Obama’s lack of meaningful action: “The world waited and waited and waited but so far our problem continues and actually it’s worse. So we are, and I am, very disappointed . . . it’s a huge, huge disappointment for the world.”
Administration insiders were often as disappointed with Obama’s capitulation to the hardliners as were his outside critics. Former undersecretary of state for arms control Ellen Tauscher reported “a universal sense of frustration” and disillusionment among those who entered the administration expecting to limit nuclear weaponry.
In May 2016, with Obama en route to Hiroshima, the Federation of American Scientists (FAS) reported that he had actually reduced the nuclear arsenal at a slower pace than any of his three post-Cold War predecessors—the two Bushes and Bill Clinton. At that point, with a few more months of his presidency still to go, the U.S. had dismantled 702 warheads, or 13.3 percent of the arsenal. Clinton, by comparison, had reduced the arsenal by approximately 23 percent, George H. W. Bush by 41 percent, and George W. Bush by 50 percent. Hans Kristensen, the FAS analyst, noted that the fault was not entirely Obama’s, given the fierce opposition in Congress and a Russian government that had resisted additional reductions while New START was being implemented. Kristensen did, however, note that during the Obama presidency, Russia had reduced its stockpile by more than a thousand warheads.
Obama looked history in the eye and blinked. He assured his audience that what “makes our species unique” is our ability to learn from and not repeat the “mistakes of the past.” “We can tell our children a different story,” he declared, “one that describes a common humanity, one that makes war less likely and cruelty less easily accepted.”
Like so much of the Obama presidency, this speech was filled with high-minded sentiments and poetic imagery. But like so much of Obama’s eight years in office, the reality left much to be desired. Nothing rang more false and hollow than his hypocritical call for nuclear abolition at a time when he was committing the country to decades of modernization that would make nuclear weapons more, not less, usable. “We may not realize this goal [of a nuclear-free world] in my lifetime,” he granted, failing to state the obvious: as president, he could have striven mightily to realize that goal, as Mikhail Gorbachev had done. But he lacked the vision and courage to make that happen. He will be remembered as a decent, well-intentioned man, a man who lent dignity to the highest office in ways that no one had since Jack Kennedy, but a man who snatched mediocrity from the jaws of greatness and whose failures helped open the door for Donald Trump, who did everything he could to destroy what Obama had achieved.
A frighteningly revealing moment came while Trump was campaigning in Evansville, Indiana. Former basketball coach and Indiana icon Bobby Knight endorsed Trump, comparing him to Harry Truman: “I’ll tell you who they said wasn’t presidential . . . Harry Truman.” “And Harry Truman, with what he did in dropping and having the guts to drop the bomb in 1944 [sic] saved, saved millions of American lives. And that’s what Harry Truman did. And, he became one of the three great presidents of the United States. And here’s a man who would do the same thing because he's going to become one of the four great presidents of the United States.” Trump gushed, “Such a great guy. Wow. How do you top that? You should be very proud of him in Indiana. ... This is a national treasure, OK?”
Among those who might have indirectly influenced Trump’s thinking on nuclear weapons were his billionaire backers Robert Mercer, the conservative hedge fund manager, and his daughter Rebekah. The Mercers had championed some of the most odious members of the Trump administration, including Steve Bannon, Kellyanne Conway, Michael Flynn, and Jeff Sessions; helped found Cambridge Analytica; and funded the “alt-right” Breitbart News as well as the Oregon Institute of Science and Medicine, run by biochemist Arthur Robinson, who believes human urine holds the key to extending life and has collected fourteen thousand urine samples that he keeps refrigerated on his farm. Robinson has inveighed against climate change, calling it a “false religion,” and in 1998 organized a deceptive petition claiming the support of thirty thousand scientists who rejected the Kyoto Protocol and cast doubt upon man-made climate change. Upon closer examination, the thirty thousand included many falsified signatures and few if any bona fide climate scientists. Robinson convinced Mercer that nuclear war would not be as bad as advertised. In fact, he argued, outside the immediate blast zone, radiation would have a beneficial effect on human evolution—an insane theory that conservative physicist Edward Teller had espoused several decades earlier. In 1986, Robinson wrote a book arguing that the vast majority of Americans would survive “an all-out atomic attack on the United States.”
The Trump administration’s abysmal record on environmental issues has only made things worse. Trump’s first EPA head, Scott Pruitt, who denied evolution itself in 2005, became the administration’s poster boy for climate change denial and environmental degradation until he was forced to resign in July 2018 under a cloud of ethics scandals so vast that it staggers the imagination. But he was not alone. From Trump, who had once dismissed man-made climate change as a “hoax” created “by and for the Chinese,” on down, administration flat-earthers expressed hostility not only to climate science but to science in general. Trump was the first president since the position was created in 1941 to go a year and a half without naming a science advisor. Neither the State Department nor the Agriculture Department had appointed a science advisor, though science was crucial to both departments’ missions. The Interior Department and the National Oceanic and Atmospheric Administration went so far as to disband their science advisory committees. The Food and Drug Administration had done likewise with its Food Advisory Committee. Pruitt purged his agency’s Board of Scientific Counselors, dismissing members who wanted to take action to slow climate change. Finding it impossible to work in such a hostile environment, scientists were fleeing the federal government in droves. Princeton professor of geosciences and international affairs Michael Oppenheimer critically observed, “I don’t think there’s ever been a time in the post-World War II period where issues as important as nuclear weapons are on the table, and there is no serious scientist there to help the president through the thicket. This reverberates throughout policy.”
In April 2018, more than a thousand members of the National Academy of Sciences blasted Trump’s decision to withdraw from the Paris Climate Accord, calling attention to the “dangers of human-induced climate change,” which were already causing “suffering and economic loss” that would only get worse in the future.
Final Words
As bleak as things might seem, despair is an attitude we can’t afford. Taking the broad view of history and using a conceptual device that Carl Sagan popularized in his award-winning book The Dragons of Eden, telescoping the history of our universe into one 365-day calendar affords a different perspective. From this vantage point, the Big Bang, which occurred 13.8 billion years ago, is placed at 12:00 a.m. on January 1. Our galaxy, the Milky Way, had its start 11 billion years ago, on March 16. The Earth was formed 4.54 billion years ago, on September 6, and life on Earth began 4.1 billion years ago, on September 14. Human beings finally made their entrance on December 31, the last day of the year, at 10:30 p.m. Columbus’s “discovery” of America didn’t occur until 11:59:59 p.m. on December 31. Thus, in cosmic time, human existence has been incredibly brief. In a sense, we just learned to walk upright a minute and a half ago. In our fleeting moment on this stage, our species has created works of extraordinary beauty and grandeur. Yet, in that short time, we have also developed the means to end life on our planet. It can certainly be said that our technological proficiency far exceeds our moral, social, and political development. Therefore, our real task, as a species, in the period ahead is to survive our own worst and most destructive tendencies—to neither kill ourselves off with nuclear bombs nor destroy the earth we collectively inhabit. Certainly, we must struggle with all our might for peace and social justice. Those of us who have learned the lessons of history and can envision a fundamentally different and more humane way of organizing society, one that would allow people to live fuller, more prosperous, and peaceful lives, have a special responsibility to make our voices heard.
But if that better world is not to be for our generation, we must at least sustain the possibility that future generations—perhaps a hundred, perhaps a thousand, perhaps ten thousand years from now—will have the opportunity of realizing our unfulfilled dreams.
At the start of this journey, when we began the documentary film and book project, we dedicated it to our six children—biological, adopted, and step, of Asian, African, and European ancestry—and “the better world that they and all children deserve.” We end on the same note, affirming our faith in our often misguided, sometimes destructive, and occasionally exalted species to someday achieve that goal.