
Portrait of a President Living on Borrowed Time

Joseph Lelyveld, His Final Battle:

The Last Months of Franklin Roosevelt 

            During the last year and a half of his life, from mid-October 1943 to his death in Warm Springs, Georgia on April 12, 1945, Franklin D. Roosevelt’s presidential plate was full, even overflowing. He was grappling with winning history’s most devastating  war and structuring a lasting peace for the post-war global order, all the while tending to multiple domestic political demands. But Roosevelt spent much of this time out of public view in semi-convalescence, often in locations outside Washington, with limited contact with the outside world. Those who met the president, however, noticed a striking weight loss and described him with words like “listless,” “weary,” and “easily distracted.” We now know that Roosevelt had life-threatening high blood pressure, termed malignant hypertension, making him susceptible to a stroke or coronary attack at any moment. Roosevelt’s declining health was carefully shielded from the public and only rarely discussed directly, even within his inner circle. At the time, probably not more than a handful of doctors were aware of the full gravity of Roosevelt’s physical condition, and it is an open question whether Roosevelt himself was aware.

In His Final Battle: The Last Months of Franklin Roosevelt, Joseph Lelyveld, former executive editor of the New York Times, seeks to shed light upon, if not answer, this open question. Lelyveld suggests that the president likely was more aware than he let on of the implications of his declining physical condition. In a resourceful portrait of America’s longest serving president during his final year and a half, Lelyveld considers Roosevelt’s political activities against the backdrop of his health. The story is bookended by Roosevelt’s meetings to negotiate the post-war order with fellow wartime leaders Winston Churchill and Joseph Stalin, in Teheran in December 1943 and at Yalta in the Crimea in February 1945. Between the two meetings came Roosevelt’s 1944 decision to run for an unprecedented fourth term, a decision he reached just weeks prior to the Democratic National Convention that summer, and the ensuing campaign.

Lelyveld’s portrait of a president living on borrowed time emerges from an excruciatingly thin written record of Roosevelt’s medical condition. Roosevelt’s medical file disappeared without explanation from a safe at Bethesda Naval Hospital shortly after his death.   Unable to consider Roosevelt’s actual medical records, Lelyveld draws clues  concerning his physical condition from the diary of Margaret “Daisy” Suckley, discovered after Suckley’s death in 1991 at age 100, and made public in 1995. The slim written record on Roosevelt’s medical condition limits Lelyveld’s ability to tease out conclusions on the extent to which that condition may have undermined his job performance in his final months.

* * *

            Daisy Suckley, a distant cousin of Roosevelt, was a constant presence in the president’s life in his final years and a keen observer of his physical condition. During Roosevelt’s last months, the “worshipful” (p.3) and “singularly undemanding” Suckley had become what Lelyveld terms the “Boswell of [Roosevelt’s] rambling ruminations,” secretly recording in an “uncritical, disjointed way the hopes and daydreams” that occupied the frequently inscrutable president (p.75). By 1944, Lelyveld notes, there was “scarcely a page in Daisy’s diary without some allusion to how the president looks or feels” (p.77). Lelyveld relies heavily upon the Suckley diary out of necessity, given the disappearance of Roosevelt’s actual medical records after his death.

Lelyveld attributes the disappearance to Admiral Ross McIntire, an ear, nose, and throat specialist who served both as Roosevelt’s personal physician and Surgeon General of the Navy. In the latter capacity, McIntire oversaw a wartime staff of 175,000 doctors, nurses and orderlies at 330 hospitals and medical stations around the world. Earlier in McIntire’s tenure, Roosevelt’s press secretary had upbraided him for allowing the president to be photographed in his wheelchair. From that point forward, McIntire understood that a major component of his job was to conceal Roosevelt’s physical infirmities and to protect and promote a vigorously healthy public image of the president. The “resolutely upbeat” (p.212) McIntire, a master of “soothing, well-practiced bromides” (p.226), thus assumes a role in Lelyveld’s account which seems as much “spin doctor” as actual doctor. His most frequent message for the public was that the president was in “robust health” (p.22), in the process of “getting over” a wide range of lesser ailments such as a heavy cold, flu, or bronchitis.

A key turning point in Lelyveld’s story occurred in mid-March 1944, 13 months prior to Roosevelt’s death, when the president’s daughter Anna Roosevelt Boettiger confronted McIntire and demanded to know more about what was wrong with her father. McIntire doled out his “standard bromides, but this time they didn’t go down” (p.23). Anna later said that she “didn’t think McIntire was an internist who really knew what he was talking about” (p.93). In response, however, McIntire brought in Dr. Howard Bruenn, the Navy’s top cardiologist. Evidently, Lelyveld writes, McIntire had “known all along where the problem was to be found” (p.23). Bruenn was apparently the first cardiologist to have examined Roosevelt.

McIntire promised to have Roosevelt’s medical records delivered to Bruenn prior to his initial examination of the president, but failed to do so, an “extraordinary lapse” (p.98) which Lelyveld regards as additional evidence that McIntire was responsible for the disappearance of those records after Roosevelt’s death the following year. Bruenn found that Roosevelt was suffering from “acute congestive heart failure” (p.98). He recommended that the wartime president avoid “irritation,” severely cut back his work hours, rest more, and reduce his smoking habit, then a daily pack and a half of Camel cigarettes. In the midst of the country’s struggle to defeat Nazi Germany and imperial Japan, its leader was told that he “needed to sleep half his time and reduce his workload to that of a bank teller” (p.99), Lelyveld wryly notes. Dr. Bruenn saw the president regularly from that point onward, traveling with him to Yalta in February 1945 and to Warm Springs in April of that year.

Ten days after Dr. Bruenn’s diagnosis, Roosevelt told a newspaper columnist, “I don’t work so hard any more. I’ve got this thing simplified . . . I imagine I don’t work as many hours a week as you do” (p.103). The president, Lelyveld concludes, “seems to have processed the admonition of the physicians – however it was delivered, bluntly or softly – and to be well on the way to convincing himself that if he could survive in his office by limiting his daily expenditure of energy, it was his duty to do so” (p.103).

At that time, Roosevelt had not indicated publicly whether he wished to seek a fourth presidential term and had not discussed this question with any of his advisors. Moreover, with the “most destructive military struggle in history approaching its climax, there was no one in the White House, or his party, or the whole of political Washington, who dared stand before him in the early months of 1944 and ask face-to-face for a clear answer to the question of whether he could contemplate stepping down” (p.3). The hard if unspoken political truth was that Roosevelt was the Democratic party’s only hope to retain the White House. There was no viable successor in the party’s ranks. But his re-election was far from assured, and public airing of concerns about his health would be unhelpful, to say the least, in his re-election bid. Roosevelt did not make his actual decision to run until just weeks before the 1944 Democratic National Convention in Chicago.

At the convention, Roosevelt’s then vice-president, Henry Wallace, and his counselors Harry Hopkins and Jimmy Byrnes jockeyed for the vice-presidential nomination, along with William Douglas, already a Supreme Court justice at age 45. There’s no indication that Senator Harry S. Truman actively sought to be Roosevelt’s running mate. Lelyveld writes that it is a tribute to FDR’s “wiliness” that the notion has persisted over the years that he was “only fleetingly engaged in the selection” of his 1944 vice-president and that he was “simply oblivious when it came to the larger question of succession” (p.172). To the contrary, although he may not have used the word “succession” in connection with his vice-presidential choice, Roosevelt “cared enough about qualifications for the presidency to eliminate Wallace as a possibility and keep Byrnes’s hopes alive to the last moment, when, for the sake of party unity, he returned to Harry Truman as the safe choice” (p.172-73).

Having settled upon Truman as his running mate, Roosevelt indicated that he did not want to campaign as usual because the war was too important. But campaign he did, and Lelyveld shows how hard he campaigned – and how hard it was for him given his deteriorating health, which aggravated his mobility problems. The outcome was in doubt up until Election Day, but Roosevelt was resoundingly reelected to a fourth presidential term. The president could then turn his full attention to the war effort, focusing both upon how the war would be won and how the peace would be structured. Roosevelt’s foremost priority was structuring the peace; the details on winning the war were largely left to his staff and to the military commanders in the field.

Roosevelt badly wanted to avoid the mistakes that Woodrow Wilson had made after World War I. He was putting together the pieces of an organization already referred to as the United Nations and fervently sought  the participation and support of his war ally, the Soviet Union. He also wanted Soviet support for the war against Japan in the Pacific after the Nazi surrender, and for an independent and democratic Poland. In pursuit of these objectives, Roosevelt agreed to travel over 10,000 arduous miles to Yalta, to meet in February 1945 with Stalin and Churchill.

In Roosevelt’s mind, Stalin was by then the key both to victory on the battlefield and to a lasting peace afterwards — and he was, in Roosevelt’s phrase, “get-at-able” (p.28) with the right doses of the legendary Roosevelt charm. Roosevelt had begun his serious courtship of the Soviet leader at their first meeting in Teheran in December 1943. His fixation on Stalin, “crossing over now and then into realms of fantasy” (p.28), continued at Yalta. Lelyveld’s treatment of Roosevelt at Yalta covers similar ground to that in Michael Dobbs’ Six Months That Shook the World, reviewed here in April 2015. In Lelyveld’s account, as in that of Dobbs, a mentally and physically exhausted Roosevelt at Yalta ignored the briefing books his staff prepared for him and relied instead upon improvisation and his political instincts, fully confident that he could win over Stalin by force of personality.

According to cardiologist Bruenn’s memoir, published a quarter of a century later, early in the conference Roosevelt showed worrying signs of oxygen deficiency in his blood. His habitually high blood pressure readings revealed a dangerous condition, pulsus alternans, in which every second heartbeat was weaker than the preceding one, a “warning signal from an overworked heart” (p.270).   Dr. Bruenn ordered Roosevelt to curtail his activities in the midst of the conference. Churchill’s physician, Lord Moran, wrote that Roosevelt had “all the symptoms of hardening of arteries in the brain” during the conference and gave the president “only a few months to live” (p.270-71). Churchill himself commented that his wartime ally “really was a pale reflection almost throughout” (p.270) the Yalta conference.

Yet, Roosevelt recovered sufficiently to return home from the conference and address Congress and the public on its results, plausibly claiming victory. The Soviet Union had agreed to participate in the United Nations and in the war in Asia, and to hold what could be construed as free elections in Poland. Had he lived longer, Roosevelt would have seen that Stalin delivered as promised on the Asian war. The Soviet Union also became a member of the United Nations and maintained its membership until the Soviet state itself dissolved in 1991, but it was rarely if ever the partner Roosevelt envisioned in keeping world peace. The possibility of a democratic Poland, “by far the knottiest and most time-consuming issue Roosevelt confronted at Yalta” (p.285), was by contrast slipping away even before Roosevelt’s death.

At one point in his remaining weeks, Roosevelt exclaimed, “We can’t do business with Stalin. He has broken every one of the promises he made at Yalta” on Poland (p.304; Dobbs includes the same quotation, adding that Roosevelt thumped on his wheelchair at the time of this outburst). But, like Dobbs, Lelyveld argues that even a more physically fit, fully focused and coldly realistic Roosevelt would likely have been unable to save Poland from Soviet clutches. When the allies met at Yalta, Stalin’s Red Army was in the process of consolidating military control over almost all of Polish territory.  If Roosevelt had been at the peak of vigor, Lelyveld concludes, the results on Poland “would have been much the same” (p.287).

Roosevelt was still trying to mend fences with Stalin on April 11, 1945, the day before his death in Warm Springs. Throughout the following morning, Roosevelt worked on matters of state: he received an update on the US military advances within Germany and even signed a bill sustaining the Commodity Credit Corporation. Then, just before lunch, Roosevelt collapsed. Dr. Bruenn arrived about 15 minutes later and diagnosed a cerebral hemorrhage, a stroke likely caused by the bursting of a blood vessel or the rupture of an aneurysm. “Roosevelt was doomed from the instant he was stricken” (p.323). Around midnight, Daisy Suckley recorded in her diary that the president had died at 3:35 pm that afternoon. “Franklin D. Roosevelt, the hope of the world, is dead” (p.324), she wrote.

Daisy was one of several women present at Warm Springs to provide company to the president during his final visit. Another was Eleanor Roosevelt’s former secretary, Lucy Mercer Rutherford, by this time the primary Other Woman in the president’s life. Rutherford had driven down from South Carolina to be with the president, part of a recurring pattern in which Rutherford appeared when Eleanor was absent, as if coordinated by a social secretary with the knowing consent of all concerned. But this orchestration broke down in Warm Springs in April 1945. After the president died, Rutherford had to flee in haste to make room for Eleanor. Still another woman in the president’s entourage, loquacious cousin Laura Delano, compounded Eleanor’s grief by letting her know that Rutherford had been in Warm Springs for the previous three days, adding gratuitously that Rutherford had also served as hostess at occasions at the White House when Eleanor was away. “Grief and bitter fury were folded tightly in a large knot” (p.325) for the newly widowed First Lady at Warm Springs.

Subsequently, Admiral McIntire asserted that Roosevelt had a “stout heart” and that his blood pressure was “not alarming at any time” (p.324-25), implying that the president’s death from a stroke had proven that McIntire had “always been right to downplay any suggestion that the president might have heart disease.” If not a flat-out falsehood, Lelyveld argues, McIntire’s assertion “at least raises the question of what it would have taken to alarm him” (p.325). Roosevelt’s medical file by this time had gone missing from the safe at Bethesda Naval Hospital, most likely removed by the Admiral because it would have revealed the “emptiness of the reassurances he’d fed the press and the public over the years, whenever questions arose about the president’s health” (p.325).

* * *

           Lelyveld declines to engage in what he terms an “argument without end” (p.92) on the degree to which Roosevelt’s deteriorating health impaired his job performance during his last months and final days. Rather, he  skillfully pieces together the limited historical record of Roosevelt’s medical condition to add new insights into the ailing but ever enigmatic president as he led his country nearly to the end of history’s most devastating war.

 

Thomas H. Peebles

La Châtaigneraie, France

March 28, 2017

 

 

 


High Point of Modern International Economic Diplomacy

Ed Conway, The Summit: Bretton Woods 1944,

J.M. Keynes and the Reshaping of the Global Economy 

               During the first three weeks of July 1944, as World War II raged on the far sides of the Atlantic and Pacific oceans, 730 delegates from 44 countries gathered at the Mount Washington Hotel in northern New Hampshire for what has come to be known as the Bretton Woods conference. The conference’s objective was audacious: create a new and more stable framework for the post-World War II monetary order, with the hope of avoiding future economic upheavals like the Great Depression of the 1930s. To this end, the delegates reconsidered and in many cases rewrote some of the most basic rules of international finance and global capitalism, such as how money should flow between sovereign states, how exchange rates should interact, and how central banks should set interest rates. The venerable but aging hotel sat in an area informally known as Bretton Woods, not far from Mount Washington itself, the highest peak in the northeastern United States.

In The Summit, Bretton Woods, 1944: J.M. Keynes and the Reshaping of the Global Economy, Ed Conway, formerly economics editor for Britain’s Daily Telegraph and Sunday Telegraph and presently economics editor for Sky News, provides new and fascinating detail about the conference. The word “summit” in his title carries a triple sense: it refers to Mount Washington, to the term that came into use in the following decade for a meeting of international leaders, and, in Conway’s contention, to a third kind of high point. The conference marked the “only time countries ever came together to remold the world’s monetary system” (p.xx). It stands in history as the “very highest point of modern international economic diplomacy” (p.xxv).

Conway differentiates his work from others on Bretton Woods by focusing on the interactions among the delegates and the “sheer human drama” (p.xxii) of the event. As the subtitle indicates, British economist John Maynard Keynes is foremost among these delegates. Conway could have added to his subtitle the lesser-known Harry Dexter White, Chief International Economist at the US Treasury Department and deputy to Treasury Secretary Henry Morgenthau, who headed the US delegation and served as the conference’s formal president. White’s name in the subtitle would have underscored that this book is a story about the relationship between the two men who assumed de facto leadership of the conference. But the book is also a story about the uneasy relationship at Bretton Woods between the United States and the United Kingdom, the conference’s two lead delegations.

Although allies in the fight against Nazi Germany, the two countries were far from allies at Bretton Woods.  Great Britain, one of the world’s most indebted nations, came to the conference unable to pay for its own defense in the war against Nazi Germany and unable to protect and preserve its vast worldwide empire.  It was utterly outmatched at Bretton Woods by an already dominant United States, its principal creditor, which had little interest in providing debt relief to Britain or helping it maintain an empire. Even the force of Keynes’ dominating personality was insufficient to give Britain much more than a supplicant’s role at Bretton Woods.

Conway’s book also constitutes a useful and understandable historical overview of the international monetary order from pre-World War I days up to Bretton Woods and beyond.  The overview revolves around the gold standard as a basis for international currency exchanges and attempts over the years to find workable alternatives. Bretton Woods produced such an alternative, a standard pegged to the United States dollar — which, paradoxically, was itself tied to the price of gold.  Bretton Woods also produced two key institutions, the International Monetary Fund (IMF) and the International Bank for Reconstruction and Development, now known as the World Bank, designed to provide stability to the new economic order. But the Bretton Woods dollar standard remained in effect only until 1971, when US President Richard Nixon severed by presidential fiat the link between the dollar and gold, allowing currency values to float, as they had done in the 1930s.  In Conway’s view, the demise of Bretton Woods is to be regretted.

* * *

          Keynes was a legendary figure when he arrived at Bretton Woods in July 1944, a “genuine international celebrity, the only household name at Bretton Woods” (p.xv). Educated at King’s College, Cambridge, a member of the faculty of that august institution, and a peer in Britain’s House of Lords, Keynes was also a highly skilled writer and journalist, as well as a fearsome debater. As a young man, he established his reputation with a famous critique of the 1919 Versailles Treaty, The Economic Consequences of the Peace, a tract that predicted with eerie accuracy the breakdown of the financial order that the post-World War I treaty envisioned, based upon imposition of punitive reparations upon Germany. Although Keynes dazzled fellow delegates at Bretton Woods with his rhetorical brilliance, he was given to outlandish and provocative statements that hardly helped the bonhomie of the conference. He suffered a heart attack toward the end of the conference and died less than two years later.

White was a contrast to Keynes in just about every way. He came from a modest first-generation Jewish immigrant family from Boston and had to scramble for his education. Unusual for the time, in his 30s White earned an undergraduate degree from Stanford after having spent the better portion of a decade as a social worker. White had a dour personality, with none of Keynes’ flamboyance. Then there were the physical differences. Keynes stood about six feet six inches tall (approximately 2.0 meters), whereas White was at least a foot shorter (approximately 1.7 meters). But if Keynes was the marquee star of Bretton Woods because of his personality and reputation, White was its driving force because he represented the United States, undisputedly the conference’s dominant power.

By the time of the Bretton Woods conference, however, White was also unduly familiar with Russian intelligence services. Although Conway hesitates to slap the “spy” label on him, there is little doubt that White provided a hefty amount of information to the Soviets, both at the conference and outside its confines. Of course, much of the “information sharing” took place during World War II, when the Soviet Union was allied with Britain and the United States in the fight against Nazi Germany and such sharing was seen in a different light than in the subsequent Cold War era.  One possibility, Conway speculates, was that White was “merely carrying out his own, personal form of diplomacy – unaware that the Soviets were construing this as espionage” (p.159; the Soviet Union attended the conference but did not join the international mechanisms which the conference established).

The reality, Conway concludes, is that we will “never know for certain whether White knowingly betrayed his country by passing information to the Soviets” (p.362). Critically, there is “no evidence that White’s Soviet activities undermined the Bretton Woods agreement itself” (p.163). White died in 1948, four years after the conference, and the FBI’s case against him became moot. From that point onward, the question whether White was a spy for the Soviet Union became one almost exclusively for historians, a question that today remains unresolved (ironically, after White’s death, young Congressman Richard Nixon remained just about the only public official still interested in White’s case; when Nixon became president two decades later, he terminated the Bretton Woods financial standards White had helped create).

The conference itself begins at about the book’s halfway point. Prior to his account of its deliberations, Conway shows how the gold standard operated and traces the search for workable alternatives. In the period up to World War I, the world’s powers guaranteed that they could redeem their currency for its value in gold. The World War I belligerents went off the gold standard so they could print the currency needed to pay for their war costs, causing hyperinflation, as the supply of money overwhelmed the demand. In the 1920s, countries gradually returned to the gold standard.

But the stock market crash of 1929 and ensuing depression prompted countries to again abandon the gold standard. In the 1930s, what Conway terms a “gold exchange standard” prevailed, in which governments undertook competitive devaluations of their currency. President Franklin Roosevelt, for example, used a “primitive scheme” to set the dollar “where he wanted it – which meant as low against the [British] pound as possible” (p.83).  The competitive devaluations and floating rates of the 1930s led to restrictive trade policies, discouraged trade and investment, and encouraged destabilizing speculation, all of which many economists linked to the devastating war that broke out across the globe at the end of the decade.

Bretton Woods sought to eliminate these disruptions for the post-war world by crafting an international monetary system based upon cooperation among the world’s sovereign states. The conference was preceded by nearly two years of negotiations between the Treasury Departments of Great Britain and the United States — essentially exchanges between Keynes and White, each with a plan on how a new international monetary order should operate. Both were “determined to use the conference to safeguard their own economies” (p.18). Keynes wanted to protect not only the British Empire but also London’s place as the center of international finance. White saw little need to protect the empire and foresaw New York as the world’s new economic hub.  He also wanted to locate the two institutions that Bretton Woods would create, the IMF and World Bank, in the United States, whereas Keynes hoped that at least one would be located either in Britain or on the European continent. White and the Americans would win on these and almost all other points of difference.

But Keynes and White shared a broad general vision that Bretton Woods should produce a system designed to do away with the worst effects of both the gold standard and the interwar years of instability and depression.   There needed to be something in between the rigidity associated with the gold standard on the one hand and free-floating currencies, which were “associated with dangerous flows of ‘hot money’ and inescapable lurches in exchange rates” (p.124), on the other. To White and the American delegation, “Bretton Woods needed to look as similar as possible to the gold standard: politicians’ hands should be tied to prevent them from inflating away their debts. It was essential to avoid the threat of the competitive devaluations that had wreaked such havoc in the 1930s” (p.171).  For Keynes and his colleagues, “Bretton Woods should be about ensuring stable world trade – without the rigidity of the gold standard” (p.171).

The British and American delegations met in Atlantic City in June 1944 in an attempt to narrow their differences before travelling to northern New Hampshire, where the floor would be opened to the conference’s additional delegations. Press coverage of Bretton Woods was largely confined to the business pages of the newspapers, with public attention focused on the war effort and President Roosevelt’s re-election bid for a fourth presidential term. This suited White, who “wanted the conference to look as uncontroversial, technical and boring as possible” (p.203). The conference was split into three main parts. White chaired Commission I, dealing with the IMF, while Keynes chaired Commission II, whose focus was the World Bank. Each commission divided into multiple committees and sub-committees. Commission III, whose formal title was “Other Means of International Cooperation,” was in Conway’s view essentially a “toxic waste dump into which White and Keynes could jettison some of the summit’s trickier issues” (p.216).

The core principle to emerge from the Bretton Woods deliberations was that the world’s currencies, rather than being tied directly to gold or allowed to float, would be pegged to the US dollar, which, in turn, was tied to gold at a value of $35 per ounce. Keynes and White anticipated that fixing currencies against the dollar would ensure that:

international trade was protected [from] exchange rate risk. Nations would determine their own interest rates for purely domestic economic reasons, whereas under the gold standard, rates had been set primarily in order to keep the country’s gold stocks at an acceptable level. Countries would be allowed to devalue their currency if they became uncompetitive – but they would have to notify the International Monetary Fund in advance: this element of international co-ordination was intended to guard against a repeat of the 1930s spiral of competitive devaluation (p.369).

 

The IMF’s primary purpose under the Bretton Woods framework was to provide relief in balance of payments crises such as those of the 1930s, when countries in deficit were unable to borrow and exporting countries failed to find markets for their goods. “Rather than leaving the market to its own devices – the laissez-faire strategy discredited in the Depression – the Fund would be able to step in and lend countries money, crucially in whichever currency they most needed. So as to avoid the threat of competitive devaluations, the Fund would also arbitrate whether a country could devalue its exchange rate” (p.169).

One of the most sensitive issues in structuring the IMF involved the contributions that each country was required to pay into the Fund, termed “quotas.” When short of reserves, each member state would be entitled to borrow needed foreign currency in amounts determined by the size of its quota.  Most countries wanted to contribute more rather than less, both as a matter of national pride and as a means to gain future leverage with the Fund. Heated quota battles ensued “both publicly in the conference rooms and privately in the hotel corridors, until the very end of the proceedings” (p.222-23), with the United States ultimately determining quota amounts according to a process most delegations considered opaque and secretive.

The World Bank, almost an afterthought at the conference, was to have the power to finance reconstruction in Europe and elsewhere after the war. But the Marshall Plan, an “extraordinary program of aid devoted to shoring up Europe’s economy” (p.357), upended Bretton Woods’ visions for both institutions for nearly a decade. It was the Marshall Plan that rebuilt Europe in the post-war years, not the IMF or the World Bank. The Fund’s main role in its initial years, Conway notes, was to funnel money to member countries “as a stop-gap before their Marshall Plan aid arrived” (p.357).

When Harry Truman became president in April 1945 after Roosevelt’s death, he replaced Roosevelt’s Treasury Secretary Henry Morgenthau, White’s boss, with future Supreme Court justice Fred Vinson. Never a fan of White, Vinson diminished his role at Treasury, and White left the department in 1947. He died the following year, in August 1948, at age 55. Although the August 1945 change of British prime minister from Winston Churchill to Clement Attlee did not undermine Keynes to the same extent, his deteriorating health diminished his role after Bretton Woods as well. Keynes died in April 1946 at age 62, shortly after returning to Britain from the inaugural IMF meeting in Savannah, Georgia, his last encounter with White.

Throughout the 1950s, the US dollar assumed a “new degree of hegemony,” becoming “formally equivalent to gold. So when they sought to bolster their foreign exchange reserves to protect them from future crises, foreign governments built up large reserves of dollars” (p.374). But with more dollars in the world economy, the United States found it increasingly difficult to convert them back into gold at the official exchange rate of $35 per ounce.  When Richard Nixon became president in 1969, the United States held $10.5 billion in gold, but foreign governments had $40 billion in dollar reserves, and foreign investors and corporations held another $30 billion. The world’s monetary system had become, once again, an “inverted pyramid of paper money perched on a static stack of gold” and Bretton Woods was “buckling so badly it seemed almost certain to collapse” (p.377).

In a single secluded weekend in 1971 at the presidential retreat at Camp David, Maryland, Nixon’s advisors fashioned a plan to “close the gold window”: the United States would no longer provide gold to official foreign holders of dollars and instead would impose “aggressive new surcharges and taxes on imports intended to push other countries into revaluing their own currencies” (p.381). When Nixon agreed to his advisors’ proposal, the Bretton Woods system, which had “begun with fanfare, an unprecedented series of conferences and the deepest investigation in history into the state of macro-economics,” ended overnight, “without almost anyone realizing it” (p.385). The era of fixed exchange rates was over, with currency values henceforth to be determined by “what traders and investors thought they were worth” (p.392). Since 1971, the world’s monetary system has operated on what Conway describes as an “ad hoc basis, with no particular sense of the direction in which to follow” (p.401).

* * *

            In his epilogue, Conway cites a 2011 Bank of England study that showed that between 1948 and the early 1970s, the world enjoyed a “period of economic growth and stability that has never been rivaled – before or since” (p.388).  In Bretton Woods member states during this period “life expectancy climbed swiftly higher, inequality fell, and social welfare systems were constructed which, for the time being at least, seemed eminently affordable” (p.388).  The “imperfect” and “short-lived” (p.406) system which Keynes and White fashioned at Bretton Woods may not be the full explanation for these developments but it surely contributed.  In the messy world of international economics, that system has “come to represent something hopeful, something closer to perfection” (p.408).  The two men at the center of this captivating story came to Bretton Woods intent upon repairing the world’s economic system and replacing it with something better — something that might avert future economic depressions and the resort to war to settle differences.  “For a time,” Conway concludes, “they succeeded” (p.408).

Thomas H. Peebles

La Châtaigneraie, France

March 8, 2017


Do Something


Zachary Kaufman, United States Law and Policy on Transitional Justice:

Principles, Politics, and Pragmatics 

             The term “transitional justice” is applied most frequently to “post conflict” situations, where a nation state or region is emerging from some type of war or violent conflict that has given rise to genocide, war crimes, or crimes against humanity — each now a recognized concept under international law, with “mass atrocities” being a common shorthand used to embrace these and related concepts. In United States Law and Policy on Transitional Justice: Principles, Politics, and Pragmatics, Zachary Kaufman, a Senior Fellow and expert on human rights at Harvard University’s Kennedy School of Government, explores the circumstances which have led the United States to support that portion of the transitional justice process that determines how to deal with suspected perpetrators of mass atrocities, and why it chooses a particular means of support (disclosure: Kaufman and I worked together in the US Department of Justice’s overseas assistance unit between 2000 and 2002, although we had different portfolios: Kaufman’s involved Africa and the Middle East, while I handled Central and Eastern Europe).

          Kaufman’s book, adapted from his Oxford University PhD dissertation, centers on case studies of the United States’ role in four major transitional justice situations: Germany and Japan after World War II, and ex-Yugoslavia and Rwanda in the 1990s, after the end of the Cold War. It also looks more briefly at two secondary cases, the 1988 bombing of Pan American flight 103, attributed to Libyan nationals, and atrocities committed during Iraq’s 1990-91 occupation of Kuwait. Making extensive use of internal US government documents, many of which have been declassified, Kaufman digs deeply into the thought processes that informed the United States’ decisions on transitional justice in these six post-conflict situations. Kaufman brings a social science perspective to his work, attempting to tease out of the case studies general rules about how the United States might act in future transitional justice situations.

          The term “transitional justice” implicitly affirms that a permanent and independent national justice system can and should be created or restored in the post-conflict state.  Kaufman notes at one point that dealing with suspected perpetrators of mass atrocities is just one of several critical tasks involved in creating or restoring a permanent national justice system in a post-conflict state.  Others can include: building or rebuilding sustainable judicial institutions, strengthening the post-conflict state’s legislation, improving capacity of its justice-sector personnel, and creating or upgrading the physical infrastructure needed for a functioning justice system. These latter tasks are not the focus of Kaufman’s work. Moreover, in determining how to deal with alleged perpetrators of mass atrocities, Kaufman’s focus is on the front end of the process: how and why the United States determined to support this portion of the process generally and why it chose particular mechanisms rather than others.   The outcomes that the mechanisms produce, although mentioned briefly, are not his focus either.

          In each of the four primary cases, the United States joined other nations to prosecute those accused or suspected of involvement in mass atrocities before an international criminal tribunal, which Kaufman characterizes as the “most significant type of transitional justice institution” (p.12). Prosecution before an international tribunal, he notes, can promote stability, the rule of law and accountability, and can serve as a deterrent to future atrocities. But the process can be both slow and expensive, with significant political and legal risks. Kaufman’s work provides a useful reminder that prosecution by an international tribunal is far from the only option available to deal with alleged perpetrators of mass atrocities. Others include trials in other jurisdictions, including those of the post-conflict state, and several non-judicial alternatives: amnesty for those suspected of committing mass atrocities, with or without conditions; “lustration,” where suspected persons are disenfranchised from specific aspects of civic life (e.g., declared ineligible for the civil service or the military); and “doing nothing,” which Kaufman considers tantamount to unconditional amnesty. Finally, there is the option of summary execution or other punishment, without benefit of trial. These options can be applied in combination, e.g., amnesty for some, trial for others.

         Kaufman weighs two models, “legalism” and “prudentialism,” as potential explanations for why and how the United States acted in the cases under study and is likely to act in the future. Legalism contends that prosecution before an international tribunal of individuals suspected or accused of mass atrocities is the only option a liberal democratic state may elect, consistent with its adherence to the rule of law. In limited cases, amnesty or lustration may be justified as a supplement to initiating cases before a tribunal. Summary execution may never be justified. Prudentialism is more ad hoc and flexible, with the question whether to establish or invoke an international criminal tribunal or pursue other options determined by any number of different political, pragmatic and normative considerations, including such geo-political factors as promotion of stability in the post-conflict state and region, the determining state or states’ own national security interests, and the relationships between determining states. Almost by definition, legalism precludes consideration of these factors.

          Kaufman presents his cases in a highly systematic manner, with tight overall organization. An introduction and three initial chapters set forth the conceptual framework for the subsequent case studies, addressing matters like methodology and definitional parameters. The four major cases are then treated in four separate chapters, each with its own introduction and conclusion, followed by an overall conclusion, also with its own introduction and conclusion (the two secondary cases, Libya and Iraq, are treated within the chapter on ex-Yugoslavia). Substantive headings throughout each chapter make his arguments easy to follow. General readers may find jarring his extensive use of acronyms throughout the text, drawn from a three-page list contained at the outset. But amidst Kaufman’s deeply analytical exploration of the thinking that lay behind the United States’ actions, readers will appreciate his decidedly non-sociological hypothesis as to why the United States elects to engage in the transitional justice process: a deeply felt American need in the wake of mass atrocities to “do something” (always in quotation marks).

* * *

          Kaufman begins his case studies with the best-known example of transitional justice, Nazi Germany after World War II. The United States supported creation of what has come to be known as the Nuremberg War Crimes tribunal, a military court administered by the four victorious allies, the United States, Soviet Union, Great Britain and France. The Nuremberg story is so well known, thanks in part to “Judgment at Nuremberg,” the best-selling book and popular film, that most readers will assume that the multi-lateral Nuremberg trials were the only option seriously under consideration at the time. To the contrary, Kaufman demonstrates that such trials were far from the only option on the table.

        For a while the United States seriously considered summary executions of accused Nazi leaders. British Prime Minister Winston Churchill pushed this option during wartime deliberations and, Kaufman indicates, President Roosevelt seemed at times on the cusp of agreeing to it. Equally surprisingly, Soviet Union leader Joseph Stalin lobbied early and hard for a trial process rather than summary executions. The Nuremberg Tribunal “might not have been created without Stalin’s early, constant, and forceful lobbying” (p.89), Kaufman contends.  Roosevelt abandoned his preference for summary executions after economic aspects of the Morgenthau Plan, which involved the “pastoralization” of Germany, were leaked to the press. When the American public “expressed its outrage at treating Germany so harshly through a form of economic sanctions,” Roosevelt concluded that Americans would be “unsupportive of severe treatment for the Germans through summary execution” (p.85).

           But the United States’ support for war crimes trials became unwavering only after Roosevelt died in April 1945 and Harry S. Truman assumed the presidency. The details and mechanics of a multi-lateral trial process were not worked out until early August 1945 in the “London Agreement,” after Churchill had been voted out of office and Labour Prime Minister Clement Attlee represented Britain. Trials against 22 high-level Nazi officials began in November 1945, with verdicts rendered in October 1946: twelve defendants were sentenced to death, seven drew prison sentences, and three were acquitted.

        Many lower-level Nazi officials were tried in unilateral prosecutions by one of the allied powers. Lustration, barring active Nazi party members from major public and private positions, was applied in the US, British, and Soviet sectors. Numerous high-level Nazi officials were allowed to emigrate to the United States to assist in Cold War endeavors, which Kaufman characterizes as a “conditional amnesty” (Nazi war criminals who emigrated to the United States are the subject of Eric Lichtblau’s The Nazis Next Door: How America Became a Safe Haven for Hitler’s Men, reviewed here in October 2015; Frederick Taylor’s Exorcising Hitler: The Occupation and Denazification of Germany, reviewed here in December 2012, addresses more generally the manner in which the Allies dealt with lower-level Nazi officials). By 1949, the Cold War between the Soviet Union and the West undermined the allies’ appetite for prosecution, with the Korean War completing the process of diverting the world’s attention away from Nazi war criminals.

           The story behind creation of the International Military Tribunal for the Far East, designed to hold accountable accused Japanese perpetrators of mass atrocities, is far less known than that of Nuremberg, Kaufman observes. What has come to be known as the “Tokyo Tribunal” largely followed the Nuremberg model, with some modifications. Even though 11 allies were involved, the United States came closer to being the sole decision-maker on the options to pursue in Japan than it had been in Germany. As the lead occupier of post-war Japan, the United States had “no choice but to ‘do something’” (p.119). Only the United States had both the means and will to oversee the post-conflict occupation and administration of Japan. That oversight authority was vested largely in a single individual, General Douglas MacArthur, Supreme Commander of the Allied forces, whose extraordinarily broad – nearly dictatorial – authority in post-World War II Japan extended to the transitional justice process. MacArthur approved appointments to the tribunal, signed off on its indictments, and exercised review authority over its decisions.

             In the interest of securing the stability of post-war Japan, the United States accorded unconditional amnesty to Japan’s Emperor Hirohito. The Tokyo Tribunal indicted twenty-eight high-level Japanese officials, but more than fifty other suspects were not indicted and thus also benefited from an unconditional amnesty. Those spared included many suspected of “direct involvement in some of the most horrific crimes of WWII” (p.108), several of whom eventually returned to Japanese politics. Through lustration, more than 200,000 Japanese were removed or barred from public office, either permanently or temporarily. As in Germany, by the late 1940s the emerging Cold War with the Soviet Union had chilled the United States’ enthusiasm for prosecuting Japanese suspected of war crimes.

            The next major United States engagements in transitional justice arose in the 1990s, when the former Yugoslavia collapsed into a spasm of ethnic violence and when massive ethnic-based genocide erupted in Rwanda in 1994. By this time, the Soviet Union had itself collapsed and the Cold War was over. In both instances, heavy United States involvement in the post-conflict process was attributed in part to a sense of remorse for its lack of involvement in the conflicts themselves and its failure to halt the ethnic violence, resulting in a need to “do something.” Rwanda marks the only instance among the four primary cases where mass atrocities arose out of an internal conflict.

        The ethnic conflicts in Yugoslavia led to the creation of the International Criminal Tribunal for the former Yugoslavia (ICTY), based in The Hague and administered under the auspices of the United Nations Security Council. Kaufman provides much useful insight into the thinking behind the United States’ support for the creation of the court and the decision to base it in The Hague as an authorized Security Council institution. His documentation shows that United States officials consistently invoked the Nuremberg experience. The United States supported a multi-lateral tribunal through the Security Council because the council could “obligate all states to honor its mandates, which would be critical to the tribunal’s success” (p.157). The United States saw the ICTY as critical in laying a foundation for regional peace and facilitating reconciliation among competing factions. But it also supported the ICTY and took a lead role in its design to “prevent it from becoming a permanent [tribunal] with global reach” (p.158), which it deemed “potentially problematic” (p.157).

              The United States’ willingness to involve itself in the post-conflict transitional process in Rwanda, even more than in ex-Yugoslavia, may be attributed to its failure to intervene during the worst moments of the genocide itself. That the United States “did not send troops or other assistance to Rwanda perversely may have increased the likelihood of involvement in the immediate aftermath,” Kaufman writes. A “desire to compensate for its foreign policy failures in Rwanda, if not also feelings of guilt over not intervening, apparently motivated at least some [US] officials to support a transitional justice institution for Rwanda” (p.197).

         Once the Rwandan civil war subsided, there was a strong consensus within the international community that some kind of international tribunal was needed to impose accountability upon the most egregious génocidaires; that any such tribunal should operate under the auspices of the United Nations Security Council; that the tribunal should in some sense be modeled after the ICTY; and that the United States should take the lead in establishing the tribunal. The ICTY precedent prompted US officials to “consider carefully the consistency with which they applied transitional justice solutions in different regions; they wanted the international community to view [the US] as treating Africans similarly to Europeans” (p.182). According to these officials, after the precedent of proactive United States involvement in the “arguably less egregious Balkans crisis,” the United States would have found it “politically difficult to justify inaction in post-genocide Rwanda” (p.182).

           The United States favored a tribunal modeled after and structurally similar to the ICTY; the new court came to be known as the International Criminal Tribunal for Rwanda (ICTR). The ICTR was the first international court having competence to “prosecute and punish individuals for egregious crimes committed during an internal conflict” (p.174), a watershed development in international law and transitional justice. To deal with lower-level génocidaires, the Rwandan government and the international community later instituted additional prosecutorial measures, including prosecutions by Rwandan domestic courts and local domestic councils, termed gacaca.

          No international tribunals were created in the two secondary cases, Libya after the 1988 Pan Am flight 103 bombing, and the 1990-91 Iraqi invasion of Kuwait. At the time of the Pan Am bombing, more than a decade prior to the September 11, 2001 attacks, United States officials considered terrorism a matter to be addressed “exclusively in domestic contexts” (p.156). In the case of the bombing of Pan Am 103, where Americans had been killed, competent courts were available in the United States and the United Kingdom. There were numerous documented cases of Iraqi atrocities against Kuwaiti civilians committed during Iraq’s 1990-91 invasion of Kuwait. But the 1991 Gulf War, while driving Iraq out of Kuwait, otherwise left Iraqi leader Saddam Hussein in power. The United States was therefore not in a position to impose accountability upon Iraqis for atrocities committed in Kuwait, as it had done after defeating Germany and Japan in World War II.

* * *

         When Kaufman evaluates the prudentialism and legalism models as ways to explain the United States’ actions in the four primary cases, prudentialism emerges as the clear winner. Kaufman convincingly demonstrates that the United States in each case was open to multiple options and motivated by geo-political and other non-legal considerations. Indeed, it is difficult to imagine that the United States – or any other state, for that matter – would ever, in advance, agree to disregard such considerations, as the legalism model seems to demand. After reflecting upon Kaufman’s analysis, I concluded that legalism might best be understood as more aspirational than empirical, a forward-looking, prescriptive model as to how the United States should act in future transitional justice situations, favored in particular by human rights organizations.

         But Kaufman also shows that the United States’ approach in each of the four cases was not entirely an ad hoc weighing of geo-political and related considerations.  Critical to his analysis are the threads which link the four cases, what he terms “path dependency,” whereby the Nuremberg trial process for Nazi war criminals served as a powerful influence upon the process set up for their Japanese counterparts; the combined Nuremberg-Tokyo experience weighed heavily in the creation of ICTY; and ICTY strongly influenced the structure and procedure of ICTR.   This cumulative experience constitutes another factor in explaining why the United States in the end opted for international criminal tribunals in each of the four cases.

         If a general rule can be extracted from Kaufman’s four primary cases, it might therefore be that an international criminal tribunal has evolved into the “default option” for the United States in transitional justice situations,  showing the strong pull of the only option which the legalism model considers consistent with the rule of law.  But these precedents may exert less hold on US policy makers going forward, as an incoming administration reconsiders the United States’ role in the 21st century global order. Or, to use Kaufman’s apt phrase, there may be less need felt for the United States to “do something” in the wake of future mass atrocities.

Thomas H. Peebles

Venice, Italy

February 10, 2017

 


Can’t Forget the Motor City


David Maraniss, Once In a Great City: A Detroit Story

     In 1960, Detroit was the automobile capital of the world, America’s undisputed center of manufacturing, and its fifth most populous city, with that year’s census tallying 1.67 million people. Fifty years later, the city had lost nearly a million people; its population had dropped to 677,000 and it ranked 21st in population among America’s cities in the 2010 census. Then, in 2013, the city reinforced its image as an urban basket case by ignominiously filing for bankruptcy. In Once In a Great City: A Detroit Story, David Maraniss, a native Detroiter of my generation and a highly skilled journalist whose previous works include books on Barack Obama, Bill Clinton and Vince Lombardi, focuses upon Detroit before its precipitous fall, an 18-month period from late 1962 to early 1964.   This was the city’s golden moment, Maraniss writes, when Detroit “seemed to be glowing with promise. . . a time of uncommon possibility and freedom when Detroit created wondrous and lasting things” (p.xii-xiii; in March 2012, I reviewed here two books on post World War II Detroit, under the title “Tales of Two Cities”).

      Detroit produced more cars in this 18-month period than Americans produced babies.  Berry Gordy Jr.’s popular music empire, known officially and affectionately as “Motown,” was selling a new, upbeat pop music sound across the nation and around the world.  Further, at a time when civil rights for African-Americans had become America’s most morally compelling issue, race relations in a city then about one-third black appeared to be as good as anywhere in the United States. With a slew of high-minded officials in the public and private sector dedicated to racial harmony and justice, Detroit sought to present itself as a model for the nation in securing opportunity for all its citizens.

     Maraniss begins his 18-month chronicle with dual events on the same day in November 1962: the burning of an iconic Detroit area memorial to the automobile industry, the Ford Rotunda, a “quintessentially American harmonic convergence of religiosity and consumerism” (p.1-2); and, later that afternoon, a police raid on the Gotham Hotel, once the “cultural and social epicenter of black Detroit” (p.10), but by then considered to be a den of illicit gambling controlled by organized crime groups.  He ends with President Lyndon Johnson’s landmark address in May 1964 on the campus of nearby University of Michigan in Ann Arbor, where Johnson outlined his grandiose vision of the Great Society.  Johnson chose Ann Arbor as the venue to deliver this address in large measure because of its proximity to Detroit. No place seemed “more important to his mission than Detroit,” Maraniss writes, a “great city that honored labor, built cars, made music, promoted civil rights, and helped lift working people into the middle class” (p.360).

     Maraniss’ chronicle unfolds between these bookend events, revolving around what had attracted President Johnson to the Detroit area in May 1964: building cars, making music, promoting civil rights, and lifting working people into the middle class. He skillfully weaves these strands into an affectionate, deeply researched yet easy-to-read portrait of Detroit during this 18-month golden period.  But Maraniss does not ignore the fissures, visible to those perceptive enough to recognize them, which would lead to Detroit’s later unraveling.  Detroit may have found the right formula for bringing a middle class lifestyle to working class Americans, black and white alike. But already Detroit was losing population as its white working class took advantage of newfound prosperity to leave the city for nearby suburbs.  Moreover, many in Detroit’s black community found the city to be anything but a model of racial harmony.

* * *

     An advertising executive described Detroit in 1963 as “intensely an automobile community – everybody lives, breathes, and sleeps automobiles. It’s like a feudal city” (p.111). Maraniss’ inside account of Detroit’s automobile industry focuses principally upon the remarkable relationship between Ford Motor Company’s chief executive, Henry Ford II (sometimes referred to as “HF2” or “the Deuce”), and the head of the United Auto Workers, Walter Reuther, during this 18-month golden age (Maraniss accords far less attention to the other two members of Detroit’s “Big Three,” General Motors and Chrysler, or to the upstart American Motors Corporation, whose chief executive, George Romney, was elected governor in November 1962 as a Republican). Ford and Reuther could not have been more different.

     Ford, from Detroit’s most famous industrial family, was a graduate of Hotchkiss School and Yale University who had been called home from military service during World War II to run the family business when his father Edsel Ford, then company president, died in 1943. Maraniss mischievously describes the Deuce as having a “touch of the peasant, with his manicured nails and beer gut and . . . frat-boy party demeanor” (p.28). Yet, Ford earnestly sought to modernize a company that he thought had grown too stodgy.  And, early in his tenure, he had famously said, “Labor unions are here to stay” (p.212).

      Reuther was a graduate of the “school of hard knocks,” the son of German immigrants whose father had worked in the West Virginia coalmines.   Reuther himself had worked his way up the automobile assembly line hierarchy to head its powerful union. George Romney once called Reuther the “most dangerous man in Detroit” (p.136). But Reuther prided himself on “pragmatic progressivism over purity, getting things done over making noise. . . [He was] not Marxist but Rooseveltian – in his case meaning as much Eleanor as Franklin” (p.136). Reuther believed that big government was necessary to solve big problems. During the Cold War, he won the support of Democratic presidents by “steering international trade unionists away from communism” (p.138).

     A quarter of a century after the infamous confrontation between Reuther and goons recruited by the Deuce’s grandfather Henry Ford to oppose unionization in the automobile industry — an altercation in which Reuther was seriously injured — the younger Ford’s partnership with Reuther blossomed. Rather than bitter and violent confrontation, the odd couple worked together to lift huge swaths of Detroit’s blue-collar auto workers into the middle class – arguably Detroit’s most significant contribution to American society in the second half of the 20th century. “When considering all that Detroit has meant to America,” Maraniss writes, “it can be said in a profound sense that Detroit gave blue-collar workers a way into the middle class . . . Henry Ford II and Walter Reuther, two giants of the mid-twentieth century, were essential to that result” (p.212).

      Reuther was aware that, despite higher wages and improved benefits, life on the assembly lines remained “tedious and soul sapping if not dehumanizing and dangerous” for autoworkers (p.215). He therefore consistently supported improving leisure time for workers outside the factory.  Music was one longstanding outlet for Detroiters, including its autoworkers. The city’s rich history of gospel, jazz and rhythm and blues musicians gave Detroit an “unmatched creative melody” (p.100), Maraniss observes.   By the early 1960s, Detroit’s musical tradition had become identified with the work of Motown founder, mastermind and chief executive, Berry Gordy Jr.

     Gordy was an ambitious man of “inimitable skills and imagination . . . in assessing talent and figuring out how to make it shine” (p.100).  Gordy aimed to market his Motown sound to white and black listeners alike, transcending the racial confines of the traditional rhythm and blues market. He set up what Maraniss terms a “musical assembly line” that “nurtured freedom through discipline” (p.195) for his many talented performers. The songs which Gordy wrote and championed captured the spirit of working class life: “clear story lines, basic and universal music for all people, focusing on love and heartbreak, work and play, joy and pain” (p.53).

      Gordy’s team included a mind-boggling array of established stars: Mary Wells, Marvin Gaye, Smokey Robinson and his Miracles, Martha Reeves and her Vandellas, Diana Ross and her Supremes, and the twelve-year-old prodigy, Little Stevie Wonder.  Among Gordy’s rising future stars were the Temptations and the Four Tops. The Motown team was never more talented than in the summer of 1963, Maraniss contends. Ten Motown singles rose to Billboard’s Top 10 that year, and eight more to the Top 20.  Wonder, who dropped “Little” before his name in 1963, saw his “Fingertips Part 2” rocket up the charts to No. 1.  Martha and the Vandellas made their mark with “Heat Wave,” a song with “irrepressibly joyous momentum” (p.197).  But the title could have referred equally to the rising intensity of the nationwide quest for racial justice and civil rights for African-Americans that summer.

       Maraniss reminds us that in June 1963, nine weeks before the March on Washington, Dr. Martin Luther King, Jr. delivered the outlines of his famous “I Have a Dream” speech at the end of a huge Detroit “Walk to Freedom” rally that took place almost exactly 20 years after a devastating racial confrontation between blacks and whites in wartime Detroit. The Walk drew an estimated 100,000 marchers, including a significant if limited number of whites. What King said that June 1963 afternoon, Maraniss writes, was “virtually lost to history, overwhelmed by what was to come, but the first time King dreamed his dream at a large public gathering, he dreamed it in Detroit” (p.182). Concerns about disorderly conduct and violence preceded both the Detroit Walk to Freedom and the March on Washington two months later. Yet the two events were for all practical purposes free of violence.  Just as the March on Washington energized King’s non-violent quest for civil rights nation-wide, the Walk to Freedom buoyed Detroit’s claim to be a model of racial justice in the urban north.

       In the Walk to Freedom and in the nationwide quest for racial justice, Walter Reuther was an unsung hero. Under Reuther’s leadership, the UAW made an “unequivocal moral and financial commitment to civil rights action and legislation” (p.126).  Once John Kennedy assumed the presidency, Reuther consistently pressed the administration to move on civil rights.  The White House in turn relied on Reuther to serve as a liaison to black civil rights leaders, especially to Dr. King and his southern desegregation campaign. The UAW functioned as what Maraniss terms the “bank” (p.140) of the Civil Rights movement, providing needed funding at critical junctures. To be sure, Maraniss emphasizes, not all rank-and-file UAW members shared Reuther’s passionate commitment to the Walk to Freedom, the March on Washington, or to the cause of civil rights for African-Americans.

      Even within Detroit’s black community, not all leaders supported the Walk to Freedom. Maraniss provides a close look at the struggle between the Reverend C.L. Franklin and the Reverend Albert Cleage for control over the details of the Walk to Freedom and, more generally, for control over the direction of the quest for racial justice in Detroit. Reverend Franklin, Detroit’s “flashiest and most entertaining preacher” (p.12; also the father of singer Aretha, who somehow escaped Gordy’s clutches to perform for Columbia Records and later Atlantic), was King’s closest ally in Detroit’s black community. Cleage, whose church later became known as the Shrine of the Black Madonna, founded on the belief that Jesus was black, was not wedded to Dr. King’s brand of non-violence. Cleage sought to limit the influence of Reuther, the UAW and whites generally in the Walk to Freedom. Franklin was able to retain the upper hand in setting the terms and conditions for the June 1963 rally.  But the dispute between Reverends Franklin and Cleage reflected the more fundamental difference between black nationalism and Martin Luther King-style integration, and was thus an “early formulation of a dispute that would persist throughout the decade” (p.232).

     In November of 1963, Cleage sponsored a conference that featured black nationalist Malcolm X’s “Message to the Grass Roots,” an important if less well known counterpoint to King’s “I Have A Dream” speech in Washington in August of that year.  In tone and substance, Malcolm’s address “marked a break from the past and laid out a path for the black power movement to follow from then on” (p.279). Malcolm referred in his speech to the highly publicized police killing of prostitute Cynthia Scott the previous summer, which had generated outrage throughout Detroit’s black community and exacerbated long simmering tensions between the community and a police force that was more than 95% white.

     Scott’s killing “discombobulated the dynamics of race in the city. Any communal black and white sensibility resulting from the June 23 [Walk to Freedom] rally had dissipated, and the prevailing feeling was again us versus them” (p.229).  The tension between police and community did not abate when Police Commissioner George Edwards, a long standing liberal who enjoyed strong support within the black community, considered the Scott case carefully and ruled that the shooting was “regrettable and unwise . . . but by the standards of the law it was justified” (p.199).

       Then there was the contentious issue of a proposed Open Housing ordinance that would have forbidden property owners from refusing to sell their property on the basis of race. The proposed ordinance required passage by the city’s nine-person City Council, elected at large in a city that was one-third black – no one on the council directly represented the city’s black neighborhoods. The proposal was similar in intent to future national legislation, the Fair Housing Act of 1968, and had the enthusiastic support of Detroit’s progressive Mayor, Jerome Cavanagh, a youthful Irish Catholic who deliberately cast himself as a mid-western John Kennedy.

      But the proposal evoked bitter opposition from white homeowner associations across the city, revealing the racial fissures within Detroit. “On one side were white homeowner groups who said they were fighting on behalf of individual rights and the sanctity and safety of their neighborhoods. On the other side were African American churches and social groups, white and black religious leaders, and the Detroit Commission on Community Relations, which had been established . . . to try to bridge the racial divide in the city” (p.242).   Notwithstanding the support of the Mayor and leaders like Reuther and Reverend Franklin, white homeowner opposition doomed the proposed ordinance. The City Council rejected the proposal 7-2, a stinging rebuke to the city’s self-image as a model of racial progress and harmony.

        Detroit’s failed bid for the 1968 Olympics was an equally stinging rebuke to the self-image of a city that loved sports as much as music. Detroit bested more glamorous Los Angeles for the right to represent the United States in international competition for the games. A delegation of city leaders, including Governor Romney and Mayor Cavanagh, traveled to Baden Baden, Germany, where they made a well-received presentation to the International Olympic Committee. While Detroit was making its presentation, the Committee received a letter from an African American resident of Detroit who alluded to the Scott case and the failed Open Housing ordinance to argue against awarding the games to the city on the ground that fair play “has not become a living part of Detroit” (p.262). Although bookmakers had made Detroit a 2-1 favorite for the 1968 games, the Committee awarded them to Mexico City. Its selection, Maraniss suggests, turned largely upon Cold War considerations, with Soviet bloc countries voting against Detroit. The delegation dismissed the view that the letter to the Committee might have undermined Detroit’s bid, but its actual effect on the Committee’s decision remains undetermined.

         Maraniss asks whether Detroit might have been able to better contain or even ward off the devastating 1967 riots had it been awarded the 1968 Olympic games. “Unanswerable, but worth pondering” is his response (p.271). In explaining the demise of Detroit, many, myself included, start with the 1967 riots which in a few short but violent days destroyed large swaths of the city, obliterating once solid neighborhoods and accelerating white flight to the suburbs.  But Maraniss emphasizes that white flight was already well underway long before the 1967 disorders. The city’s population had dropped from just under 1.9 million in the 1950 census to 1.67 million in 1960. In January of 1963, Wayne State University demographers published “The Population Revolution in Detroit,” a study which foresaw an even more precipitous emigration of Detroit’s working class in the decades ahead. The Wayne State demographers “predicted a dire future long before it became popular to attribute Detroit’s fall to a grab bag of Rust Belt infirmities, from high labor costs to harsh weather, and before the city staggered from more blows of municipal corruption and incompetence. Before any of that, the forces of deterioration were already set in motion” (p.91). Only a minor story in January 1963, the findings and projections of the Wayne State study in retrospect were of “startling importance and haunting prescience” (p.89).

* * *

      My high school classmates are likely to find Maraniss’ book a nostalgic trip down memory lane: his 18 month period begins with our senior year in a suburban Detroit high school and ends with our freshman college year — our own time of soaring youthful dreams, however unrealistic. But for those readers lacking a direct connection to the book’s time and place, and particularly for those who may still think of Detroit only as an urban basket case, Maraniss provides a useful reminder that it was not always thus.  He nails the point in a powerful sentence: “The automobile, music, labor, civil rights, the middle class – so much of what defines our society and culture can be traced to Detroit, either made there or tested there or strengthened there” (p.xii).  To this, he could have added, borrowing from Martha and the Vandellas’ 1964 hit, “Dancing in the Streets,” that America can’t afford to forget the Motor City.

 

Thomas H. Peebles

Berlin, Germany

October 28, 2016


Filed under American Politics, American Society, United States History

Becoming FLOTUS


Peter Slevin, Michelle Obama: A Life 

             In Michelle Obama: A Life, Peter Slevin, a former Washington Post correspondent presently teaching at Northwestern University, explores the improbable story of Michelle LaVaughn Robinson, now Michelle Obama, the First Lady of the United States (a position known affectionately in government memos as “FLOTUS”). Slevin’s sympathetic yet probing biography shows how Michelle’s life was and still is shaped by the blue collar, working class environment of Chicago’s South Side, where she was born and raised. Michelle’s life in many ways is a microcosm of 20th century African-American experience. Michelle’s ancestors were slaves, and her grandparents were part of the “Great Migration” of the first half of the 20th century that sent millions of African-Americans from the rigidly segregated south to northern urban centers in search of a better life.  Michelle was born in 1964, during the high point of the American civil rights movement, and is thus part of the generation that grew up after that movement had widened the opportunities available to African Americans.

            The first half of the book treats Michelle’s early life as a girl growing up on the South Side of Chicago and her experiences as an African-American at two of America’s ultra-elite institutions, Princeton University and Harvard Law School.  The centerpiece of this half is the loving environment that Michelle’s parents, Fraser Robinson III and his wife Marian Shields Robinson, created for Michelle and her older brother Craig, born two years earlier in 1962.  The Robinson family emphasized the primacy of education as the key to a better future, along with hard work and discipline, dedication to family, regular church attendance, and community service.

            Michelle’s post-Harvard professional and personal lives form the book’s second half. Early in her professional career, Michelle met a young man from Hawaii with an exotic background and equally exotic name, Barack Hussein Obama. Slevin provides an endearing account of their courtship and marriage (their initial date is also the subject of a recent movie “Southside With You”). Once Barack enters the scene, however, the story becomes as much about his entry and dizzying rise in politics as it is about Michelle, and thus likely to be familiar to many readers.

            But in this half of the book, we also learn about Michelle’s career in Chicago; how she balanced her professional obligations with her parental responsibilities; her misgivings about the political course Barack seemed intent upon pursuing; her initially reluctant, then full-throated support for Barack’s long-shot bid for the presidency; and how she elected to utilize the platform which the White House provided to her as FLOTUS.  Throughout, we see how Michelle retained the values of her South Side upbringing.

* * *

        Slevin provides an incisive description of 20th century Chicago, beginning in the 1920s, when Michelle’s grandparents migrated from the rural south.  He emphasizes the barriers that African Americans experienced, limiting where they could live and work, their educational opportunities, and more. Michelle’s father Fraser, after serving in the U.S. army, worked in a Chicago water filtration plant up to his death in 1991 from multiple sclerosis at age 55. Marian, still living (“the First Grandmother”), was mainly a “stay-at-home Mom.”  In a city that “recognized them first and foremost as black,” Fraser and Marian refused to utilize the oppressive shackles of racism as an excuse for themselves or their children.  The Robinson parents “saw it as their mission to provide strength, wisdom, and a measure of insulation to Michelle and Craig” (p.26). Their message to their children was that no matter what obstacles they faced because of their race or their working class roots, “life’s possibilities were unbounded. Fulfillment of those possibilities was up to them. No excuses” (p.47).

     The South Side neighborhood where Michelle and Craig were raised, although part of Chicago’s rigidly segregated housing patterns, offered a stable and secure environment, with well-kept if modest homes and strong neighborhood schools. The neighborhood and the Robinson household provided Michelle and Craig with what Craig later termed the “Shangri-La of upbringings” (p.33).  Fraser and Marian both regretted deeply that they were not college graduates. The couple consequently placed an unusually high premium on education for their children, adopting a savvy approach which parents today would be wise to emulate.

       For the two Robinson children, learning to read and write was a means toward the even more important goal of learning to think. Fraser and Marian advised their children to “use their heads, yet not to be afraid to make mistakes – in each case learning from what goes wrong” (p.46).  We told them, Marian recounted, “Make sure you respect your teachers, but don’t hesitate to question them. Don’t even allow us to say just anything to you” (p.47). Fraser and Marian granted their children freedom to explore, test ideas and make their own decisions, but always within a framework that emphasized “hard work, honesty, and self-discipline. There were obligations and occasional punishment. But the goal was free thinking” (p.46).

       Both Robinson children were good students, but with diametrically opposite study methods. Michelle was methodical and obsessive, putting in long hours, while Craig largely coasted to good grades. Michelle went to Princeton in part because Craig was already a student there, but she did so with misgivings and concerns that she might not be up to its high standards. Prior to Princeton, Craig and Michelle had had little exposure to whites. If they experienced animosity in their early years, Slevin writes, it was “likely from African American kids who heard their good grammar, saw their classroom diligence, and accused them of ‘trying to sound white’” (p.49). At Princeton, however, a school which “telegraphed privilege” (p.71), Michelle began a serious contemplation of what it meant to be an African-American in a society where whites held most of the levers of power.

       As an undergraduate between 1982 and 1986, Michelle came to see a separate black culture existing apart from white culture. Black culture had its own music, language, and history which, as she wrote in a college term paper, should be attributed to the “injustices and oppressions suffered by this race of people which are not comparable to the experience of any other race of people through this country’s history” (p.91). Michelle observed that black public officials must persuade the white community that they are “above issues of race and that they are representing all people and not just Black people” (p.91-92). Slevin notes that Michelle’s description “strikingly foreshadowed a challenge that she and her husband would face twenty two years later as they aimed for the White House” (p.91). Michelle’s college experience was a vindication of the framework Fraser and Marian had created that allowed Michelle to flourish. At Princeton, Michelle learned that the girl from blue collar Chicago could “play in the big leagues” (p.94), as Slevin puts it.

            In the fall of 1986, Michelle entered Harvard Law School, another “lofty perch, every bit as privileged as Princeton, but certainly more competitive once classes began” (p.95). In law school, she was active in an effort to bring more African American professors to a faculty that was made up almost exclusively of white males. She worked for the Legal Aid Society, providing services to low income individuals. When she graduated from law school in 1989, she returned to Chicago – it doesn’t seem that she ever considered other locations. But, notwithstanding her activist leanings as a student, she chose to work as an associate in one of Chicago’s most prestigious corporate law firms, Sidley and Austin.

       Although located only a few miles from the South Side neighborhood where Michelle had grown up, Sidley and Austin was a world apart, another bastion of privilege, with some of America’s best known and most powerful businesses as its clients. The firm offered Michelle the opportunity to sharpen her legal skills, particularly in intellectual property protection and, at least equally importantly, pay off some of her student loans. But, like many idealistic young law graduates, she did not find work in a corporate law firm satisfying and left after two years.

         Michelle landed a job with the City of Chicago as an assistant to Valerie Jarrett, then the city’s Commissioner for Planning and Economic Development, who later became a valued White House advisor to President Obama. Michelle’s position was more operational than legal, serving as a “trouble shooter” with a discretionary budget that could be utilized to advance city programs at the neighborhood level on subjects as varied as business development, infant mortality, mobile immunization, and after school programs. But working for the City of Chicago was nothing if not political, and Michelle left after 18 months to take a position in 1993 at the University of Chicago, located on Chicago’s South Side, not far from where she grew up.

     Although still another of America’s most prestigious educational institutions, the University of Chicago had always seemed like hostile territory to Michelle, incongruous with its surrounding low and middle-income neighborhoods. But Michelle landed a position with a university program, Public Allies, designed to improve the University’s relationship with the surrounding communities. Notwithstanding her lack of warm feelings for the university, the position was an excellent fit.  It afforded Michelle the opportunity to try her hand at bridging some of the gaps between the university and its less privileged neighbors.

          After nine years  with Public Allies, Michelle took a position in 2002 with the University of Chicago Hospital, again involved in public outreach, focused on the way the hospital could better serve the medical needs of the surrounding community. This position, Slevin notes, brought home to Michelle the massive inequalities within the American health care system, divided between the haves with affordable insurance and the have nots without it.  Michelle stayed in this position until early 2008, when she left to work on her husband’s long shot bid for the presidency. In her positions with the city and the university, Michelle developed a demanding leadership style for her staffs that she brought to the White House: result-oriented, given to micro-management, and sometimes “blistering” (p.330) to staff members whose performance fell short in her eyes.

* * *

       While working at Sidley and Austin, Michelle interviewed the young man from Hawaii, then in his first year at Harvard Law School, for a summer associate position. Michelle in Slevin’s account found the young man “very charming” and “handsome,” and sensed that, as she stated subsequently, he “liked my dry sense of humor and my sarcasm” (p.121). But if there was mutual attraction, it was the attraction of opposites. Barack Obama was still trying to figure out where his roots lay. Michelle Robinson, quite obviously, never had to address that question. Slevin notes that the contrast could “hardly have been greater” between Barack’s “untethered life and the world of the Robinson and Shields clans, so numerous and so firmly anchored in Chicago. He felt embraced and it surprised him” (p.128; Barack’s untethered life figures prominently in Janny Scott’s biography of Barack’s mother, Ann Dunham, reviewed here in July 2012).  For Barack, meeting the Robinson family for the first time was, as he later wrote, like “dropping in on the set of Leave It to Beaver” (p.127).  The couple married in 1992.

        Barack served in the Illinois Senate from 1997 to 2004. In 2000, he ran unsuccessfully for the United States House of Representatives, losing in a landslide. He had his breakthrough moment in 2004, when John Kerry, the Democratic Presidential candidate, invited him to deliver a now famous keynote address to that year’s Democratic National Convention.  Later that year, he won an open seat in the United States Senate by a landslide after his Republican opponent dropped out due to a sex scandal.  In early 2007, he decided to run for the presidency.

       Michelle’s mistrust of politics was “deeply rooted and would linger long into Barack’s political career” (p.161), Slevin notes.  Her distrust was at the root of discernible frictions within their marriage, especially after their daughters were born — Malia in 1998 and Sasha in 2001. Barack’s political campaigning and professional obligations kept him away from home much of the time, to Michelle’s dismay. Michelle felt that she had accomplished more professionally than Barack, and was also saddled with parental duties in his absence. “It sometimes bothered her that Barack’s career always took priority over hers. Like many professional women of her age and station, Michelle was struggling with balance and a partner who was less involved – and less evolved – than she had expected” (p.180-81).

        Michelle was, to put it mildly, skeptical when her husband told her in 2006 that he was considering running for the presidency. She worried about further losing her own identity, giving up her career for four years, maybe eight, and living with the real possibility that her husband could be assassinated. Yet, once it became apparent that Barack was serious about such a run and had reached the “no turning back” point, Michelle was all in.  She became a passionate, fully committed member of Barack’s election team, a strategic partner who was “not shy about speaking up when she believed the Obama campaign was falling short” (p.219).

         With Barack’s victory over Senator John McCain in the 2008 presidential election, Michelle became what Slevin terms the “unlikeliest first lady in modern history” (p.4). The projects and messages she chose to advance as FLOTUS “reflected a hard-won determination to help working class and the disadvantaged, to unstack the deck. She was more urban and more mindful of inequality than any first lady since Eleanor Roosevelt” (p.5). Michelle reached out to children in the less favored communities in Washington, mostly African American, and thereafter to poor children around the world. She also concentrated on issues of obesity, physical fitness and nutrition, famously launching a White House organic vegetable garden. She developed programs to support the wives of American military personnel deployed in Iraq and Afghanistan, women struggling to “keep a toehold in the middle class” (p.293).

        In Barack’s second term, she adopted a new mission, called Reach Higher, which aimed to push disadvantaged teenagers toward college. Throughout her time as FLOTUS, Michelle tried valiantly to provide her two daughters with as close to a normal childhood as life in the White House bubble might permit. Slevin’s account stops just prior to the 2014 Congressional elections, when the Republicans gained control of the United States Senate, after gaining control of the House of Representatives in the prior mid-term elections in 2010.

       Slevin does not overlook the incessant Republican and conservative critics of Michelle. She appeared to many whites in the 2008 campaign as an “angry black woman,” which Slevin dismisses as a “simplistic and pernicious stereotype” (p.236). Right wing commentator Rush Limbaugh began calling her “Moochelle,” much to the delight of his listening audience. The moniker conjured images of a fat cow or a leech – synonymous with the term “moocher” which Ayn Rand used in her novels to describe those who “supposedly lived off the hard work of the producers” (p.316) — all the while slyly associating Michelle with “big government, the welfare state, big-spending Democrats, and black people living on the dole” (p.315).  Vitriol such as this, Slevin cautiously concludes, “could be traced to racism and sexism or, at a charitable minimum, a lack of familiarity with a black woman as accomplished and outspoken as Michelle” (p.286). In addition, criticism emerged from the political left, which “viewed Michelle positively but asked why, given her education, her experience, and her extraordinary platform, she did not speak or act more directly on a host of progressive issues, whether abortion rights, gender inequity, or the structural obstacles facing the urban poor” (p.286).

* * *

       Slevin’s book is not hagiography. As a conscientious biographer whose credibility is directly connected to his objectivity, Slevin undoubtedly looked long and hard for Michelle’s weak points and less endearing qualities. He did not come up with much, unless you consider being a strong, focused woman a negative quality. There is no real dark side to Michelle Obama in Slevin’s account, no apparent skeletons in any of her closets. Rather, the unlikely FLOTUS depicted here continues to reflect the values she acquired while growing up in Fraser and Marian Robinson’s remarkable South Side household.

 

Thomas H. Peebles

La Châtaigneraie, France

September 17, 2016

Filed under American Politics, American Society, Biography, Gender Issues, Politics, United States History

Turning the Ship of Ideas in a Different Direction


Tony Judt, When the Facts Change, Essays 1995-2010, edited by Jennifer Homans

      In a 2013 review of Rethinking the 20th Century, I explained how the late Tony Judt became my “main man.” He was an expert in the very areas of my greatest, albeit amateurish, interest: French and European 20th century history and political theory; what to make of Communism, Nazism and Fascism; and, later in his career, the contributions of Central and Eastern European thinkers to our understanding of Europe and what he often termed the “murderous” 20th century. Moreover, Judt was a contemporary, born in Great Britain in 1948, the son of Jewish refugees. Raised in South London and educated at King’s College, Cambridge, Judt spent time as a recently-minted Cambridge graduate at Paris’ fabled Ecole Normale Supérieure; he lived on a kibbutz in Israel and contributed to the cause in the 1967 Six Day War; and he had what he termed a mid-life crisis, which he spent in Prague, learning the Czech language and absorbing the rich Czech intellectual and cultural heritage.  Judt also had several teaching stints in the United States and became an American citizen. In 1995, he founded the Remarque Institute at New York University, where he remained until he died in 2010, at age 62, of amyotrophic lateral sclerosis (ALS), which Americans know as “Lou Gehrig’s Disease.”

      Rethinking the 20th Century was more of an informal conversation with Yale historian Timothy Snyder than a book written by Judt. Judt’s best-known work was a magisterial history of post-World War II Europe, entitled simply Post War. His other published writings included incisive studies of obscure left-wing French political theorists and the “public intellectuals” who animated France’s always lively 20th century debate about the role of the individual and the state (key subjects of Sudhir Hazareesingh’s How the French Think: An Affectionate Portrait of an Intellectual People, reviewed here in June).  Among French public intellectuals, Judt reserved particular affection for Albert Camus and particular scorn for Jean-Paul Sartre.  While at the Remarque Institute, Judt became himself the epitome of a public intellectual, gaining much attention outside academic circles for his commentaries on contemporary events.  Judt’s contributions to public debate are on full display in When the Facts Change, Essays 1995-2010, a collection of 28 essays edited by Judt’s wife Jennifer Homans, former dance critic for The New Republic.

      The collection includes book reviews and articles originally published elsewhere, especially in The New York Review of Books, along with a single previously unpublished entry. The title refers to a quotation which Homans considers likely apocryphal, attributed to John Maynard Keynes: “when the facts change, I change my mind – what do you do, sir” (p.4). In Judt’s case, the major changes of mind occurred early in his professional life, when he repudiated his youthful infatuation with Marxism and Zionism. But throughout his adult life and especially in his last fifteen years, Homans indicates, as facts changed and events unfolded, Judt “found himself turned increasingly and unhappily against the current, fighting with all of his intellectual might to turn the ship of ideas, however slightly, in a different direction” (p.1).  While wide-ranging in subject-matter, the collection’s entries bring into particularly sharp focus Judt’s outspoken opposition to the 2003 American invasion of Iraq, his harsh criticism of Israeli policies toward its Palestinian population, and his often-eloquent support for European continental social democracy.

* * *

      The first essay in the collection, a 1995 review of Eric Hobsbawm’s The Age of Extremes: A History of the World, 1914-1991, should be of special interest to tomsbooks readers. Last fall, I reviewed Fractured Times: Culture and Society in the Twentieth Century, a collection of Hobsbawm’s essays.  Judt noted that Hobsbawm had “irrevocably shaped” all who took up the study of history between 1959 and 1975 — what Judt termed the “Hobsbawm generation” of historians (p.13). But Judt contended that Hobsbawm’s relationship to the Soviet Union — he was a lifelong member of Britain’s Communist Party – clouded his analysis of 20th century Europe. The “desire to find at least some residual meaning in the whole Communist experience” explains what Judt found to be a “rather flat quality to Hobsbawm’s account of the Stalinist terror” (p.26). That the Soviet Union “purported to stand for a good cause, indeed the only worthwhile cause,” Judt concluded, is what “mitigated its crimes for many in Hobsbawm’s generation.” Others – likely speaking for himself — “might say it just made them worse” (p.26-27).

      In the first decade of the 21st century, Judt became known as an early and fervently outspoken critic of the 2003 American intervention in Iraq.  Judt wrote in the New York Review of Books in May 2003, two months after the U.S.-led invasion, that President Bush and his advisers had “[u]nbelievably” managed to “make America seem the greatest threat to international stability.” A mere eighteen months after September 11, 2001:

the United States may have gambled away the confidence of the world. By staking a monopoly claim on Western values and their defense, the United States has prompted other Westerners to reflect on what divides them from America. By enthusiastically asserting its right to reconfigure the Muslim world, Washington has reminded Europeans in particular of the growing Muslim presence in their own cultures and its political implications. In short, the United States has given a lot of people occasion to rethink their relationship with it (p.231).

Using Madeleine Albright’s formulation, Judt asked whether the world’s “indispensable nation” had miscalculated and overreached. “Almost certainly” was his response to his own question, to which he added: “When the earthquake abates, the tectonic plates of international politics will have shifted forever” (p.232). Thirteen years later, in the age of ISIS, Iranian ascendancy and interminable civil wars in Iraq and Syria, Judt’s May 2003 prognostication strikes me as frightfully accurate.

      Judt’s essays dealing with the state of Israel and the seemingly intractable Israeli-Palestinian conflict generated rage, drawing in particular the wrath of pro-Israeli American lobbying groups. Judt, who contributed to Israel’s war effort in the 1967 Six Day War as a driver and translator for the Israeli military, came to consider the state of Israel an anachronism. The idea of a Jewish state, in which “Jews and the Jewish religion have exclusive privileges from which non-Jewish citizens are forever excluded,” he wrote in 2003, is “rooted in another time and place” (p.116). Although “multi-cultural in all but name,” Israel was “distinctive among democratic states in its resort to ethno-religious criteria with which to denominate and rank its citizens” (p.121).

      Judt noted in 2009 that the Israel of Benjamin Netanyahu was “certainly less hypocritical than that of the old Labor governments. Unlike most of its predecessors reaching back to 1967, it does not even pretend to seek reconciliation with the Arabs over which it rules” (p. 157-58). Israel’s “abusive treatment of the Palestinians,” he warned, is the “chief proximate cause of the resurgence of anti-Semitism worldwide. It is the single most effective recruiting agent for radical Islamic movements” (p.167). Vilified for these contentions, Judt repeatedly pleaded for recognition of what should be, but unfortunately is not, the self-evident proposition that one can criticize Israeli policies without being anti-Semitic or even anti-Israel.

      Judt was arguably the most influential American proponent of European social democracy, the form of governance that flourished in Western Europe between roughly 1950 and 1980 and became the model for Eastern European states emerging from communism after 1989, with a strong social safety net, free but heavily regulated markets, and strong respect for individual liberties and the rule of law. Judt characterized social democracy as the “prose of contemporary European politics” (p.331). With the fall of communism and the demise of an authoritarian Left, the emphasis upon democracy had become “largely redundant,” Judt contended. “We are all democrats today. But ‘social’ still means something – arguably more now than some decades back when a role for the public sector was uncontentiously conceded by all sides” (p.332). Judt saw social democracy as the counterpoint to what he termed “neo-liberalism” or globalization, characterized by the rise of income inequality, the cult of privatization, and the tendency – most pronounced in the Anglo-American world – to regard unfettered free markets as the key to widespread prosperity.

      Judt asked 21st century policy makers to take what he termed a “second glance” at how “our twentieth century predecessors responded to the political challenge of economic uncertainty” (p.315). In a 2007 review of Robert Reich’s Supercapitalism: The Transformation of Business, Democracy, and Everyday Life, Judt argued that the universal provision of social services and some restriction upon inequalities of income and wealth are “important economic variables in themselves, furnishing the necessary public cohesion and political confidence for a sustained prosperity – and that only the state has the resources and the authority to provide those services and enforce those restrictions in our collective name” (p.315).  A second glance would also reveal that a healthy democracy, “far from being threatened by the regulatory state, actually depends upon it: that in a world increasingly polarized between insecure individuals and unregulated global forces, the legitimate authority of the democratic state may be the best kind of intermediate institution we can devise” (p.315-16).

      Judt’s review of Reich’s book anticipated the anxieties that one sees in both Europe and America today. Fear of the type last seen in the 1920s and 1930s had reemerged as an “active ingredient of political life in Western democracies” (p.314), Judt observed one year prior to the economic downturn of 2008.  Indeed, one can be forgiven for thinking that Judt had the convulsive phenomena of Brexit in Britain and Donald Trump in the United States in mind when he emphasized how fear had woven itself into the fabric of modern political life:

Fear of terrorism, of course, but also, and perhaps more insidiously, fear of uncontrollable speed of change, fear of the loss of employment, fear of losing ground to others in an increasingly unequal distribution of resources, fear of losing control of the circumstances and routines of one’s daily life.  And perhaps above all, fear that it is not just we who can no longer shape our lives but that those in authority have lost control as well, to forces beyond their reach. . . This is already happening in many countries: note the rising attraction of protectionism in American politics, the appeal of ‘anti-immigrant’ parties across Western Europe, the calls for ‘walls,’ ‘barriers,’ and ‘tests’ everywhere (p.314).

       Judt buttressed his case for social democracy with a tribute to the railroad as a symbol of 19th and 20th century modernity and social cohesion.  In essays that were intended to be part of a separate book, Judt contended that the railways “were and remain the necessary and natural accompaniment to the emergence of civil society. They are a collective project for individual benefit. They cannot exist without common accord . . . and by design they offer a practical benefit to individual and collectivity alike” (p.301). Although we “no longer see the modern world through the image of the train,” we nonetheless “continue to live in the world the trains made.”  The post-railway world of cars and planes, “turns out, like so much else about the decades 1950-1990, to have been a parenthesis: driven, in this case, by the illusion of perennially cheap fuel and the attendant cult of privatization. . . What was, for a while, old-fashioned has once again become very modern” (p.299).

      In a November 2001 essay appearing in The New York Review of Books, Judt offered a novel interpretation of Camus’ The Plague as an allegory for France in the aftermath of German occupation, a “firebell in the night of complacency and forgetting” (p.181).  Camus used The Plague to counter the “smug myth of heroism that had grown up in postwar France” (p.178), Judt argued.  The collection concludes with three Judt elegies to thinkers he revered, François Furet, Amos Elon, and Leszek Kołakowski, a French historian, an Israeli writer and a Polish communist dissident, representing key points along Judt’s own intellectual journey.

* * *

      The 28 essays which Homans has artfully pieced together showcase Judt’s prowess as an interpreter and advocate – as a public intellectual — informed by his wide-ranging academic and scholarly work.  They convey little of Judt’s personal side.  Readers seeking to know more about Judt the man may look to his The Memory Chalet, a memoir posthumously published in 2010. In this collection, they will find an opportunity to savor Judt’s incisive if often acerbic brilliance and appreciate how he brought his prodigious learning to bear upon key issues of his time.

Thomas H. Peebles
La Châtaigneraie, France
July 6, 2016


Filed under American Politics, European History, France, French History, History, Intellectual History, Politics, Uncategorized, United States History, World History

Affirmative Government Advocate


Andrew and Stephen Schlesinger, eds.,
The Letters of Arthur Schlesinger, Jr. 

       Arthur Schlesinger, Jr., was what we would today likely describe as a “public intellectual,” a top-notch historian who was also deeply engaged in political issues throughout his adult life.  Schlesinger’s father, Arthur Schlesinger, Sr., was himself a top-notch historian.  Both father and son taught at Harvard, with the younger Schlesinger finishing his academic career at the City University of New York.  Born in 1917, the younger Schlesinger was the author of a highly respected book on Andrew Jackson (“The Age of Jackson”) and a three volume series on Franklin Roosevelt (“The Age of Roosevelt”). He also wrote an influential 1949 political tract, The Vital Center, an argument for liberal democracy, based on civil liberties, the rule of law, and regulated capitalism, as the only realistic alternative to fascism on the right and communism on the left.  Schlesinger was one of the founders of Americans for Democratic Action (ADA), which, more than any other single organization, epitomized mainstream post-World War II liberalism. He was also a loyal, always passionate, and often-elegant spokesman for the liberal wing of the Democratic Party.

      Schlesinger served as an advisor to President John Kennedy, whom he revered. After Kennedy’s assassination in November 1963, Schlesinger wrote an account of the short Kennedy administration, A Thousand Days: John F. Kennedy in the White House, which won the 1966 Pulitzer Prize for Biography. Schlesinger stayed on briefly as an advisor to President Lyndon Johnson after Kennedy’s death but came to detest Johnson and his decision to escalate the Vietnam War. He returned to academia at City University of New York after his stint with the Johnson administration, where he remained until his retirement in 1994. He died in 2007 at the age of 89. Over the course of a long lifetime, Schlesinger wrote letters – lots of letters.

        Both the quality and the quantity of Schlesinger’s letter writing are on full display in this nearly 600-page collection, The Letters of Arthur Schlesinger, Jr., edited by Schlesinger’s sons Andrew and Stephen. The sons have culled selected letters to and from their father and arranged them in chronological order, adding editorial comments by way of footnotes. They estimate that they reviewed approximately 35,000 letters before making their choice of those contained here. Spread over 71 years of Schlesinger’s adult life, from age 18 to his death at age 89, 35,000 letters amounts to an astounding average of almost 1½ letters per day.  Schlesinger corresponded regularly with presidents and presidential candidates, Congressional leaders, Supreme Court justices, cabinet officials, writers, journalists, religious leaders, intellectuals and scholars. He also answered questions from members of the public, including school students.

       Formal letter writing is today largely an extinct practice, replaced by email exchanges that occasionally resemble letters of old, although more often are less formal and far more cursory. Throughout most of Schlesinger’s life, however, letters were a frequent and frequently consequential mode of communication. Schlesinger, the editors observe in their introduction, “may indeed be one of the last of the old-fashioned breed of American figures for whom letters were the paramount means of communication – a phenomena that seems oddly arcane in a digital age” (p.xii).

       The “abiding theme” of the letters contained here, the editors indicate, was Schlesinger’s preoccupation with political liberalism and its prospects. “He was always in some way promoting and advancing the liberal agenda; it was his mission, purpose, and justification” (p.xi), they write. Through their selection of letters, the editors seek to show their father’s “intellectual and political development as one of the nation’s leading liberal voices” (p.xiii). The collection they have assembled easily meets this objective.  It allows the reader to piece together the constituent elements of what might be termed classical, post-World War II mainstream American liberalism.

     Schlesinger’s brand of liberalism was staunchly anti-communist in post World War II America, yet supported civil liberties even for communists and therefore vigorously opposed the “mad, brutal and unrestrained fanaticism” (p.76) of Senator Joseph McCarthy’s anti-communist campaigns.  Schlesinger’s liberalism supported civil rights in the United States, a strong stand against the Soviet Union — a “monstrous police despotism” (p.27) — across the globe, and independence for colonized countries in Asia, Africa and the Middle East.  Above all, Schlesinger’s liberalism was predicated upon what he termed “affirmative government,” the use of federal authority to regulate capitalism, assist the men and women working within the capitalist economy, and advance the national interest.  As McCarthy’s intemperate brand of anti-communism gradually faded in the late 1950s, Schlesinger’s anti-communist fervor also subsided. By the end of the 1960s, the plight of newly independent states no longer seemed to be a preoccupation, and Schlesinger had by then recognized that communism bore many faces in addition to that of the Soviet Union.  By contrast, support for affirmative government, civil rights and civil liberties remained at the core of Schlesinger’s credo until his death in 2007.

* * *

       In numerous letters, Schlesinger warned against the Democratic Party becoming too pro-business.  We already have one pro-business party in the United States, Schlesinger argued with correspondents, we don’t need another.  In a 1957 letter to then Senate Majority Leader Lyndon Johnson, whose support Schlesinger recognized as essential to driving a liberal legislative agenda through Congress, Schlesinger sought to dissuade Johnson from prioritizing budget cutting.  Schlesinger described the “great tradition of the Democratic party” as the “tradition of affirmative government – the tradition of Jackson, Bryan, Wilson and FDR – not the tradition which hates the national government, but the one which regards it as an indispensable means of promoting the national welfare. If Democrats reject this tradition, they reject any chance of national political success. And a frenzy for budget cutting as an end in itself amounts certainly to a rejection of this tradition” (p.144).

      Schlesinger remained an advocate for affirmative government throughout his adult life. After Jimmy Carter lost his bid for re-election to Ronald Reagan in 1980, Schlesinger criticized Carter for his “systematic attack on the great creative contribution of the modern Democratic party – the idea of affirmative government,” an attack which he considered “demagoguery” and pandering to the “most vulgar American prejudices” (p.470). He advised 1984 Democratic presidential nominee Walter Mondale to avoid deficit spending as a political issue: “The Republicans have used the deficit as an issue for fifty years . . . The only people who worry about the deficit are businessmen most of whom have always voted Republican and will doubtless do so again” (p.485).  After Republicans gained control of the House of Representatives in November 1994, Schlesinger sent a long, and apparently unsolicited, set of suggestions to President Bill Clinton on themes for his forthcoming January 1995 State of Union address. Arguing that the Clinton administration “cannot succeed by trying to out-Republican the Republicans” (p.549), Schlesinger urged Clinton to reject the view that the election was a “repudiation of activist government” (p.548) and to “outgrow the illusion” that “power taken away from government falls to the people; much of it goes rather to corporations not accountable (as government is) to the people.” The United States cannot solve its  problems by “turning them over the marketplace and thinking they will solve themselves” (p.550), Schlesinger contended.

       Schlesinger’s numerous letters to presidential candidate Adlai Stevenson are among the richest in this collection. Schlesinger supported and advised Stevenson in his two bids for the presidency, in 1952 and 1956. Although an admirer of Stevenson’s cerebral qualities, Schlesinger perceived an infuriating “Calvinism” in Stevenson. He “cannot bear to have things come easy or to say things which please everybody” (p.60), Schlesinger wrote in 1953. Schlesinger was incensed in 1956 that Stevenson appeared to back “gradualism” in desegregating public schools after the Supreme Court’s decision in Brown v. Board of Education declared segregated schools unconstitutional. He contrasted Stevenson’s queasiness on desegregation with the stance of Massachusetts Senator John Kennedy, then campaigning openly for the Vice-Presidential nomination, who called on Democrats to take a forthright stand in support of the Supreme Court’s decision despite the possibility of alienating southern voters. I know Kennedy is “damn anxious to get southern support for the Vice-Presidency,” Schlesinger wrote to Stevenson speechwriter Willard Wirtz. Yet Kennedy gives an “altogether different impression of his feelings on the subject [of civil rights]” (p.130).

       When President Dwight Eisenhower’s health became an issue prior to the 1956 presidential elections, Schlesinger talked himself into the view that Stevenson had a shot at being elected.  On several occasions, Schlesinger felt forced to remind Stevenson that the “one important doubt” the American people had about him was “whether you want to be President” (p.103), as he stated in a 1955 letter to Stevenson. A few months prior to the 1956 election, Schlesinger sent Stevenson a lengthy letter coaching the presidential aspirant on how to respond to questions at a forthcoming political event:

Don’t say that problems are intricate and complicated. Everyone knows that they are. . . Don’t profess ignorance on questions, or say that you don’t know enough to give a definite answer. If you are running for the Presidency, people expect not necessarily a detailed technical answer, but a clear and definite expression of the way you would propose to tackle the problem. Don’t hesitate to give a short answer. . . Do not think that all this is in any sense a counsel of dishonesty. Politics, at its best, is an educational process (p.134-45, italics in original).

       After Stevenson went on to suffer his second lopsided loss to Eisenhower in the 1956 elections, Schlesinger turned his attention to Senator Kennedy.  When he first met Kennedy at a dinner party in 1946, Schlesinger described the young man from Massachusetts (born in 1917, the same year as Schlesinger) as “very sincere and not unintelligent, but kind of on the conservative side” (p.17). In supporting Kennedy’s run for the presidency in 1960, Schlesinger sought to coax the Senator to move toward more liberal positions.  Perceiving lethargy in the campaign after Kennedy received the Democratic Party nomination in August 1960, for example, Schlesinger urged Kennedy to “exploit one of your strongest assets – i.e., that you are far more liberal than Nixon. There is no point, it seems to me, in playing this down and hope to catch some votes in Virginia at the price of losing New York . . . I think you should take a strong liberal line from now on” (p.215).

         Schlesinger was among the many “best and the brightest” whom Kennedy assembled to be part of his administration, and he frequently remarked that his opportunity to serve in the Kennedy administration was the high point of his career. However, there are not many letters here from Schlesinger’s time at the White House, perhaps because he did not feel free to comment to outsiders on administration business, or perhaps because the position left him no time to write letters with the frequency he had as an academic. Schlesinger stayed on in the Johnson administration only through January 1964, and quickly became a caustic critic of Johnson’s decision to escalate the war in Vietnam.

       Schlesinger refused to endorse Vice-President Hubert Humphrey, his long-time friend and former ADA ally, for the Democratic Party nomination in 1968 (Schlesinger was a strong supporter of Robert Kennedy for the nomination until his assassination in June 1968, after which he supported George McGovern). In July 1968, Schlesinger responded to ADA lawyer David Ginsburg’s statement that Humphrey’s approach to Vietnam and that of the Republican candidate, former Vice-President Richard Nixon, would not be “too far apart.” If this is so, Schlesinger replied, “then give me Nixon – on the simple ground that, with the Democratic party in the opposition, we could stop his [Nixon’s] idiocy quicker.” If we are to have a “stupid and reactionary foreign policy, it should be carried out by a Republican administration, not by a Democratic administration” (p.358).

       Although Schlesinger never embraced Jimmy Carter or his presidency, he saw the Reagan years as a disaster for the United States. He therefore eagerly backed the candidacy of Bill Clinton, even though Clinton, like Carter, seemed to be running against affirmative government. After the Clinton presidency, Schlesinger offered advice and support to 2000 presidential candidate Al Gore, Jr. When Gore lost that election despite winning the national popular vote, Schlesinger withdrew from active counseling of presidential aspirants.

      The collection’s most amusing correspondence involves Schlesinger’s quibble with conservative pundit William F. Buckley, Jr., over a “blurb” on a Buckley book, Rumbles Left and Right, which quoted Schlesinger in 1963 as asserting that Buckley had a “facility for rhetoric which I envy, as well as a wit which I seek clumsily and vainly to emulate” (p.262). Schlesinger vehemently denied he ever said anything like this about Buckley and threatened to sue Buckley’s publisher to retract the attribution. Their feud reveals that Buckley did indeed have a first-class wit, and that Schlesinger was humor-challenged. Buckley signed one letter “Wm. F. ‘Envy His Rhetoric’ Buckley, Jr.” (p.263). When Schlesinger refused to go on Buckley’s television show Firing Line — Buckley said that he was informed that Schlesinger did not wish to “help” Buckley’s program – Buckley taunted Schlesinger by asking him, “shouldn’t you search out opportunities to expose yourself to my rhetoric and my wit? How else will you fulfill your lifelong dream of emulating them?” (p.389). To this, the dour Schlesinger could only reply, “[c]an it be that you are getting a little tetchy in your declining years?” (p.389).

        Readers are likely to find curious Schlesinger’s frequent correspondence with Mrs. Marietta Tree, a socialite and Democratic party activist, the granddaughter of Reverend Endicott Peabody, founder and first headmaster of the Groton School, and the wife of a British Member of Parliament, Ronald Tree, himself the grandson of famed Chicago businessman Marshall Field.  Schlesinger wrote to Tree in exceptionally endearing terms over the course of nearly two decades. In one particularly impassioned flourish, Schlesinger told his “Darling M” that he could not “resist writing to you from the heart of the Middle West [Topeka, Kansas]. Why won’t you come with me on one of these trips? You gently bred eastern girls ought to get to know America. . . It is long since we have had a good, old-fashioned evening together and I need one desperately.  All dearest love, A” (p.167-68, italics in original). Schlesinger’s sons point out in a footnote that their father and Tree “were never lovers, despite the words of endearment in their correspondence. Her passion was reserved for Adlai Stevenson” (p.57*).  Judging by the language of his letters, however, their father was plainly smitten by the enticing Tree.

       Then, suddenly, the letters to Mrs. Tree stop. This comes at a time when we learn, via another editorial footnote, that Schlesinger and his first wife Marion, whom he married in 1940, divorced in 1970, and that he married Alexandra Emmet Allen in 1971. But there are no letters here containing references to a deteriorating marriage or a developing interest in another woman. This may be the result of an editorial decision on the part of his sons to eschew the personal side of Schlesinger and emphasize the political.

* * *

       The lack of references to key moments in Schlesinger’s personal life is also a reminder that a collection of letters should not be confused with biography or autobiography. This smartly compiled collection nonetheless provides a keen sense of how the galvanizing political and public issues of Schlesinger’s adult life looked not only to Schlesinger himself but also to the robust and unapologetic liberalism that he articulated from the early post-World War II years into the first decade of the 21st century.

Thomas H. Peebles
La Châtaigneraie, France
December 8, 2015


Filed under American Politics, American Society, History, Intellectual History, Politics, United States History

Never Rely on Experts


Robert Dallek, Camelot’s Court:
Inside the Kennedy White House

     During his short presidency, John Kennedy surrounded himself with some of the country’s sharpest minds and most credentialed individuals, yet was exasperated much of the time by the inadequacy of the advice they provided him. In Camelot’s Court: Inside the Kennedy White House, Robert Dallek elaborates upon this theme in a work that is above all a portrait of President Kennedy and a study of how he received and handled information and advice. Dallek is a prolific writer, the author of major works on Lyndon Johnson and on Richard Nixon’s relationship with Henry Kissinger, along with a full biography of Kennedy, An Unfinished Life: John F. Kennedy, 1917-63.

    International crises in Cuba and Vietnam dominate Dallek’s book, far more than the Cold War confrontation over Berlin, which looms in the background but is surprisingly not a major topic (Berlin was the subject of a book reviewed here in February 2013, Frederick Kempe’s Berlin 1961: Kennedy, Khrushchev and the Most Dangerous Place on Earth). Behind Cuba and Vietnam in a distant third place among the book’s substantive topics is the Civil Rights movement within the United States. Kennedy believed that the cause was just and important but looked at the issues raised primarily as a distraction from more pressing international ones. The main mission of the Kennedy White House, Dallek writes, was to “inhibit communist advance and avert a nuclear war” (p.xi).

     Kennedy is often described as a hardline, anti-Communist Cold Warrior and, given the times, it is difficult to see how he could have been anything else. Throughout his short presidency, Kennedy was obsessed with not appearing weak and inexperienced, especially in standing up to the Soviet Union. But the Kennedy in these pages is also exceptionally wary of the use and misuse of American military power to advance national interests in a dangerous nuclear age, far more so than a surprising number of his closest advisors. As President, Kennedy consistently and often heroically resisted the urgings of these hard-liners.

     Among Kennedy’s advisors, his brother Robert Kennedy, who formally served as Attorney General in his brother’s administration, occupied a special position as the president’s “leading advisor on every major question” (p.65). Robert Kennedy was his brother’s alter ego, an “enforcer” whom “everyone had to answer to if they fell short of the president’s expectations” (p.175). When the president needed to stay above the debate, brother Robert “could freely state his brother’s views” and, as needed outside the presence of his brother, “openly announce that he was declaring what the president wanted done” (p.334). John Kennedy came to believe that “only Bobby could be entirely trusted to act on his instructions” (p.328).

    By contrast, President Kennedy’s relationship with the career military officers in his entourage was fraught with tension and mistrust from the outset of his administration. Most Americans considered Kennedy a naval war hero, based on his widely publicized rescue of the crew of PT-109, a torpedo boat cut in half by the Japanese. The military, however, accustomed to serving former World War II Supreme Allied Commander Dwight Eisenhower during the previous eight years, “questioned the new president’s qualifications to manage the country’s national defense” (p.69). General Lyman Lemnitzer, Kennedy’s first Chairman of the Joint Chiefs of Staff, the administration’s highest ranked career military official, looked derisively at the young president as a man with “no military experience at all, sort of a patrol boat skipper in World War II” (p.70). But the real issue between Kennedy and the military, Dallek emphasizes, was “not Kennedy’s inexperience and limited understanding of how to ensure the country’s safety,” but rather “Kennedy’s doubts about the wisdom of using nuclear arms and the military’s excessive reliance on them as a deterrent against communist aggression” (p.70).

     Dallek begins with a long biographical sketch of John Kennedy that culminates in his narrow victory in 1960 over Vice-President Richard Nixon, familiar ground for most readers. He follows with a similar sketch of brother Robert, in a chapter entitled “Adviser-in-Chief,” and with still another chapter describing the background of some of the “extraordinary group of academics, businessmen, lawyers, foreign policy and military experts” (p.x) whom Kennedy tapped to work in his administration. This chapter, entitled “Ministry of Talent” – a term borrowed from Theodore Sorensen, one of Kennedy’s leading advisors – includes short portraits of many individuals likely to be familiar to most readers: Defense Secretary Robert McNamara; Secretary of State Dean Rusk; Vice President Lyndon Johnson; US Ambassador to the UN and two-time Democratic presidential nominee Adlai Stevenson; and National Security Advisor McGeorge Bundy, among others.

     Dallek’s substantive account begins only after this lengthy introductory material, about a third of the way into the book, where he focuses on how President Kennedy received and handled the advice provided him, especially during the Bay of Pigs operation in Cuba in April 1961; the Cuban Missile Crisis of October 1962; and Vietnam throughout his presidency. In Dallek’s account, Kennedy was ill-advised and misled by his advisors during the Bay of Pigs operation; admirably led his advisors during the Cuban Missile Crisis; and defaulted to them on Vietnam.

* * *

      Dallek addresses the ill-fated CIA Bay of Pigs operation in Cuba, which took place less than 90 days into the Kennedy presidency, in a chapter entitled “Never Rely on Experts.” The far-fetched operation was hatched during the Eisenhower administration and was presented to the president as a way to rid the hemisphere of nemesis Fidel Castro and what the United States feared was his very contagious form of communism. The plot consisted of utilizing approximately 1,500 Cuban exiles to invade the island, on the assumption that this small force would incite the local population to rise up and throw out Castro (the plot figures prominently in Stephen Kinzer’s The Brothers, reviewed here in October 2014).

       Although Kennedy shared a sense of urgency in removing this communist threat just 150 kilometers from the United States’ southern coast, he worried about the perception in the rest of Latin America of any operation in Cuba tied to the United States. The question was not whether to strike against Castro, but rather how to bring him down “without provoking accusations that the new government in Washington was no more than a traditional defender of selfish U.S. interests at the expense of Latin [American] autonomy”(p.133). Kennedy was willing to accept the project’s dubious assumption that the operation could be executed without revealing U.S. government involvement, but opposed from the outset the commitment of U.S. military forces to supplement the exiles’ operation. Dallek suggests that Kennedy gave the green light to the operation primarily for political reasons, fearing the conservative reaction if he refused to go forward. As the world now knows, the operation was a colossal failure, badly wounding the inexperienced president early in his tenure.

      Dallek documents several key instances in which advice to the president was, at best, incomplete, and in which key facts were withheld entirely. Deputy CIA Director Richard Bissell failed to tell the president that the CIA had concluded that the mission could not succeed without the engagement of direct U.S. military support, an option that Kennedy had all but ruled out. Bissell further told the president that if the initial invasion were to falter, the exiles could escape into nearby mountains to regroup and lead the anti-Castro rebellion. He neglected to mention, however, that they would have to cross about 80 miles of swampland to reach those mountains.

     Secretary of State Dean Rusk and Undersecretary of State Chester Bowles shared Kennedy’s doubts about the flawed scheme but failed to stand up to the CIA in internal deliberations, discrediting both in the eyes of the president. Then, after the operation failed, Bowles leaked a document to the press showing the State Department’s reservations, infuriating Kennedy. As he tried to recover from this devastating early blow to his presidency, Kennedy’s wariness of military advice transformed into a more generalized distrust for the advice of all experts.

* * *

      The Cuba story had a largely successful denouement the following year, with the famous October 1962 Cuban Missile Crisis. Although the United States knew by August of that year that unusual Soviet activity had been going on in Cuba, it was not until October 15th that intelligence officials definitively concluded that offensive missiles had been installed on the island, with a capacity to reach well over half of the United States. Over the next two weeks, the Cold War’s hottest crisis ensued. Kennedy’s strategy at the outset was to “broaden the group of consultants in order to ensure the widest possible judgments on how to end the Soviet threat peacefully, if possible,” notwithstanding the “poor record of his advisors on Cuba” (p.296). But Kennedy also “needed to guard against a domestic explosion of war fever, which meant hiding the crisis for as long as possible from the press and the public” (p.296).

     Kennedy’s Joint Chiefs of Staff predictably favored an air strike upon Cuba, followed up by a military invasion of the island. Several advisors, including former Secretary of State Dean Acheson, also urged air strikes against the missiles, with the possibility of subsequent military invasion. The aging Acheson, who disdained Kennedy, seems especially casual in Dallek’s account about using American military force. Defense Secretary McNamara was a counterpoint to the hawkish views of Acheson and of the military men under his command.

      McNamara developed early in the discussions the idea of a naval blockade rather than a military strike. The turning point came when Robert Lovett suggested that they call the blockade a “quarantine,” defining the U.S. action as “more of a defensive measure than an act of war” (p.315). Lovett’s “long experience in government and reputation for moderate good sense helped sway Kennedy. By contrast with Acheson, who urged prompt military action . . . Lovett thought the blockade was the best way to resolve the crisis, with force as a last resort” (p.315).

      Secretary of State Dean Rusk, whom Kennedy had considered weak and passive during the Bay of Pigs fiasco, revived his standing with Kennedy as a “cautious but steady presence” throughout the crisis, a “voice of reason that helped Kennedy resist the rash urgings of the military Chiefs” (p.333). Former Ambassador to the Soviet Union Llewellyn “Tommy” Thompson drew on his experience in Moscow to provide Kennedy with his assessment of how Soviet leader Nikita Khrushchev was likely to react and respond. Thompson thought that Khrushchev might be at odds with his own military chiefs and was able to convince Kennedy that “negotiating proposals might pressure [Khrushchev] into conciliatory talks” (p.313). Critical to the approach Kennedy finally adopted, Thompson advised the president to make it as easy as possible for Khrushchev to back down. Throughout the deliberations, Robert Kennedy retained his unique role, “less a thoughtful commentator” and more an “instrument of his brother’s ideas and intentions” (p.334).

      Even after Khrushchev ordered missile-bearing Soviet ships to turn around and had otherwise signaled to the United States his willingness to defuse the crisis, the Joint Chiefs continued to advocate for the air strike and military invasion option. Kennedy considered this option “mad” (p.332), and it appears even more so a half-century later. It is impossible to say, Dallek writes, “whether an invasion would have provoked a nuclear exchange with the Soviets.” But it is clear that the Soviets had “tactical nuclear weapons ready to fire if U.S. forces had invaded the island. Whether they would have fired them is unknowable, but the risk was there and certainly great enough for firings to occur in response to an invasion” (p.332).

      Having successfully defused the missile crisis, Kennedy “found it impossible to shelve plans for a change of regimes in Cuba” (p.373) during the remaining thirteen months of his administration prior to his assassination in Dallas in November 1963. But the nationalist uprising in Vietnam, and the South Vietnamese government’s inability to resist it, were another source of concern throughout the Kennedy administration.

* * *

     Kennedy appeared to accept the “domino theory,” that the fall of one developing country to international communism would lead to the fall of many if not most of its neighbors. He did not want to be the president who “lost” Vietnam, as Truman’s opponents labeled him the president who “lost” China. Equally important, he did not want to give the Republicans an issue they could use against him in the upcoming 1964 presidential elections. Yet, Kennedy was extremely reluctant to commit the United States to another land war in a distant location, all too reminiscent of the Korean War that had undermined Truman’s presidency. “For all Kennedy’s skepticism about involvement in a jungle war that could provoke cries of U.S. imperialism, he also saw Vietnam as a testing ground the United States could not ignore” (p.166-67). Kennedy never reconciled “his eagerness to prevent a communist victory in Vietnam” with his “reluctance, indeed refusal, to turn the conflict into America’s war, which risked [South Vietnam’s] collapse” (p.429).

     Dallek documents a series of tense and sharply divided internal meetings with the president on Vietnam. Not surprisingly, Kennedy’s career military advisors saw Vietnam primarily as a military problem, with a military solution. But, after the Cuban Missile Crisis, Kennedy seems to have concluded that they had little to offer in terms of substantive advice. Kennedy’s Deputy National Security Advisor Walt Rostow, a brilliant MIT professor with an “unlimited faith in social engineering” (p.165), also consistently offered hawkish views. Rostow was “apocalyptic about the consequences of inaction: ‘The whole world is asking. . . what will the U.S. do. . .?’ The outcome of indecisive U.S. action would be nothing less than the fall of Southeast Asia and a larger war” (p.243). McNamara, the putative boss of the military chiefs, initially favored the Rostow approach, as did Secretary of State Dean Rusk, although both ultimately came to advocate a political rather than military solution in Vietnam.

      John Kenneth Galbraith, the Harvard professor whom Kennedy had appointed as Ambassador to India, regularly sent letters directly to Kennedy, rather than through his boss, Secretary of State Rusk. Galbraith argued that there were no direct or obvious U.S. interests involved in Vietnam, and that it would be a mistake to commit American military resources to the defense of South Vietnam, its weak and wavering ally. Galbraith saw direct military involvement in Vietnam as leading the United States down the same path the French had traveled a decade earlier. Instinctively, Kennedy wanted to go with Galbraith’s position, but he never adopted that position, either. Rather, he mostly dithered.

     Kennedy repeatedly sent high-level advisors on short fact-finding trips to Vietnam. They typically returned to provide the president with upbeat reports on South Vietnam’s capabilities of defending itself, but with few if any realistic recommendations on how the United States should proceed. In September 1963, after the last such fact-finding trip to Vietnam during the Kennedy administration, General Victor Krulak, Special Assistant for Counterinsurgency to the Joint Chiefs of Staff, and Joseph Mendenhall, a State Department Asian expert, reported back to the president. Krulak “described a war that was moving in the absolutely right direction and was going to be won” (p.406), whereas Mendenhall saw an “entirely different universe: ‘a virtual breakdown of the civil government in Saigon’” (p.406-07). The astonished and plainly frustrated Kennedy retorted, “The two of you did visit the same country, didn’t you?”(p.407).

      The specific Vietnam item on Kennedy’s agenda by that time was whether to support a coup aimed at ridding South Vietnam of its leader Ngo Dinh Diem. By early 1963, the United States had concluded that Diem, a “staunch anticommunist Catholic” (p.230) with an “authoritarian and perhaps paranoid personality” (p.163), was unable to lead his country in resisting the North Vietnamese. What to do about Diem was the predominant issue over the final months of the Kennedy presidency, a “war within the war” (p.350). The pressure on Kennedy to give the go-ahead for a coup was “unrelenting” (p.403).

      But with no explicit orders from the president forthcoming, Undersecretary of State George Ball, acting in the absence of Secretary of State Rusk, finally told Ambassador Henry Cabot Lodge, Jr., in Saigon to tell anti-Diem generals that Washington approved a coup. Kennedy had “neither approved nor opposed a coup, but simply said he didn’t want it blamed on the United States. Kennedy’s uncertainty about what to do about Vietnam allowed advisers to fill the policy vacuum” (p.415). The coup took place on November 1, 1963, without Kennedy’s authorization and apparently with at best only minimal U.S. involvement. It ended with the assassination of Diem and his brother Nhu, rather than their exile, as Kennedy had desired.

     Kennedy allowed his administration’s Vietnam problem to “fester rather than confront a hard decision to expand U.S. involvement or shut it down,” Dallek writes. Kennedy’s hope was eventually to withdraw from Vietnam with “at least the appearance, if not the actuality, of victory. It was something of a pipe dream, but simply walking away from Vietnam did not strike him as a viable option – for both domestic political and national security reasons” (p.342).

     Dallek’s account of Kennedy’s Hamlet-like deliberations over Vietnam sets the stage for the question that Americans have been asking ever since: had Kennedy lived, would he have resisted the urgings, to which successor Lyndon Johnson succumbed, to escalate the war in Vietnam through large-scale U.S. military participation? There is plenty of evidence to support either a yes or a no answer, Dallek indicates, and it is “impossible to say just what Kennedy would have done about Vietnam in a second term, if he had had one.” But, “given the hesitation he showed about Vietnam during his thousand-day administration, it is entirely plausible that he would have found a way out of the conflict or at least not to expand the war to the extent Lyndon Johnson did” (p.419), Dallek concludes.

* * *

     Kennedy scholars may find that Dallek’s work contains little that is new or fresh about the already extensively studied Kennedy administration. Yet any reader who has worked in a bureaucracy, public or private, and has ever left a key meeting unsure whether the boss fully understood his or her brilliant arguments, is likely to appreciate Dallek’s close-up depictions of how the ever skeptical and often distrustful Kennedy interacted with his advisors. In Dallek’s telling, the boss fully understood his advisors’ arguments.

Thomas H. Peebles
La Châtaigneraie, France
November 24, 2015


Filed under American Politics, History, Politics, United States History

Moralizing Credibly to the World


Barbara Keys, Reclaiming American Virtue:
The Human Rights Revolution of the 1970s 

     During the 1970s, political liberalism in the United States embraced the notion of international human rights as a priority consideration in shaping American foreign policy. The liberal argument that gained traction during the latter portion of the decade was that the United States should not support or provide assistance to governments that engaged in practices violating international human rights norms, particularly torture and repression of dissent. But this liberal argument could gain its traction only after the end in early 1973 of America’s role as a belligerent in the Vietnam War.  Such is the premise which Barbara Keys, a Harvard-educated Senior Lecturer in American and International History at the University of Melbourne, Australia, expounds in her thoroughly researched and solidly written work, Reclaiming American Virtue: The Human Rights Revolution of the 1970s.

    Human rights as a “liberal foreign policy paradigm” was an “intellectual impossibility” while America was mired in Vietnam, Keys contends, and therefore “unthinkable in the circumstances of the war” (p.53).  As long as the war continued, a “profound fatigue with and abhorrence of the very idea of intervention precluded the development of any new, systematic effort to inject American power or values abroad . . . Only once the war was over would American liberals feel they could credibly moralize to the world” (p. 53-54).  What Keys describes as the “human rights revolution” of the 1970s in the United States was for American liberals an “emotional response to the trauma of the Vietnam War” (p.8) – or, as Keys’ title indicates, a means to reclaim American virtue.

* * *

     The term “human rights” came into vogue only after World War II, with the United Nations’ 1948 Universal Declaration of Human Rights, or UDHR, which established norms defining the basic rights that all humans were entitled to demand from their governments. Arising out of the destruction and devastation of World War II, the UDHR was one of the first international instruments to refer to human rights in general, rather than to the rights of specific groups. But the UDHR was mostly aspirational, a document “intended to be a beacon, not a guide to actual behavior” (p.22). It contained no enforcement mechanisms and numerous clauses indicated that it did not seek to infringe upon state sovereignty.

     Throughout the 1950s and 1960s, the term “human rights” was largely dormant in the United States, except as associated with the ineffectual UDHR, and played little discernible role in American foreign policy. These were also the decades when the term “civil rights” became part of the national vocabulary. Although civil rights might be thought of as the specific name for the movement for human rights for African-Americans, the two terms have different lineages. The notion of human rights, Keys emphasizes, seeks “legitimacy and solutions in international law resting above the authority of the nation-state,” whereas the civil rights movement in the United States above all sought “American remedies to American injustice” (p.33-34).

      When American involvement in the war in Vietnam ended in 1973, “emotions spilled into new areas, casting old questions in fresh light and creating novel possibilities for action. Slowly, as a process of accumulation rather than epiphany, human rights became one of those possibilities” (p.127-28). The end of combat activities in Vietnam “opened the way for members of Congress to vent long-brewing anger at the conduct and content of U.S. foreign policy” (p.133-34). A loose group of Congressmen dubbed the “new internationalists” pursued support for human rights abroad as part of an American foreign policy orientation that also prioritized economic cooperation, cultural exchanges and support for democracy, with less emphasis upon military assistance.

     Among the new internationalists, a now-obscure Democratic Congressman from Minnesota, Donald Fraser, more than any other national official, was “responsible for creating a framework that linked disparate global problems under the heading of human rights” (p.76). In the House of Representatives, Fraser led hearings in late 1973 that are “often regarded as the moment when a movement for international human rights in the United States began to take off,” generating a “blueprint for much of the congressional human rights efforts of the next few years” (p.141). The blueprint included several changes to the administration of American foreign aid that made it more difficult for the United States to provide assistance to foreign governments that engaged in human rights abuses, especially torture and detention of political prisoners. Section 32 of the 1973 Foreign Assistance Act, which came to be known as the “Fraser Amendment,” provided for “reductions (or, more often, the threat of reductions) in security aid for gross violations such as torture, coupled with the requirement that the State Department issue reports critiquing foreign countries’ human rights records” (p.165).

     In the aftermath of the Fraser Amendment, Congress used country-specific public hearings to “shape public opinion and signal concern about human rights abuses” (p.176). It focused on “sensational abuses, torture above all,” and made cuts in aid to “friendly but strategically expendable governments” (p.176). The results were “inevitably ad hoc and inconsistent, with some countries and some abuses drawing attention and sanctions while others were largely ignored” (p.176). Liberals hoped that cutting aid would stimulate reforms and reduce repression but, as Fraser and others admitted, they had “little evidence that targeting aid would work as planned” (p.160). Tangible effects were not, however, the measure of success. The crucial task was to “restore a commitment to American values by dissociating from regimes that tortured and murdered political opponents” (p.160) – and thereby reclaim American virtue.

     In Paraguay, for example, a country with “little significance to the United States,” human rights abuses were met with a “solid front: diplomatic isolation, total cutoffs in aid, and blocked loans in international forums” (p.257). Between 1974 and 1976, liberals also pushed through aid measures that reduced or cut off aid to South Korea, Chile, and Uruguay. Allies in these years included conservatives who supported dissidents in the Soviet Union, mostly Jewish, who wished to emigrate, most frequently to Israel.

     The spokesman for this group was another Democrat, albeit one considered highly conservative, Senator Henry “Scoop” Jackson from the State of Washington. Joining his cause were several intellectuals who were later labeled “neo-conservatives,” including Jeane Kirkpatrick, Irving Kristol and Daniel Moynihan. With Senator Jackson leading the charge in Congress, “unrepentant Cold Warriors took the rhetoric of human rights newly popularized internationally by Soviet dissidents and fashioned a straightforwardly anticommunist policy around the universalist language [of the UDHR]. It was a stunning shift in the rhetoric of conservative anticommunism, which in the 1950s and 1960s had been overtly hostile to the UN and . . . had seen UN human rights instruments as a dangerous threat to American values” (p.104).

      But this neo-conservative embrace of human rights was driven by a fervent rejection of the shame and guilt that had characterized the anti-Vietnam War movement and the campaign rhetoric of 1972 presidential candidate George McGovern. For the conservative proponents of Soviet Jewry, the Vietnam War “required no apology;” it had been not immoral but rather an “admirable expression of the nation’s moral principles, as well as a strategic necessity, and consonant with America’s consistently beneficent role in the world” (p.116).  Jackson and his cohorts believed that the “self-doubt provoked by the Vietnam War threatened to weaken America’s resolve in what remained a life-or-death struggle against communism” (p.104).

     The cause of human rights in the Soviet Union pulled liberals in two directions. While sympathetic to Jews who wished to emigrate, they also “strongly supported improved U.S.-Soviet ties, reduced tensions, and the broad aims of détente” which the Nixon and Ford administrations were pursuing. Their aims therefore “diverged from those of hardliners like Jackson who sought to derail détente” (p.125). The foil to this odd liberal-conservative alliance was Henry Kissinger, Secretary of State to Presidents Nixon and Ford.

      Kissinger expounded a realpolitik approach to foreign policy, which gave priority to America’s geo-political interests and allowed little room for judgments about a country’s internal human rights record. Kissinger argued that it was dangerous to “make the domestic policy of countries around the world a direct objective of American foreign policy” (p.133) at a time when the administration was seeking to reduce tensions with the Soviet Union and thereby reduce the risk of nuclear war. Although Kissinger believed that human rights initiatives would hurt relations with America’s allies, what most spurred his opposition was resentment at what he considered congressional intrusions into executive branch prerogatives to shape the nation’s foreign policy.

     For 1970s liberals, Kissinger was the personification of all that was wrong with the way American foreign policy was conducted. But neither did he have many fans among the neo-conservatives pushing the Soviet Union on Jewish emigration. They regarded détente with the Soviet Union, pursued by both the Nixon and Ford administrations, as wrong headed and dangerous. Kissinger’s adamant defense of realpolitik and executive prerogatives backfired, playing a “pivotal role in moving human rights from the sidelines to the center of American diplomacy,” Keys argues.  Ironically, Kissinger would be a serious contender for designation as the person “most responsible for advancing the cause of international human rights in the mid-1970s” (p.153), she writes.

      Jimmy Carter, who won the presidency in the 1976 election, is often thought of as the catalyst for bringing human rights into the mainstream of American foreign policy. As a presidential candidate, however, Carter had been skeptical about elevating human rights to a foreign policy priority position. He did not share the deep emotional concern of Jackson and his cohorts for Soviet Jews, “nor was it his instinct to identify with political prisoners around the world” (p.236). His embrace of human rights was “both late and serendipitous” (p.215). But Carter “eventually came around to the issue because it resonated with his theme of restoring morality and, more pragmatically, because it would enhance his standing among Jewish voters” (p.236).

     Discovering what human rights promotion meant in practice was for the Carter administration “far more complicated than anyone had anticipated. The difficulties the administration encountered in formulating a human rights agenda attest both to a lack of specific planning and the sheer novelty of a human rights based foreign policy. There were no precedents to draw on, no prior models from which to borrow,” leaving the impression of “incoherence and muddle” (p.250). Given inflation, gas lines and above all the 444-day hostage crisis in Iran, which the Carter administration was unable to resolve, Carter’s four-year term was frequently viewed as a failure.

     Ronald Reagan, who defeated Carter in the 1980 presidential election, explicitly disavowed human rights as a priority consideration in the foreign policy of his administration. But, thanks especially to a credible human rights lobby that had taken shape during the Carter administration, Reagan could not ignore human rights entirely. In particular, Keys emphasizes how the American branch of Amnesty International, AI USA, evolved during the Carter administration into an organization with serious clout on Capitol Hill and with the State Department.

      AI USA focused initially on political prisoners, lobbying for aid cuts to regimes that tortured and jailed opponents in large numbers, a narrow focus “ideally suited to the Zeitgeist of the seventies” (p.181), Keys argues. Rather than seeking to effectuate wholesale structural changes within selected governments, AI USA aimed more modestly at making specific and targeted changes to practices and individual behavior within those governments. Amnesty “resolutely portrayed itself as nonpartisan – indeed as beyond politics” (p.192). But despite its apolitical mantra, its “most prominent activities and the majority of its leaders and grassroots members were on the left of the political spectrum” (p.192). Charitable tax law enjoined the organization from directly lobbying the government and AI rules prohibited it from taking a position on foreign aid. The office nonetheless worked closely with State Department officials and sympathetic members of Congress, providing information, requesting action, and prodding them to ask questions.

      Keys concludes that in light of the terrorist attacks of September 11, 2001, and the United States’ protracted military involvement in Afghanistan and Iraq, “Americans seem to be losing interest in the idea [of human rights] as a guide to U.S. foreign policy” (p.277). While American public sentiment could well be turning inward, repudiation of human rights in the formulation of American foreign policy would be far more difficult today than it was in the Reagan administration. Several other human rights organizations have cropped up alongside AI USA, such as Human Rights Watch and Freedom House, to convey human rights concerns to Washington policy makers and the public. The clout of these organizations alone would make a repudiation of human rights unlikely. Moreover, the State Department is required to address human rights in a multitude of contexts.

      The Department’s annual country-by-country human rights report, coordinated by a vast bureaucracy within the State Department, the Bureau of Democracy, Human Rights, and Labor, details individual countries’ human rights records in a strikingly broad array of areas. The report is read closely and taken seriously around the world. Further, the United States’ anti-human trafficking legislation requires the State Department to produce another report, coordinated by another bureaucracy within the Department, which sets forth individual countries’ progress in curtailing human trafficking. The legislation provides for sanctions against those countries deemed to be making insufficient progress. During my career working in U.S. Embassies, I was frequently involved in the preparation of these reports.

       I was even more involved in what is termed “Leahy Vetting,” a process established by an amendment to the Foreign Assistance Act of 1961 sponsored by Vermont Senator Patrick Leahy. Leahy Vetting mandates a formal State Department determination that any specific instance of U.S. assistance to overseas law enforcement and security units will not include officers or units that have engaged in serious human rights abuses. Although realpolitik of the Kissinger variety has hardly disappeared from the United States’ foreign policy formulation process, today it competes with human rights and a wide range of other institutionalized considerations in determining that policy.

* * *

     As a means of “coming to terms with the Vietnam War” and a “way to heal the country” (p.3), the human rights revolution of the 1970s which Keys depicts represents still another legacy of the traumatic Vietnam conflict. But Keys also demonstrates that human rights rose to its prominent position as a result of diverse pressures and motivations, which she methodically ties together. Writing in straightforward if not quite riveting prose, Keys casts incisive light on an often overlooked aspect of modern American liberalism, now thoroughly mainstream; and on how and why the human rights records of other governments came to play a prominent role in defining America’s relationship with the rest of the world.

Thomas H. Peebles
La Châtaigneraie, France
November 3, 2015


Filed under American Politics, Politics, United States History, World History

Often Our Neighbors, Too Often Our Friends


Eric Lichtblau, The Nazis Next Door:
How America Became a Safe Haven for Hitler’s Men  

      Among those who served in Hitler’s killing machines and committed war crimes during the Second World War, countless numbers escaped punishment after the war ended in 1945. Many were deemed critical to the rebuilding of Germany, in both the Soviet and Western zones, and were welcomed into the post-war structures and institutions needed for that rebuilding. Others escaped to foreign destinations, often with the assistance of the Vatican and the Red Cross, with Latin America in particular a favored destination. Adolf Eichmann was one spectacular example, and one of the few who did not live out his life in Latin America in relative tranquility (see Deborah Lipstadt’s account of the Israeli capture and trial of Eichmann, reviewed here in October 2013).

     But, to a surprising extent, the United States was also a prominent and even welcoming destination for former Nazi war criminals, both Germans and collaborators from Nazi-occupied Eastern European countries. They ranged from camp guards to Nazi policymakers. They settled in all regions of the United States. Most lived unobjectionable lives in their adopted country as factory workers, businessmen, scientists, and even prominent religious leaders. In The Nazis Next Door: How America Became a Safe Haven for Hitler’s Men, Eric Lichtblau, an investigative reporter for The New York Times, weaves together several cases of suspected Nazi war criminals living in the United States. Lichtblau spotlights how the cases came to the attention of U.S. authorities, how they were handled, and the personalities on each side, those seeking to remove ex-Nazi war criminals from the United States and those opposing their removal.

* * *

     The Cold War between the Soviet Union and the Western powers that erupted almost simultaneously with the defeat of Nazi Germany was “always at the center of America’s calculations over what to do about the Nazis” (p.31), Lichtblau writes. By the early 1950s, Allen Dulles at the CIA, J. Edgar Hoover at the FBI, and a handful of other senior intelligence officials had in place around the globe a “formidable network of their own of loosely linked and far-flung ex-SS men and Nazi operatives. They were the spy agencies’ foot soldiers in the Cold War” (p.29). “Nobody hates the Commies more than the Nazis” seemed to be the justification U.S. agencies invoked, often shielding their sources from other U.S. agencies interested in tracking down Nazi criminals and holding them accountable.

      Within the United States, the network of former Nazis grew by its own momentum.

One ex-Nazi agent recruited to work for the United States would lead to the next, and the next; one anti-Communist spy ring made up of scores of ex-SS men would produce another, and another. . . [H]undreds of Nazi officers who were the nation’s sworn enemies just years earlier were now ostensibly on America’s side as spies, informants, and intelligence “assets”; fed and housed; paid and protected; dispatched and debriefed; code-named; cleansed, and coddled by their American handlers. That they had once worked for Hitler’s Third Reich was of little concern (p.30).

     Many of the ex-Nazis whisked into the United States were scientists, operating under a top-secret project named “Project Paperclip.” Although the project was officially closed to “ardent” Nazis who took part in wartime atrocities, the exclusion was what Lichtblau terms a “fig leaf, a bureaucratic cover that was routinely ignored, as the U.S. government brought in professionals with direct links to Nazi atrocities and helped them ‘cleanse’ their war record” (p.10). American officials were “determined to claim the Nazi brain trust for themselves,” regarding recruitment of top Nazi scientists as a “matter of survival in the postwar world” (p.24).

      The project included not just rocket scientists like Wernher von Braun but also “doctors and biologists; engineers and metallurgists; even a nutritionist, a printing pressman, and a curator of insects from the Berlin Museum” (p.25). The Soviet Union, the new enemy, was also enticing German scientists to its side with “all sorts of promises,” and there were reports that Moscow was “kidnapping unwilling scientists and bringing them to the Russian occupation zone. The Americans wanted their share. For both Washington and Moscow, Hitler’s scientists had become the spoils of war” (p.24).

          Lichtblau estimates that over 10,000 immigrants with clear ties to the Nazi regime found refuge in the United States, although the precise number will never be known because the United States had made it “so easy for them to fade seamlessly into the fabric of the country” (p.228). America’s lack of interest in identifying suspected Nazi war criminals after the war was “so prolonged, its obsession with the Cold War so acute, its immigration policies so porous, that Hitler’s minions had little reason to fear they would be discovered” (p.228). Yet, beginning in the 1970s, many were discovered, thanks to the work of a handful of individuals both within and outside the United States government.

      Within the government, Congresswoman Elizabeth Holtzman almost singlehandedly focused the attention of her legislative colleagues and American authorities on Nazi war criminals living in the United States. When Holtzman arrived in Congress, the Immigration and Naturalization Service (INS) had primary responsibility for the exclusion of Nazi war criminals, usually on the basis of having provided misleading or incomplete information for entry into the country (exclusion was the strongest sanction available to American authorities; no legislation criminalized Nazi atrocities committed during World War II, and the ex post facto clause of the U.S. Constitution would have barred post-war prosecutions of such acts).

       In 1978, Holtzman spearheaded a major change to American immigration legislation – termed the “Holtzman Amendment” – making participation in wartime persecution of civilians an independent basis for denaturalization and deportation. The following year, Holtzman engineered the creation of the Office of Special Investigations (OSI) as a unit within the Criminal Division of the United States Department of Justice. From that point, OSI led the government’s efforts to identify Nazi war criminals living in the United States and seek their removal from the country.

* * *

        Lichtblau’s case studies skillfully portray the personalities involved on all sides of the hunt for Nazi war criminals. Lichtblau begins with ex-Nazi SS recruit Tscherim “Tom” Soobzokov, and returns to Soobzokov’s improbable story at several subsequent points. From the North Caucasus (the area of today’s Russia between the Black and Caspian Seas, north of Georgia and Azerbaijan), Soobzokov was accused of having been “Hitler’s henchman” who “turned on his own people” and “led roaming Third Reich ‘execution squads’ that gunned down Jews and Communists” (p.xiii). Some called Soobzokov the Führer of the North Caucasus.

        After the war, the CIA recruited Soobzokov. He served for a while as an agency source in Jordan and, with CIA assistance in cleansing his wartime record, came to the United States in 1955. He settled in Paterson, New Jersey, where he became a mid-level county official and an influential member of the local Democratic political machine. With his passionate anti-communism, Soobzokov was also recruited by the FBI and charged with keeping track of other North Caucasus immigrants with potential communist leanings. In the late 1950s, however, the CIA concluded that Soobzokov was an “incorrigible fabricator” (p.64) and cut its ties with him.

     But Soobzokov remained an informant for the FBI, and his immigrant success story in Paterson continued unabated for another two decades, until 1977. That year, a best-selling book, Wanted: The Search for Nazis in America, written by Howard Blum, a young investigative reporter for The Village Voice, identified Soobzokov as one of the leading ex-Nazis living in the United States. Written in a “suspenseful style and an outraged tone” (p.117, a description that could also be applied to Lichtblau’s work), Wanted was another crucial factor in focusing Americans’ attention on the Nazi war criminals living in their midst.

     The Justice Department opened a case against Soobzokov, seeking to strip him of his American citizenship and remove him from the country on the ground that, when first admitted into the United States, he had “willfully concealed from the authorities his membership in the German SS during the war” (p.121). Soobzokov’s lawyers countered that Soobzokov had fully informed American authorities of the full extent of his SS involvement. To the dismay of the Justice lawyers, they were able to produce two State Department documents in CIA possession – after the State Department told Justice it maintained no records on Soobzokov – showing precisely what Soobzokov claimed: that prior to his admission into the country he had indeed fully informed American authorities of his role as an SS Nazi collaborator. Once the two documents had been authenticated, the Justice Department had no choice but to drop its suit against Soobzokov.

     Soobzokov also brought what seemed like an audacious libel suit against multiple individuals and entities, including the publisher of Wanted, an affiliate of The New York Times. The libel suit turned out to be one of the few that the newspaper agreed to settle. But before he could enjoy his apparent vindication, Soobzokov died of injuries suffered when a bomb went off at his New Jersey home. The crime was never solved, even though all indicia pointed to the militant Jewish Defense League as responsible.

      Among the scientists included in Project Paperclip, the most famous by far was Wernher von Braun, an admired figure in the United States despite having been what Lichtblau terms a “committed Nazi” who used “slave laborers in a mountain factory to build the V-2 rockets that bombed London” (p.10). Two decades later, with help from Walt Disney, von Braun became a “celebrated televangelist for space exploration” (p.93) in the United States and went on to play a prominent role with the National Aeronautics and Space Administration (NASA) in the 1969 Apollo moon-landing project. Von Braun was quite simply “too powerful and too revered to attack directly” (p.95), and his Nazi past never seemed to interest American authorities. This was not the case for the lesser-known Dr. Hubertus Strughold, who rose to prominence at NASA as America’s leading expert on “space medicine,” the effects of space travel upon the human body.

        “Struggie,” as he was called in America, had been a colonel in the German Luftwaffe and director of a Berlin research institute. He was tied to grisly experiments on human reaction to extreme conditions, both at his research institute and at the infamous Dachau prison camp. One experiment at Dachau locked prisoners in an airtight ball and subjected them to sudden changes in pressure to simulate rapid drops from high altitudes; many died. Another utilized what Nazi documentation termed “asocial gypsy half-breeds” (p.103) to test the effects of drinking seawater on airmen shot down over water. Strughold’s name was mentioned 61 times during the Nuremberg trials, where 23 medical doctors were tried and seven sentenced to death. Somehow, Strughold was not among those placed on trial at Nuremberg. Rather, he mysteriously showed up in the United States to launch a second career in his adopted country.

       When the INS began to focus on Strughold’s background in the early 1970s, Texas Congressman Henry Gonzalez came to his defense. The Congressman argued that Strughold was a “distinguished scientist of international reputation,” and that for the INS to subject him to public suspicion was “no better than the oppressors we abhor” (p.105). With Gonzalez’s support, the case against Strughold went away for about ten years, until the Justice Department began to refocus on him. But Strughold died while the investigation was still unfolding. In 2010, the Institute for Space Medicine finally ceased to call its yearly prize the “Strughold Award.”

     The most wrenching case Lichtblau presents involved Jacob Tannenbaum, a Jewish Holocaust survivor who lost most of his family to the Nazis, including his wife, infant daughter, and five siblings. Imprisoned by the Nazis, Tannenbaum became a kapo, a camp overseer who, other prisoners recounted, brought unusual cruelty to the task. Tannenbaum seemed to “thrive on the power the Nazis had given him,” routinely beating Jewish prisoners “even when the SS officers were not watching” (p.195-96). The case the Justice Department’s OSI brought against Tannenbaum “proved polarizing from the start” (p.197). The previous head of OSI, by then in private practice, told his former colleagues that he considered the case “dubious as a matter of law” and “improper, if not outrageous, as a matter of policy” (p.197).

       Although OSI nonetheless proceeded with the case, it allowed Tannenbaum to remain in the United States because of his poor health. In exchange, Tannenbaum gave up his American citizenship and admitted to “brutalizing and physically abusing prisoners outside the presence of German SS personnel” (p.197). The judge hearing the case, a war veteran who had been at Dachau after its liberation, was torn by its ethical complexities: “I have often wondered how much moral and physical courage we have a right to demand or expect of somebody in the position of Mr. Tannenbaum. . . I sometimes wonder whether I might have passed that test” (p.197).

       The most spectacular case, and the most spectacular failure for OSI, involved John Demjanjuk, a retired Ukrainian-American autoworker who changed his name from Ivan to John when he settled in Cleveland after World War II. In 1977, 18 survivors of the notorious Treblinka camp in Poland identified Demjanjuk as “Ivan the Terrible,” a guard at Treblinka trained by the Germans to operate its gas chambers. Ivan was a man of “monstrous savagery . . . the barbaric executioner, a sadist who corralled women and children in the gas chamber, beating and torturing them as they went” (p.202-03).

      Largely on the basis of the Treblinka survivors’ identification, a court in Cleveland stripped Demjanjuk of his United States citizenship, and he was extradited to Israel, where he became the first ex-Nazi to be tried there since Eichmann. After a lengthy trial, Demjanjuk was convicted of war crimes and sentenced to death. But five years later, the Israeli Supreme Court overturned the conviction when new evidence, made available by the Soviet Union, indicated that Demjanjuk had been a lower-level guard at Sobibor, another camp in Poland, not Treblinka.

     The Demjanjuk case marked a low point for OSI. The federal appeals court in Cincinnati severely criticized the unit, suggesting that Jewish advocacy groups had unduly influenced its pursuit of the wrong man. OSI nonetheless proceeded with a second case against Demjanjuk, this time for his role in the killing of 27,900 Jews at Sobibor. Demjanjuk was extradited a second time, to Germany in 2009, to face charges in a Munich court. In May 2011, the Munich court found him guilty and sentenced him to five years’ imprisonment. Demjanjuk died not long after being sentenced.

       In addition to the usual array of family members, Demjanjuk had on his side Patrick Buchanan. An advisor to presidents Nixon and Reagan and a writer and television commentator, the fiercely anti-communist Buchanan opposed the deportation of numerous individuals targeted by U.S. authorities for having participated in Nazi war crimes. Over and over, Buchanan argued that the U.S. Nazi hunters were going after wrongly accused elderly men who were defenseless and presumed guilty. Testimony from witnesses who survived the Nazis was deeply suspect, Buchanan contended, with a “Holocaust survivor syndrome” leading to “group fantasies of martyrdom and heroics” (p.194). Lichtblau does not delve into Buchanan’s psyche, but he quotes Buchanan as having written that, for all his faults, Hitler was an “individual of great courage” (p.194).

      Buchanan’s opposite was investigative journalist Chuck Allen. Before public attention turned in the 1970s to the issue of Nazi war criminals living in the United States, Allen more than any other individual kept the issue alive. A Swarthmore graduate with a Quaker background, the brash Allen was a “modern Don Quixote, armed with a poison pen instead of a lance. . . [who] tilted not at windmills, but at swastikas” (p.78). If Americans were blind to the Holocaust and its aftermath, Allen figured he would “strong-arm them into remembering” (p.78). Well ahead of other journalists and the United States government, Allen gained access to the Soviet Union’s treasure trove of documents and eyewitness accounts of Nazi atrocities. The Russians had “long accused the United States of going easy on Nazi collaborators, and so they were eager to help Allen in his research” (p.119). Although Allen’s journalistic pieces failed to gain much national traction, he paved the way for other journalists and U.S. government agencies to begin to shine a spotlight on “Nazi war criminals in our midst” (p.77), as Allen framed the issue.

      Among these agencies, the Department of Justice’s Office of Special Investigations (OSI), created in 1979 to energize the effort to identify and take legal action against ex-Nazis found in the United States, receives most of Lichtblau’s attention. Given its failed cases against Soobzokov and Demjanjuk, as well as the controversy surrounding the Tannenbaum case, readers might conclude that OSI fell far short of the objectives Congress had in mind when it created the unit. But OSI won most of the cases it brought, despite the difficulty of marshaling decades-old evidence and of relying on traumatized and elderly witnesses to make cases against defendants who were themselves elderly and often in poor health. One small criticism of Lichtblau’s otherwise superb account is that he could have given greater emphasis to the extent of OSI’s success in excluding former Nazi operatives from the United States.

      Today, the OSI mission of identifying and proceeding against former Nazi operatives has nearly run its course. Any putative Nazi war criminals still alive are almost certainly well into their 90s (a person 90 years old this year would have been only 20 when World War II ended in 1945) and likely to die before protracted legal proceedings against them could be completed. OSI itself has been folded into a unit termed Human Rights and Special Prosecutions, which has a broader mandate to seek sanctions against any human rights violators with connections to the United States.

* * *

      Lichtblau’s readers are likely to be surprised to learn that in the years following World War II, key agencies such as the CIA and FBI, driven by Cold War imperatives, were entirely indifferent to notions of accountability for individuals living in the United States who had participated in wartime atrocities on behalf of Hitler’s Third Reich. It was not until the 1970s that the American government began to take such notions seriously. With few if any legal proceedings against Nazi operatives likely to unfold in the future, Lichtblau’s disquieting story serves as a timely summation of the United States’ uneven record in dealing with former Nazis living comfortably within its borders.


Thomas H. Peebles

La Châtaigneraie, France

October 3, 2015

