Liberals, Where Are They Coming From?

 

Helena Rosenblatt, The Lost History of Liberalism: From Ancient Rome

to the Twenty-First Century

(Princeton University Press) 

If you spent any time watching or listening to the political conventions of the two major American parties last month, you probably did not hear the word “liberal” much, if at all, during the Democratic National Convention.  But you may have heard the word frequently at the Republican National Convention, with liberalism perhaps described as something akin to a “disease or a poison,” or a danger to American “moral values.”  These, however, are not the words of Donald Trump Jr. or Rudy Giuliani, but rather of Helena Rosenblatt, a professor at the Graduate Center, City University of New York, in The Lost History of Liberalism: From Ancient Rome to the Twenty-First Century (p.265).  American Democrats, Rosenblatt further notes, avoid using the word “liberal” to describe themselves “for fear that it will render them unelectable” (p.265). What the heck is wrong with being a “liberal”? What is “liberalism” after all?

Rosenblatt argues that we are “muddled” about what we mean by “liberalism”:

People use the term in all sorts of different ways, often unwittingly, sometimes intentionally. They talk past each other, precluding any possibility of reasonable debate. It would be good to know what we are speaking about when we speak about liberalism (p.1).

Clarifying the meaning of the terms “liberal” and “liberalism” is the lofty goal Rosenblatt sets for herself in this ambitious work, a work that at its heart is an etymological study — a “word history of liberalism” (p.3) — in which she explores how these two terms have evolved in political and social discourse over the centuries, from Roman to present times.

The word “liberal,” Rosenblatt argues, took on an overtly political connotation only in the early 19th century, in the aftermath of the French Revolution. Up until that time, beginning with the Roman authors Cicero and Seneca, through the medieval and Renaissance periods in Europe, “liberal” was a word referring to one’s character.  Being “liberal” meant demonstrating the “virtues of a citizen, showing devotion to the common good, and respecting the importance of mutual connectedness” (p.8-9).  During the 18th century Enlightenment, the educated public began for the first time to speak not only of liberal individuals but also of liberal sentiments, ideas, ways of thinking, even constitutions.

Liberal political principles emerged as part of an effort to safeguard the achievements of the French Revolution and to protect them from the forces of extremism — from the revolution’s most radical proponents on one side to its most reactionary opponents on the other.  These principles included support for the broad ideals of the French Revolution, “liberté, égalité, fraternité;” opposition to absolute monarchy and aristocratic and ecclesiastical privilege; and such auxiliary concepts as popular sovereignty, constitutional and representative government, the rule of law and individual rights, particularly freedom of the press and freedom of religion.  Beyond that, what could be considered a liberal principle was “somewhat vague and debatable” (p.52).

Rosenblatt is strongest on how 19th century liberalism evolved, particularly in France and Germany, but also in Great Britain and the United States.  France and French thinkers were the center points in the history of 19th century liberalism, she contends, while Germany’s contributions are “usually underplayed, if not completely ignored” (p.3).  More cursory is her treatment of liberalism in the 20th century, packed into the last two of eight chapters and an epilogue.  The 20th century in her interpretation saw the United States and Great Britain become centers of liberal thinking, eclipsing France and Germany.  But since World War II, she argues, liberalism as defined in America has limited itself narrowly to the protection of individual rights and interests, without the moralism or dedication to the common good that were at the heart of 19th and early 20th century liberalism.

From the early 19th century through World War II, Rosenblatt insists, liberalism had “nothing to do with the atomistic individualism we hear of today.”  For a century and a half, most liberals were “moralists” who “never spoke about rights without stressing duties” (p.4).  People have rights because they have duties.  Liberals rejected the idea that a viable community could be “constructed on the basis of self-interestedness alone” (p.4).  Being a liberal meant “being a giving and a civic-minded citizen; it meant understanding one’s connectedness to other citizens and acting in ways conducive to the common good” (p.3-4).  The moral content of the political liberalism that emerged after the French Revolution constitutes the “lost” aspect of the history that Rosenblatt seeks to bring to light.

Throughout much of the 19th century, however, being a liberal did not mean being a democrat in the modern sense of the term.  Endorsing popular sovereignty, as did most early liberals, did not mean endorsing universal suffrage.  Voting was a trust, not a right.  Extending suffrage beyond property-holding males was an invitation to mob rule.  Only toward the end of the century did most liberals accept expansion of the franchise, as liberalism gradually became synonymous with democracy, paving the way for the 20th century term “liberal democracy.”

While 19th century liberalism was often criticized as opposed to religion, Rosenblatt suggests that it would be more accurate to say that it opposed the privileged position of the Catholic Church and aligned more easily with Protestantism, especially some forms emerging in Germany (although a small number of 19th century Catholic thinkers could also claim the term liberal).  But by the middle decades of the 19th century, liberalism’s challenges included not only the opposition of monarchists and the Catholic Church, but also what came to be known as “socialism” — the political movements representing a working class that was “self-conscious, politicized and angry” (p.101) as the Industrial Revolution was changing the face of Europe.

Liberalism’s response to socialism gave rise in the second half of the 19th century to the defining debate over its nature: was liberalism compatible with socialist demands for government intervention in the economy and direct government assistance to the working class and the destitute?  Or were the broad objectives of liberalism better advanced by the policies of economic laissez faire, in which the government avoided intervention in the economy and, as many liberals advocated, rejected what was termed “public charity” in favor of concentrating upon the moral improvement of the working classes and the poor so that they might lift themselves out of poverty?  This debate carried over into the 20th century and, Rosenblatt indicates, is still with us.

* * *

With surprising specificity, Rosenblatt attributes the origins of modern political liberalism to the work of the Swiss couple Benjamin Constant and his partner Madame de Staël, born Anne-Louise Germaine Necker, the daughter of Jacques Necker, a Swiss banker who served as finance minister to French King Louis XVI (Rosenblatt is also the author of a biography of Constant).  The couple arrived in Paris from Geneva in 1795, a year after the so-called Reign of Terror had ended with the execution of its most prominent advocate, Maximilien Robespierre.  As they reacted to the pressing circumstances brought about by the revolution, Rosenblatt contends, Constant and de Staël formulated the cluster of ideas that collectively came to be known as “liberalism,” although neither ever termed their ideas “liberal.”  Constant, the “first theorist of liberalism” (p.66), argued that it was not the “form of government that mattered,” but rather the amount of authority it wielded. “Monarchies and republics could be equally oppressive. It was not to whom you granted political authority that counted, but how much authority you granted.  Political power is dangerously corrupting” (p.66).

Influenced in particular by several German theologians, Constant spoke eloquently about the need for a new and more enlightened version of Protestantism in the liberal state.  Religion was an “essential moralizing force” that “inspired selflessness, high-minded principles, and moral values, all crucial in a liberal society. But it mattered which religion, and it mattered what its relationship was to the state” (p.66).  A liberal government needed to be based upon religious toleration, that is, the removal of all legal disabilities attached to the faith one professed.  Liberalism envisioned strict separation of church and state and what we would today call “secularism,” ideas that placed it in direct conflict with the Catholic Church throughout the 19th century.

Constant and Madame de Staël initially supported Napoleon Bonaparte’s 1799 coup d’état.  They hoped Napoleon would thwart the counterrevolution and consolidate and protect the core liberal principles of the revolution. But as Napoleon placed the authority of the state in his own hands, pursued wars of conquest abroad, and allied himself with the Catholic Church, Constant and Madame de Staël became fervent critics of his increasingly authoritarian rule.

After Napoleon fell from power in 1815, an aggressive counter-attack on liberalism took place in France, led by the Catholic Church, in which liberals were accused of trying to “destroy religion, monarchy, and the family.  They were not just misguided but wicked and sinful.  Peddlers of heresy, they had no belief in duty, no respect for tradition or community.  In the writings of counter-revolutionaries, liberalism became a virtual symbol for atheism, violence, and anarchy” (p.68).  English conservative commentators frequently equated liberalism with Jacobinism.  For these commentators, liberals were “proud, selfish and licentious,” primarily interested in the “unbounded gratification of their passions” while refusing “restraints of any kind” (p.76).

Liberals’ hopes were buoyed, however, when the bloodless three-day 1830 Revolution in France deposed the ultra-royalist and strongly pro-Catholic Charles X in favor of the less reactionary Louis Philippe.  Among those initially supporting the 1830 Revolution was Alexis de Tocqueville, 19th century France’s most consequential liberal thinker after Constant and Madame de Staël.  Tocqueville famously toured the United States in the 1830s and offered his perspective on the country’s direction in Democracy in America, published in two volumes in 1835 and 1840, followed by his analysis in 1856 of the implications of the French Revolution, The Old Regime and the Revolution.

Tocqueville shared many of the widespread concerns of his age about democracy, especially its tendency to foster egoism and individualism.  He worried about the masses’ lack of “capacity.” He was one of the first to warn against what he called “democratic despotism,” where majority sentiment would be in a position to override the rights and liberties of minorities.  But Tocqueville also foresaw the forward march of democracy and the movement toward equality of all citizens as unstoppable, based primarily upon what he had observed in the United States (although he was aware of how the institution of slavery undermined American claims to be a society of equals).  Tocqueville counseled liberals in France not to try to stop democracy, but, as Rosenblatt puts it, to “instruct and tame” democracy, so that it “did not threaten liberty and devolve into the new kind of despotism France had seen under Napoleon” (p.95).

Tocqueville’s concerns about democracy and “excessive” equality were related to anxieties about how to accommodate the diverse movements that termed themselves socialist.  Initially, Rosenblatt stresses, the term socialist described “anyone who sympathized with the plight of the working poor . . . [T]here was no necessary contradiction between being liberal and being socialist” (p.103).  The great majority of mid-19th century liberals, she notes, whether British, French, or German, believed in free circulation of goods, ideas and persons but were “not all that averse to government intervention” and did not advocate “absolute property rights” (p.114).

In the last quarter of the 19th century, a growing number of British liberals began to favor a “new type of liberalism” that advocated “more government intervention on behalf of the poor.  They called for the state to take action to eliminate poverty, ignorance and disease, and the excessive inequality in the distribution of wealth.  They began to say that people should be accorded not just freedom, but the conditions of freedom” (p.226).  French commentators in the same time period began to urge that a middle way be forged between laissez-faire and socialism, termed “liberal socialism,” where the state became an “instrument of civilization” (p.147).

But it was in 1870s Germany where the debate crystallized between what came to be known as “classical” laissez faire liberalism and the “progressive” version, thanks in large part to the unlikely figure of Otto von Bismarck.  Although no liberal, Bismarck, who masterminded German unification in 1871 and served as the first Chancellor of the newly united nation, instituted a host of sweeping social welfare reforms for workers, including full and comprehensive insurance against sickness, industrial accidents, and disability.  Most historians attribute his social welfare measures to a desire to coopt and destroy the German socialist movement (a point Jonathan Steinberg makes in his masterful Bismarck biography, reviewed here in 2013).

Bismarck’s social welfare measures coincided with an academic assault on economic laissez faire led by a school of “ethical economists,” a small band of German university professors who attacked laissez faire with arguments that were empirical but also moral, based on a view of man as not a “solitary, self-interested individual” but a “social being with ethical obligations” (p.222).  Laissez-faire “allowed for the exploitation of workers and did nothing to remedy endemic poverty,” they contended, “making life worse, not better, for the majority of the inhabitants of industrializing countries” (p.222).  Industrial conditions would “only deteriorate and spread if governments took no action” (p.222).

In the late 19th and early 20th centuries, many young Americans studied in Germany under the ethical economists and their progeny.  They returned to the United States “increasingly certain that laissez-faire was simply wrong, both morally and empirically,” and “began to advocate more government intervention in the economy” (p.226).  On both sides of the Atlantic, liberalism and socialism were drawing closer together, but the debate between laissez faire liberalism and the interventionist version played out primarily on the American side.

* * *

During World War I, Rosenblatt argues, liberalism, democracy and Western civilization became “virtually synonymous,” with America, because of its rising strength, “cast as their principal defender” (p.258).  Germany’s contribution to liberalism was progressively forgotten or pushed aside and the French contribution minimized.  Two key World War I era American thinkers, Herbert Croly and John Dewey, contended that only the interventionist, or progressive, version of liberalism could claim to be truly liberal.

Croly, cofounder of the flagship progressive magazine The New Republic, delivered a stinging indictment of laissez-faire economics and a strong argument for government intervention in his 1909 work, The Promise of American Life.  By 1914, Croly had begun to call his own ideas liberal, and by mid-1916 the term was in common use in The New Republic as “another way to describe progressive legislation” (p.246).

The philosopher John Dewey acknowledged that there were “two streams” of liberalism.  But one was more humanitarian and therefore open to government intervention and social legislation, while the other was “beholden to big industry, banking, and commerce, and was therefore committed to laissez-faire” (p.261).  American liberalism, Dewey contended, had nothing to do with laissez-faire, and never had.  Nor did it have anything to do with what was called the “gospel of individualism.”  American liberalism stood for “‘liberality and generosity, especially of mind and character.’ Its aim was to promote greater equality and to combat plutocracy with the aid of government” (p.261).

Rosenblatt credits President Franklin D. Roosevelt’s New Deal with demonstrating how progressive liberalism could work in the political arena. Roosevelt, 20th century America’s most talented liberal practitioner, consistently claimed the moral high ground for liberalism.  He argued that liberals believed in “generosity and social mindedness and were willing to sacrifice for the public good” (p.261).  For Roosevelt, the core of the liberal faith was a belief in the “effectiveness of people helping each other” (p.261). But despite his high-minded advocacy for progressive liberalism – buttressed by his leadership of the country during the Great Depression and in World War II – Roosevelt did not vanquish the argument that economic laissez faire constituted the “true” liberalism.

In 1944, with America at war with Nazi Germany and Roosevelt within months of an unprecedented fourth term, the eminent Austrian economist Friedrich Hayek, then teaching at the London School of Economics, published The Road to Serfdom, the 20th century’s most concerted intellectual challenge to the interventionist strand of liberalism.  Any sort of state intervention or “collectivist experiment” threatened individual liberty and put countries on a slippery slope to fascism, Hayek argued in his surprise best seller.  Hayek grounded his arguments in English and American notions of individual freedom.  “Progressive liberalism,” which he considered a contradiction in terms, had its roots in Bismarck’s Germany, he argued, and led ineluctably to totalitarianism.  “[I]t is Germany whose fate we are in some danger of repeating” (p.268), Hayek warned his British and American readers in 1944.

Although Hayek always insisted that he was a liberal, his ideas became part of the American post-World War II conservative argument against both fascism and communism (meanwhile, in France laissez faire economics became synonymous with liberalism; “liberal” is a political epithet in today’s France, but one that means a free market advocate, diametrically opposed to its American meaning).  During the anti-Communist fervor of the Cold War that followed World War II, the interventionist liberalism that Croly and Dewey had preached and Roosevelt had put into practice was labeled “socialist” and even “communist.”  To American conservatives, those who accepted the interventionist version of liberalism were not really liberal; they were “totalitarian.”

* * *

The intellectual climate of the Cold War bred defensiveness in American liberals, Rosenblatt argues, provoking a need to “clarify and accentuate what made their liberalism not totalitarianism. It was in so doing that they toned down their plans for social reconstruction and emphasized, rather, their commitment to defending the rights of individuals” (p.271).  Post-World War II American liberalism thus lost “much of its moral core and centuries-long dedication to the public good.  Individualism replaced it as liberals lowered their sights and moderated their goals” (p.271).  In bowing to Cold War realities, American liberals in the second half of the 20th century “willingly adopted the argument traditionally used to malign them . . . that liberalism was, at its core, an individualist, if not selfish, philosophy” (p.273).  Today, Rosenblatt finds, liberals “overwhelmingly stress a commitment to individual rights and choices; they rarely mention duties, patriotism, self-sacrifice, or generosity to others” (p.265-66).

Unfortunately, Rosenblatt provides scant elaboration for these provocative propositions, rendering her work incomplete.  A valuable follow up to this enlightening and erudite volume could concentrate on how the term “liberalism” has evolved over the past three quarters of a century, further helping us out of the muddle that surrounds the term.

Thomas H. Peebles

La Châtaigneraie, France

September 7, 2020

 



Being Ordinary in Nazi Germany

 

Konrad Jarausch, Broken Lives:

How Ordinary Germans Experienced the Twentieth Century  

(Princeton University Press)

The words “ordinary Germans” in Konrad Jarausch’s Broken Lives: How Ordinary Germans Experienced the Twentieth Century sent me back immediately to a searing academic debate that riveted a recently reunited Germany and much of the rest of the world in the 1990s between two American scholars, Christopher Browning and Daniel Goldhagen.  Browning and Goldhagen sparred over the extent to which the anti-Semitism that gave rise to the Holocaust, Nazi Germany’s project to eradicate Europe’s Jewish population, was engrained in the German population.

Browning’s 1992 work Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland, and Goldhagen’s Hitler’s Willing Executioners: Ordinary Germans and the Holocaust, published four years later, both examined mass executions committed by Reserve Battalion 101 of the German Uniformed Police Force (Ordnungspolizei) in Poland in 1942.  The reservists had been randomly conscripted and were mostly middle-aged, with few ideological fanatics or even Nazi party members among them.  Their commander, the two scholars agreed, gave the reservists the opportunity not to participate in the killing, but most declined to opt out.  The two scholars offered contrasting interpretations of why so many chose, seemingly willingly, to participate in mass killing, with the difference captured by the words ordinary “men” in Browning’s title and ordinary “Germans” in that of Goldhagen.

Browning attributed the reservists’ participation in mass atrocities to universal factors, such as peer pressure, deference to authority, and adaptation to roles within an occupation unit stationed in enemy territory during wartime — attributes of human nature and group dynamics not unique to Germany or Germans.  Goldhagen, by contrast, emphasized what he described as an “eliminationist” anti-Semitism so deeply embedded in German history and culture that it caused virtually all Germans to kill Jews with enthusiastic cruelty when given the opportunity.  In essence, Goldhagen was arguing that a uniquely lethal anti-Semitism was embedded into the German national character.

There is no explicit mention of the Browning-Goldhagen exchange in Broken Lives, although implicitly Jarausch seems to be distancing himself from Goldhagen in this work (and it’s worth noting that Jarausch and Browning both served in the history department at the University of North Carolina and both did their graduate work at the University of Wisconsin in the same time frame).  But Jarausch’s work might be considered an attempt to restore an ordinary meaning to the term “ordinary Germans.”  Broken Lives gathers together a treasure trove of around 70 autobiographies, memoirs, diaries and journals composed by Germans who lived through and survived the Nazi era, ranging, Jarausch indicates, from enthusiastic supporters of the Nazi regime to courageous opponents.

Somewhat awkwardly, Jarausch calls his group the “Weimar cohort,” although he also uses “memoirists” and “authors.”  Most were born in the decade after World War I and retain childhood memories of the Germany of the 1920s, during its ill-fated experiment with liberal democracy, the Weimar Republic.  About two-thirds of the Weimar cohort were men, one-third women.  These are not “elite memoirs,” Jarausch emphasizes, but rather “untutored accounts” which provide a “more vivid and personal picture of what it meant to live through the twentieth century” (p.11).

The Weimar cohort consists mostly of “apolitical folks who merely took pride in surviving the Third Reich through ingenious stratagems” (p.4; among the cohort, assiduous readers of this blog will recall Joachim Fest, later a noted German historian, whose Not I, a memoir on growing up in the Third Reich, was reviewed here in 2016).  The cohort does not include those “fanatical Nazis” who “did not want to write about their complicity in crimes” (p.4).  Alluding to his title, Jarausch observes that the autobiographies and memoirs collectively demonstrate that Germans “overwhelmingly experienced a sense of broken lives in the twentieth century, disrupted beyond repair” (p.12), in which the normal life cycle progression from childhood to adulthood was interrupted by forces outside the memoirists’ control.

Jarausch utilizes the individual accounts to compile a fact-intensive, before-during-after narrative of what everyday life was like from World War I onward in Germany, divisible into five segments: 1) the Weimar cohort’s youth in the 1920s; 2) the different ways members of the cohort reacted as the Nazis gained power in 1933 and tightened their grip as the decade proceeded; 3) the cohort’s war experiences; 4) their immediate post-war experiences, in the initial months and years after the end of hostilities; and 5) longer term post-war experiences, as Germany became a Cold War battleground, with a Soviet satellite in the East and a largely capitalist West, and thereafter reunited in 1990 (these five segments coincide with those that Columbia University professor Fritz Stern described in his 2006 memoir, Five Germanys I Have Known; Stern is listed among the “minor individuals” who contributed to Broken Lives, indicating that Jarausch makes only limited reference to his contribution).

Virtually all the Weimar cohort reported “surprisingly benign” childhoods during the 1920s, “quite happy in contrast to later suffering” (p.368-69).  The period of innocence began to give way to more trying times for the cohort with the stock market crash of 1929 and the ensuing worldwide economic depression, which hit Germany particularly hard.  The Nazi assumption of political power in 1933 “largely dispelled ‘the carefree atmosphere of childhood’” for the cohort (p.63-64).  The war itself, which began in September 1939 and lasted through May 1945, constitutes the pivotal experience of the autobiographies and memoirs, with some memoirists detailing life on the front lines and in combat, while others, mostly women, describe their lives on the home front.  In the post-war period, memoirists living in West Germany enjoyed economic prosperity and “considerable freedom for individual growth that allowed a plurality of life plans to unfold into maturity and old age” (p.303), while the autobiographies and memoirs of those living in East Germany exhibit a “peculiarly defensive character, since their authors attempt to justify individual lives in spite of having to admit the [East German] system’s failure” (p.357).  West German memoirists supported reunification in 1990 more enthusiastically than did those from East Germany.

In his narrative, Jarausch uses the material that the cohort provided primarily in short snippets, inserted into the text in a mechanical manner.  The members of the cohort come and go quickly, a short sentence about x here, another about y there; then we don’t see x or y again for several pages, or until the next or a following chapter.  We never really develop a rapport with x or y, or any of the 17 “key individuals” whose experiences are the flesh and bones of this work.  Despite the richness of the historical raw material, the resulting narrative is wooden and desultory.  Moreover, Jarausch does little questioning of his sources in the narrative.

Almost all the memoirs are retrospectives, written many years – most often decades – after the events they recount, where experiences and memories “blend in a unique fashion that can make it difficult to tell them apart.” The memoirs are thus “selective, biased, and exculpatory, offering an incomplete picture” (p.13).  While their assertions nonetheless may be valuable in “showing how earlier experiences are remembered” (p.13), Jarausch argues, for the most part they are incorporated as fact into the text, with few attempts at qualification.

Women’s memoirs differ from those of men, Jarausch notes at several points.  Both male and female memoirists used their texts to reconcile early Nazi enthusiasm with later disillusionment.  But Jarausch finds more ambivalence in the memoirs of women about their roles in the Third Reich, especially their war experiences.  Drafted by the Wehrmacht and sent to the front, men experienced the war as a “paroxysm of gendered violence against enemies, foreign women, and racial inferiors” (p.102).  Most women, in contrast, “followed a traditional pattern of caring for their families, dealing with shortages, and keeping up the home front.  At the same time, they coped with the separation from the men by writing letters to soldiers and sending packages in order to keep up their morale” (p.187).

As war fortunes worsened for Germany, many women were required to go to work in war production facilities.  Women were equally exposed to Allied saturation bombing, which often failed to differentiate military from civilian targets.  With the collapse of the Eastern Front and the advance of the Red Army, many women had to organize precipitous flight, combined with the “crowning indignity of mass rape, with its attendant brutalization, pregnancy, and shame that sexually signaled defeat” (p.370).  After the war, the female memoirists were frequently “haunted by ‘anger, grief, shame, and remorse’” (p.187), more so than the men.

Any portrait of Germans being ordinary in extraordinary times cannot escape issues of German victimhood, comparative suffering, and collective guilt.  Jarausch notes the tendency of the autobiographies and memoirs to emphasize personal suffering while making at best only passing reference to the pain of Nazi victims.  Most of his authors, writing years and decades after the Nazi era, saw themselves as “misguided victims in order to minimize their own contribution to the Third Reich” (p.95).  German soldiers, instead of being heroes, came to “see themselves as betrayed victims of a megalomaniac Führer and Nazi dictatorship.  Because their heroism had become meaningless, they were left only with claims of victimization” (p.231).

In several passages, Jarausch considers the degree to which the ordinary Germans of the Weimar cohort might have aided and abetted the crimes of the Third Reich, as close as he comes to weighing in on the Browning-Goldhagen debate.  The memoirs and autobiographies of the cohort suggest that “more ordinary Germans were involved in the Holocaust than apologists admit, but at the same time fewer participated than some critics claim” (p.218), he writes.  The memoirs and autobiographies “only hint at the full extent of the violence because its graphic description would be too unsettling” (p.134).  Although none admitted to involvement in militarily unnecessary atrocities, the texts “do reveal a widespread knowledge of the Nazi project of annihilation” (p.135).  While most of the cohort “witnessed the persecution without intervening and did their duty during the war effort,” many continued after the war to “claim not to have harmed anyone directly” (p.218-19).

Jarausch addresses the controversy over the exhibition of the Hamburg Institute of Social Research, “War of Annihilation: Crimes of the Wehrmacht 1941–1944,” which toured Germany between 1995 and 1999, at the same time the Browning-Goldhagen debate was roiling the country.  The exhibit raised the question whether the German army, the Wehrmacht, should itself be considered a criminal organization, and if so, whether that makes everyone who was part of it a criminal (in Learning from the Germans: Race and the Memory of Evil, reviewed here last month, Susan Neiman also considered the impact of this exhibit).  Without answering the question directly, Jarausch concludes that the autobiographical accounts “tend to support the version of considerable military participation in atrocities and murder” (p.103), hardly a startling conclusion.

Overall, the dominant experience shared in the retrospectives of the Weimar cohort, Jarausch concludes, was the “disruption of their own life courses by historic forces outside of their control.  Their existence was not the expected progression from happy childhood via turbulent adolescences to mature adulthood with professional success and loving family, but rather a constant struggle against the surprising challenges of depression, dictatorship, war, privation, and the like” (p.366).  Capturing how ordinary Germans lived through and survived the Nazi era is assuredly a tale worth telling.  But while Broken Lives provides much valuable ground-level information about the ordinary Germans of the Weimar cohort, it falls disappointingly short in creating affinity with the individual members of the cohort.

 

Thomas H. Peebles

La Châtaigneraie, France

August 21, 2020

 



German Lessons: Is Mississippi Learning?

 

Susan Neiman, Learning from the Germans:

Race and the Memory of Evil (Farrar, Straus & Giroux) 

Less than two months ago, protests and public demonstrations erupted on an unprecedented scale across the United States and throughout the world over the killing of African American George Floyd at the hands of a Minneapolis, Minnesota, police officer, captured on videotape.  Fueled by the movement known as “Black Lives Matter,” the protests and demonstrations that continue to this day focus most directly on police violence and reform of criminal justice practices.  But at a deeper level the protests also seek to call attention to the endurance of systemic racism in the United States, the subject that hovers over Susan Neiman’s thought-provoking Learning from the Germans: Race and the Memory of Evil, giving her work a timeliness she probably never imagined when it first appeared last year.

To address systemic racism, Neiman argues, the United States needs to confront more directly and honestly the realities of its racist past: human bondage dating from the early 17th century which plunged the United States into a Civil War in the mid-19th century, followed by an additional century of legally enforced segregation, rampant discrimination, racial terrorism and second class citizenship, with official sanction of racial discrimination not ending until passage of the Civil Rights Act in 1964 and the Voting Rights Act the following year.  Neiman’s title, moreover, is a giveaway of her surprising suggestion that Americans can learn much from how Germany finally confronted its own racist past, specifically the Holocaust, Nazi Germany’s project to exterminate Europe’s Jewish population that it perpetrated over a 12-year period, from 1933 to 1945.

When Neiman looked at the contentious issue of monuments honoring Southern Civil War veterans from the perspective of Germany, where she has lived on and off since 1982, she found it hard to “imagine a Germany filled with monuments to the men who fought for the Nazis.  My imagination failed. For anyone who has lived in contemporary Germany, the vision of statues honoring those men is inconceivable.” Germans who lost family members during World War II realize that their loved ones “cannot be publicly honored without honoring the cause for which they died” (p.267).  In the United States, by contrast, the president and a substantial if declining portion of the public still support maintaining statues and memorials honoring the cause of the Southern Confederacy, a reflection of the broader differences between the two countries in coming to terms with their racist pasts that Neiman seeks to highlight.

Learning from the Germans is not an attempt to compare the evils of slavery and discrimination against African Americans in the United States to those of the murder of Jews and others during the Holocaust, an exercise Neiman considers fruitless.  Rather, her work revolves around what might be characterized as “comparative atonement,” for which she uses her preferred if foreboding German word, Vergangenheitsaufarbeitung, translated into English as “working off the past.”  The word came into use in German in the 1960s as an “abstract polysyllable way of saying We have to do something about the Nazis” (p.30).  In atoning for its racist past, Germany is markedly further down the path to Vergangenheitsaufarbeitung than the United States, Neiman argues, but she also emphasizes how East Germany, when it existed and despite its many faults, was further along this path than West Germany.  Only after German reunification in 1990 did efforts of the former West Germany to atone for its racist crimes begin to gather serious momentum.

The first part of Neiman’s three-part work, “German Lessons,” outlines Germany’s attempt to come to terms with its crimes of the Nazi period, both before and after unification.  The second part, “Southern Discomfort,” looks at the legacy of racism in the American Deep South, heavily concentrated on the state of Mississippi and on the persistence of the notion of the Lost Cause, a romanticized version of the American Civil War that insists that the war was fought not over slavery but over “states’ rights” — an “abstract phrase that veils the question of what, exactly, Southern states thought they had a right to do” (p.186).  In her third part, “Setting Things Straight,” Neiman considers in broad terms how the American South and the United States as a whole can make strides in coming to terms with a racist past, with the German experience serving as a partial guide.  But this part is more an invitation to debate than a provision of definitive answers.

Neiman, a Jewish American with no direct family connection to the Holocaust, was raised in the American South, in Atlanta, Georgia.  A philosopher by training who studied at Harvard under John Rawls and taught at Yale, she is today the Director of the Einstein Forum, a German think tank located in Potsdam, just outside Berlin.  After nearly a quarter century living and working in Germany, Neiman spent a year at the Winter Institute for Racial Reconciliation in Oxford, Mississippi, a forward-looking institution dedicated explicitly to encouraging people to “honestly engage in their history in order to live more truthfully in the present, where the inequities of the past no longer dictate the possibilities of the future” (p.143).  Utilizing these diverse professional and personal experiences, she mixes analysis and anecdote while introducing her readers to an impressive array of Germans and Americans working on what might be described as the front lines of Vergangenheitsaufarbeitung in their respective countries.

Although her analysis of the United States concentrates on the state of Mississippi, Neiman recognizes that Mississippi is hardly representative of the United States as a whole, and not even of the states of the former Confederacy.  But she contends that awareness of history is arguably more acute in Mississippi than anywhere else in the United States.  “Focusing on the Deep South,” moreover, is “not a matter of ignoring the rest of the country, but of holding a magnifying glass to it” (p.17-18), she writes.  Although just about everyone in the United States now accepts that slavery was wrong, the “national sense of shame” which she finds in today’s Germany is “entirely absent” in the United States; shame is “not the American way” (p.268).

During the nearly three years that Neiman worked on her book, many of the Germans she met with laughed at her proposed title and rejected the idea that Germany had anything to teach Americans about dealing with their racist past.  Most Germans today are defensive about their country’s efforts to work toward Vergangenheitsaufarbeitung, she observes.  They think they took way too long to transition from looking at themselves as victims, with some adding that many of their fellow citizens never made the transition.  “Good taste,” she writes, “prevents good Germans from anything that could possibly be construed as boasting about repentance” (p.56).  Neiman sees this widespread defensiveness as “itself a sign of how far Germany has come in taking responsibility for its criminal history” (p.17).  But how Germany arrived at this position is not easy to pinpoint.

* * *

Competitive victimhood, Neiman writes, “may be as close to a universal law of human nature as we’re ever going to get.”   Postwar Germany was no less inclined than the defeated American South to participate in this “old and universal sport”(p.63).   Although 80 years separate the defeat of the American South from that of Nazi Germany, Neiman perceives similar litanies: “the loss of their bravest sons, the destruction of their homes, the poverty and hunger that followed – combined with resentment at occupying forces they regarded as generally loutish, who had the gall to insist their suffering was deserved” (p.63).  For decades after World War II, Germans were “obsessed with the suffering they’d endured, not the suffering they’d caused” (p.40).

In the immediate aftermath of World War II, the United States, Britain and France, the occupying powers in what became West Germany, aimed to institute a process of de-Nazification.  Among its many aims, de-Nazification was supposed to purge former Nazis and Nazi sympathizers from positions of influence.  More broadly, as Frederick Taylor argued in Exorcising Hitler: The Occupation and Denazification of Germany (reviewed here in December 2012), de-Nazification was “perhaps the most ambitious scheme to change a nation’s psyche ever mounted in human history.” But de-Nazification was a failed scheme.  West Germans mocked the Allied attempt to impose a change of consciousness.

The Allies, moreover, lacked the resources to make de-Nazification successful, and Cold War rivalries and realities intruded.  The Allies were “far more interested in securing [German] allies against the Soviet Union than in digging up their sordid pasts” (p.99).  The de-Nazification program was turned over to the West German government, which had “no inclination to pursue it” (p.99).  Well into the 1960s, West German commitments to democratic governance were “precarious, and the possibility of a return to a sanitized Nazism could not be ruled out” (p.55).  The implicit message of Konrad Adenauer, West Germany’s first post-war chancellor, seemed to be: behave yourself, don’t call attention to your past, and we won’t look too deeply into that past.

East Germany worked off its Nazi past differently.  Although its official name was the German Democratic Republic (GDR), there was little that was democratic about East Germany.  Its borders were closed, its media heavily censored, and its elections a national joke.  Yet, East German leaders had been by and large genuinely anti-fascist, anti-Nazi during the war; the same cannot be said of West German leaders.  East Germany put far more old Nazis on trial proportionately and convicted more than in the West.  The West never invited Jewish émigrés to return; the East did.  Overall, Neiman concludes, East Germany quite simply “did a better job of working off the Nazi past than West Germany” (p.81).

It was not until around 1968 that West Germany began to get serious about Vergangenheitsaufarbeitung, embarking on a path out of denial in conjunction with the student protests that roiled Europe and the United States that year.  Because their parents could not “mourn, acknowledge responsibility, or even speak about the war” (p.70), the 68ers, as the generation born in the 1940s was called, felt compelled to confront their parents over their war experiences and their subsequent silence about those experiences.  A decade later, the American TV series “Holocaust” served as a catalyst for “public discussion of the Holocaust that had been missing for decades” (p.370-71).  Then, on May 8, 1985, 40 years after Germany’s surrender, West German president Richard von Weizsäcker made headlines when he termed that day one of liberation. Up to that point, May 8 in West Germany had been called the Day of Defeat or Day of Unconditional Surrender (yet Weizsäcker even then symbolized the ambivalence of West German Vergangenheitsaufarbeitung: his father had been a high-level Nazi, an assistant to Foreign Minister Joachim von Ribbentrop; Weizsäcker defended his father at the post-war Nuremberg trials and always maintained that his father was trying only to make a bad situation better).

By the time of reunification in 1990, expressions of pro-Nazi sentiment had become “socially unacceptable” and have since become “morally unacceptable” (p.311).  A 1995 exhibit on the Wehrmacht, the Nazi army with 18 million members, demonstrated convincingly that it had systematically committed war crimes,  thereby breaking West Germany’s “final taboo” (p.24).  The exhibit was extended to 33 different cities in Germany and Austria and “ignited media discussions, filled talk shows, and eventually provoked a debate in parliament” (p.24).

Today, the right-wing Alternative for Germany, AfD in German, continues to rise in influence in Germany on an anti-immigrant platform many consider neo-Nazi.  Germany gained further unwanted attention earlier this month when Nazi sympathizers were revealed to have infiltrated an elite German security unit:

https://www.nytimes.com/2020/07/03/world/europe/germany-military-neo-nazis-ksk.html

But today’s Germany has nonetheless reached the point where “open expressions of racism are politically ruinous,” Neiman concludes, which may be the “best outcome we can hope for and it may also be enough. . . Very often, social change begins with lip service” (p.310-11).

* * *

As in Germany, Neiman observes, “the War” throughout the American South is a singular reference.  “Everybody knows that one was decisive, and its repercussions are with us today.”  This knowledge is “more conscious in a Deep South that was occupied, and almost as devastated as Germany, than in the rest of the United States” (p.37).  But the Lost Cause narrative that arose in the American South was an exercise in Civil War historical revisionism that flourished toward the end of the 19th and into the early 20th century, in which the war was rebranded as a “noble fight for Southern freedom,” with the post-war Reconstruction period becoming a “violent effort by ignorant ex-slaves and mercenary Yankees to debase the honor of the South in general, and its white women in particular” (p.181).

Reconciliation under the Lost Cause mythology was “between white members of the opposing armies” to be achieved by “valorizing the defeated, and ignoring the cause for which they fought” (p.182). Reconciliation between white and black folk was not on the agenda.  Slowly and hazily, the Lost Cause narrative “came to capture the hearts of the North. Weary of war, eager for reconciliation, and keen to get on with the business of industrialization that was changing the American economy, Northerners conceded most of the mythmaking to the South. Not many had been enthusiastic abolitionists anyway” (p.186-87).

The Winter Institute, where Neiman conducted much of the research for this book, has sought to counter the Lost Cause narrative through such institutional reforms as creating and implementing school criteria on human rights, fostering inter-racial dialogue in communities known for racial violence, and promoting academic investigation and scholarship on patterns and legacies of racial inequities.   What keeps the Winter Institute going is the notion that “if you can change Mississippi communities, you can probably change anything” (p.142).  The primary lesson Neiman derived from her time at the Winter Institute: “national reconciliation begins at the bottom. Very personal encounters between members of different races, people who represent the victims as well as those who represent the perpetrators, are the foundation of any larger attempt to treat national wounds . . . It is a long and weary process, but it is hard to see an alternative” (p.301).

Neiman discusses at length two notorious murderous acts in mid-20th century Mississippi: the 1955 murder of Emmett Till, a 14-year-old Chicago boy brutally killed during a summer visit to Mississippi; and the murders of Andrew Goodman, Mickey Schwerner, and James Chaney, three civil rights workers, two young white men from New York and a black man from Mississippi, killed near Philadelphia, Mississippi during the following decade while organizing African-Americans to exercise their right to vote.  The two men tried for the Till murder were promptly acquitted.  Protected by the Double Jeopardy Clause of the United States Constitution, they thereafter took money from Look magazine to confess that they had killed the teenager.  No trial at all ensued in the immediate aftermath of the killing of Goodman, Schwerner, and Chaney.

While the world knew the story of the ghastly Till murder, for decades nobody in the local Mississippi Delta community, black or white, wanted to talk about it.  Neiman sees a similarity to the silence that prevailed in Germany in the first decades after the war, where to both non-Jewish and Jewish families, “anything connected to the war was off-limits.  Neither side could bear to talk about it, one side afraid of facing its own guilt, the other afraid of succumbing to pain and rage” (p.217).

In 1989, the Mississippi Secretary of State issued a public apology to the families of the three slain civil rights workers, making him the first local white man to publicly acknowledge the crime.  Most Mississippians think that is the reason he lost when he ran for governor the following year.  With a strong push from the Winter Institute, a trial in the case finally took place in 2005.  The prime suspect, Edgar Ray Killen, then 80 years old, was convicted, but only of manslaughter.  Killen received three 20-year sentences and died in prison in 2018.  Neiman wonders whether the trial has helped a “healing process” or allowed Mississippi to “rest in the self-satisfaction that the horrors that stigmatized the state all belonged to the past” (p.301).

* * *

In her final section, Neiman runs through the most common arguments against reparations to descendants of victims of slavery, and proffers counterarguments.  She glosses over what in my mind is the most difficult: how to determine who gets what amount.  She notes that West Germany paid Israel what amounted to reparations early in the history of the two states, the “price for acceptance into the Western Community and the price was relatively cheap . . . Reparations were paid in exchange for world recognition and the opportunity to keep silent about the quantity of Nazis, and Nazi thinking, that permeated the Federal Republic” (p.289).  In the United States, she argues, reparations need not take the form of precise compensation to individual African Americans but should be the subject of public debate.

On the current polemic surrounding statues and memorials honoring Confederate war veterans, Neiman reminds her readers that most were erected in the early part of the 20th century with the express purpose of reinforcing and providing legitimacy to the regime of rigid segregation and discrimination.  They should not be seen as “innocuous shrines to history; they were provocative assertions of white supremacy at moments when its defenders felt under threat.  Knowing when they were built is part of knowing why they were built. . . What is at stake is not the past, but the present and the future. When we choose to memorialize a historical moment, we are choosing the values we want to defend, and pass on” (p.263).

* * *

“Forgetting past evils may be initially safer,” Neiman writes, but in the long run, the “dangers of forgetting are greater than the dangers of remembering — provided, of course, that we use the failures of past attempts to learn how to do it better” (p.373).  Although there is no single pathway to Vergangenheitsaufarbeitung, understanding the distance Germany has traveled in coming to terms with the Nazi era’s racist crimes should benefit Americans yearning to find a better pathway in the turbulent aftermath of the George Floyd killing.

Thomas H. Peebles

La Châtaigneraie, France

July 29, 2020

 



Is Democracy a Universal Value?

 

Larry Diamond, Ill Winds:

Saving Democracy from Russian Rage, Chinese Ambition, and American Complacency (Penguin Press) 

Stanford professor Larry Diamond is one of America’s foremost authorities on democracy – what it is, how it works in diverse countries throughout the world, how it can take hold in countries with little or no history of democratic governance – and how it can be lost.  Diamond brings a decidedly pragmatic perspective to his subject.  His extensive writings focus in particular on how to sustain fragile democratic governance.  He rarely dwells on classical theory or delves into the origins of democracy.  He is more likely to provide an assessment of the prospects for democracy in contemporary Nicaragua, Nigeria or Nepal, or most anywhere in between, than assess the contribution to modern democracy of, say, Thomas Hobbes or Jean-Jacques Rousseau.  In the two decades following the fall of the Berlin wall and the demise of the Soviet Union, Diamond’s bottom line seemed to be that democracy had the upper hand in most corners of the world – the Middle East being at best a giant question mark – and was steadily extending to numerous countries that had hitherto been considered unlikely places for it to take hold.

That was then. Today, Diamond says that he is more concerned about the future of democracy than at any time in the forty-plus years of his career.  He begins Ill Winds: Saving Democracy from Russian Rage, Chinese Ambition, and American Complacency, a distinctly more guarded assessment of democratic prospects across the globe than his earlier writings, by noting that the march toward democracy began to slow around 2006.  The independent Freedom House, which tracks democratic progress worldwide, found that 2017 was the twelfth consecutive year that the number of countries declining in liberty significantly outstripped those gaining.

Rather than democracy, it is now authoritarian government — sometimes termed “illiberal democracy” and often associated with nativist, xenophobic “populism” — that seems to be on the rise across the globe.  Throughout much of the world, Diamond notes, authoritarian governments and their autocratic leaders are “seizing the initiative, democrats are on the defensive, and the space for competitive politics and free expression is shrinking” (p.11).  Today’s world has “plunged into a democratic recession” (p.54), with democracy finding itself “perched on a global precipice.”  If authoritarian ascendancy and democratic erosion continue, Diamond warns, we may reach a “tipping point where democracy goes bankrupt suddenly – plunging the world into depths of oppression and aggression that we have not seen since the end of World War II” (p.293).

Diamond’s sub-title reveals that the “ill winds” of his title are blowing chiefly from a Russia rife with “rage,” and a China abounding in “ambition,” while the United States stands by “complacently” rather than blowing in the opposite direction, as it once did.  If the United States does not reclaim its traditional place as the keystone of democracy, Vladimir Putin of Russia, Xi Jinping of China, and their admirers “may turn autocracy into the driving force of the new century” (p.11).  Emboldened by the “new silence from Donald Trump’s America,” the “new swagger” emanating from Xi’s China and Putin’s Russia has allowed autocrats across the globe to “tyrannize their opponents openly and without apology” (p.58).

Diamond starts his urgent and alarming assessment with general, introductory chapters that provide a working definition of democracy and summarize the present worldwide crisis, for example, “Why Democracies Succeed and Fail,” “The March and Retreat of Democracy,” and “The Authoritarian Temptation.”  He then devotes a chapter to each of his three main actors, the United States, Russia and China.  From there, he moves to a series of recommendations on how established democracies can counter the forces that seem to be leading many countries away from democracy and toward authoritarian styles of governance.  His recommendations include combatting public corruption (the “soft underbelly of authoritarian rule;” p.192); and making the Internet safe for democracy (the “global fight for freedom is inseparable from the fight for internet freedom;” p.259).

In a book about the future of global democracy, Diamond’s recommendations are oddly U.S.-centric.  They are mostly about how the United States can promote democracy more effectively abroad and render its internal institutions and practices more democratic.  There is little here about what other established democracies – for example, Great Britain, Germany or Australia — can do to be more effective abroad or more democratic at home.  Moreover, Diamond breaks little new ground in this work.

Few readers are likely to be surprised to learn that Russia and China constitute the world’s major anti-democratic actors; that Hungary and Poland, both part of the European Union, the quintessential democracy project, are among the most prominent countries moving away from democracy and toward authoritarianism; or that countries otherwise as diverse as Turkey, India, the Philippines and Brazil are moving in the same direction.  Nor does Diamond venture into unfamiliar territory when he argues that the United States under President Donald Trump appears to be more on the side of the authoritarians and populists than on the side of those seeking to institutionalize democracy in their countries.

But Diamond is an accomplished salesman for democratic governance, the product he has relentlessly peddled for over four decades, and his salesmanship skills are on full display here.  Amidst all the reasons he provides for pessimism about democracy’s worldwide prospects, readers will be reassured to find more than a little of the optimism that characterized his earlier works.  Although authoritarians may seem to be on the rise everywhere, people across the globe are not losing their faith in democracy, he argues.  Democracy for Diamond remains nothing less than a “universal value” (p.159).  The world’s democracies quite simply “have the better ideas” (p.225), he writes.  But is modern democracy up to the task of halting and reversing the world’s authoritarian turn?  Is it capable of countering effectively Russian rage and Chinese ambition?  These are the questions Diamond wrestles with throughout this timely and passionately argued work.

* * *

For Diamond, democracy at its core is a system of government where people choose and can change their leaders in regular, free and fair elections.  Such a system should also include strong protections for basic liberties, such as freedom of speech, press and religion; protection for racial and cultural minorities; a robust rule of law and an independent judiciary; trustworthy law enforcement institutions; and a lively civil society.   Diamond says little here about the economic systems of countries seeking to establish and sustain democratic institutions.  But at least since the fall of the Soviet Union in 1991, most democracy experts agree that market economies allowing for free enterprise — along with ample room for state regulation in the public interest — are most compatible with modern democracy.

But sustaining democracy over the longer term depends more on culture than institutions, Diamond argues.  A country’s citizens need to believe in democracy and be “willing to defend it as a way of life” (p.25), in which case the level of economic development and the precise design of institutions matter less. When democracy lacks broad support, it will “always be a fragile reed” (p.25).   And the paramount component of democratic culture is legitimacy, the “resilient and broadly shared belief that democracy is better than any other imaginable form of government.  People must commit to democracy come hell or high water, and stick with it even when the economy tanks, incomes plunge, or politicians misbehave” (p.25).

Democracy is hardly restricted to those economically advanced countries we call “Western” (“Western” and “the West” include not just the countries of Western Europe and North America but also prosperous democratic countries that are not geographically part of the West, such as Japan and New Zealand).  A country does not have to be economically well off to institutionalize democracy, Diamond insists. Many African countries have made earnest starts.  But successful transitions to democracy nonetheless remain strongly linked to economic prosperity, he argues, citing the examples of Greece, Spain, Chile, South Korea, Taiwan and South Africa.

But Russia and China are undermining democracy in all corners of the globe, each blowing its own “ill winds” across the planet.  In Russia’s case, they are the winds of “anger, insecurity, and resentments of a former superpower;” with China, those of “ambitions, swagger, and overreach of a new one” (p.130-31).  Both are investing heavily in efforts to “promote disinformation and covertly subvert democratic norms and institutions” (p.12).   Among today’s foes of democracy, only two leaders, Vladimir Putin and Xi Jinping, have “enough power and ambition to undermine the entire global liberal order” (p.161).

Russia experienced some shallow and tentative moves toward democracy in the 1990s, in the aftermath of the collapse of the Soviet Union.  But since Putin assumed power in 2000, the movement has been almost exclusively in the opposite direction.  Deeply insecure about the legitimacy of his rule, Putin believes that the West is “seeking to encircle Russia and keep it weak” (p.111).  The 2013-14 “Euromaidan Revolution” in Ukraine, which brought down Viktor Yanukovych, a key autocratic partner, infuriated Putin.  The United States had “toppled his closest ally, in a country he regarded as an extension of Russia itself,” as an American journalist put it.  “All that money America had spent on prodemocracy NGOs in Ukraine had paid off” (p.112).

Russia has mastered the use of social media to “stimulate division, increase social and racial unrest, and undermine the self-assurance of the major Western democracies – and work to divide them from one another” (p.112).  Its most dramatic targets were Hillary Clinton and the 2016 U.S. presidential election.  Clinton “would almost certainly have won the Electoral College if there had been no Russian intervention” (p.118), Diamond asserts, although he offers no evidentiary support for this assertion.  In hacking the 2016 US election, Putin succeeded in both of his apparent aims: to “sow division and discord in American democracy . . . [and] to punish Clinton and elect Trump” (p.118).

But the 2016 election was just one instance of Russia’s use of social media disinformation campaigns to undermine liberal democracy.  These campaigns, assaults “on truth itself” and on the “very notion that there can be ‘an objective, verifiable set of facts’” (p.119), often aim to strengthen extremist political forces within established democracies.  They “do not need to – and do not really aim to – persuade democratic publics that Russia’s positions are right, only that a democracy’s government and political leaders cannot be believed or trusted” (p.119).  Russia under Putin has sought to wreak havoc within the European Union, aiming in particular to end the economic sanctions that Europe and the United States imposed on Russia in retaliation for its aggression in Ukraine.  Russia almost certainly provided significant illicit funding to the Brexit campaign, Diamond contends, helping to tip Britain into leaving the European Union, a “major achievement for a Kremlin that has the destruction of European unity as one of its major aims” (p.121).

But Diamond emphasizes that Russia is a declining power whose “malign intentions and nationalist bravado cannot disguise its outstripped economy and shrinking importance to the twenty-first century world” (p.124).  In the long run, the “ambitions of a rising China, not the resentments of a falling Russia” represent the greatest external challenge to global democracy.  Today’s China, still recovering from what many Chinese consider a century of humiliation at the hands of Japan and the West, is the world’s “most dynamic power” (p.144), with global reach and power that will “increasingly and inevitably dwarf Russia’s” (p.124).

China seeks hegemony over all of Asia and the Pacific, Diamond argues.  It also increasingly aspires to challenge the United States for global leadership, “economically, politically, and, some believe, eventually militarily” (p.131).  Its military spending is now second only to that of the United States, and it may catch America militarily “sooner than we care to imagine” (p.142-43).  China has already established a claim to global dominance in such transformative technologies as artificial intelligence, robotics, drones, and electric cars.

Manipulating social media massively and aggressively, China is also building a “sweeping surveillance state that aims to assess every digital footprint of every Chinese citizen and then compile each person’s ‘social credit score’” (p.236).  It readily shares its “Orwellian tools” with other autocratic regimes, “threatening an ‘Arab Spring in reverse’ in which digital technology enables ‘state domination and repression at a staggering scale’” (p.237).

China’s foreign aid goes disproportionately to the world’s autocrats, many of whom think that China has developed a secret formula.  While some authoritarian regimes dislike China’s heavy-handed attempts to win influence and gain control — sometimes considered a new form of colonialism — others are lured to China’s side by “money, power, ambition, and simple admiration for its sheer success” (p.144).  In addition to assisting the world’s autocracies and countries that could bend in that direction, China also focuses on influencing the world’s democracies.

Diamond sees China playing a longer and more patient game than Russia in its dealings with the West.  Through media deals, investments, partnership agreements, charitable and political donations, and positions on boards of directors, it is seeking wider and deeper infiltration into what Diamond calls the “vital tissues of democracies” (p.133): publishing houses, entertainment industries, technology companies, universities, think tanks, non-governmental organizations.  Favorable views of China, he notes, now exceed those of the United States in much of the world.

Prior to Donald Trump’s successful 2016 presidential candidacy, Diamond considered the United States uniquely qualified to lead the global resistance to Russian rage and Chinese ambition.  Since Trump became president, however, the United States appears to be more on the side of the authoritarians and populists than on the side of those seeking to institutionalize democracy in their countries – or, at best, on the sidelines while Russia and China seek to extend their influence and undermine democracy.  If there is any upside to the Trump presidency, Diamond notes, it is that it provides a glimpse into the alarming consequences of a world without American leadership and steadfastness, a “far more frightening and dangerous place, with muscular, corrupt dictatorships dominating large swaths of the globe through blatant coercion and covert subversion” (p.287).

Trump’s unremitting insistence that the United States is being cheated by its friends and allies has propelled the country “down the self-defeating path of ‘America alone’” (p.301).  His decision to withdraw the United States from the Trans-Pacific Partnership (TPP), a 2016 twelve-nation Pacific Rim free-trade agreement, “so visionary and so necessary,” constitutes in Diamond’s view the “most grievous self-inflicted wound to America’s global leadership since the creation of the liberal world order after World War II” (p.144).  US withdrawal from the TPP amounted to a “massive gift to authoritarian China and a body blow to democratic aspirations in Southeast Asia” (p.144-45), serving as a “stunning symbol – and accelerator – of both China’s rise and America’s descent.  As the great democracy that dominated world politics in the twentieth century retreated, the great dictatorship that aims to dominate world politics in the twenty-first could hardly believe its luck” (p.145).

Diamond provides an extensive set of recommendations on how the United States and other advanced democratic countries can deliver more sustainable assistance to aspiring and fragile democracies.  The priorities need to be combatting kleptocracy, public corruption, and international money laundering; making the internet safe for democracy; and improving public diplomacy through smarter uses of “soft power” to counter Russia’s and China’s “sharp power.”

Kleptocracy, a recent term for high-level state corruption, involves the theft of state resources that could have advanced the public good but instead were diverted for private gain – hospitals and schools that were not built, for example – and by definition constitutes a crime against a country’s citizens.  Kleptocracy depends upon using the international financial system to “move, mask, and secure ill-gotten fortunes across borders,” posing the “single most urgent internal threat to democracy,” a threat which renders fragile democracies “all the more vulnerable to external subversion” (p.184).  Many of the world’s democracies, not least the United States, are complicit in providing refuge for the ill-gotten gains of the world’s kleptocrats.  Global transfers of untraceable funds have enabled a “stunning array of venal dictators and their family members, political allies, and business cronies to acquire property and influence in the West as well as to corrupt democracy and the rule of law within free nations” (p.184).

Diamond’s recommendations for combatting public corruption and international money laundering are for the most part US-oriented (e.g. modernize and strengthen the Foreign Agents Registration Act; empower the Treasury Department’s Financial Crimes Enforcement Network to conduct its own investigations).  But he also offers some general recommendations that all the world’s advanced democracies could and should follow (e.g. end anonymous shell companies and real estate purchases).

Today, moreover, the Internet and related technologies – email, text messaging, photo sharing – have the potential to uncover public corruption, as well as to highlight human rights abuses, expose voter fraud, and organize demonstrations.  These technologies played a major role in the 2011 protests that brought down Egyptian dictator Hosni Mubarak, and in those that challenged Iran’s blatantly fraudulent 2009 elections.  But many modern authoritarian regimes – not just Russia and China — have developed sophisticated means to “manipulate, manage, vilify, and amplify public opinion online” (p.234).  Freedom House considers growing state-level manipulation of social media one of the leading causes of the steady eight-year decline in global Internet freedom.  Making the Internet a safe place for democracy requires a “concerted partnership among democratic governments, technology companies, civil-society groups, and individual ‘netizens’” (p.229).

Diamond also provides a set of recommendations for how the United States can fine-tune its own internal democratic mechanisms – for example, by adopting ranked-choice voting, reducing the gerrymandering of legislative districts, and curbing the influence of money in politics — worthy objectives, but markedly out of line with the priorities of the Trump administration and today’s Republican Party.  Looking beyond the Trump administration, however, Diamond argues that the tide of authoritarianism can be reversed.

Few people celebrate authoritarianism as a superior system, “morally or practically” (p.225).  There are no large-scale surveys of public opinion showing a popular groundswell for authoritarianism.  Rather, in surveys from every region of the world, “large to overwhelming majorities of the public, on average, said that democracy is the best form of government and that an unaccountable strongman is a bad idea” (p.159-60).  Within even the world’s most tenacious autocracies, “many people want to understand what democracy is and how it can be achieved.  Even many dictators and generalissimos know and fear democracy’s allure” (p.225).  In this networked age, “both idealism and the harder imperatives of global power and security argue for more democracy, not less” (p.200).

* * *

The best way to counter Russian rage and Chinese ambition, Diamond counsels, is to show that Moscow and Beijing are “on the wrong side of history; that people everywhere yearn to be free, and that they can make freedom work to achieve a more just, sustainable and prosperous society” (p.200).   Yet Diamond makes clear that checking the worldwide authoritarian tide depends to an unsettling degree upon the United States reversing its present course and prioritizing anew the global quest for democracy.


Thomas H. Peebles

La Châtaigneraie, France

June 26, 2020

Misjudgments and Misdeeds of an Unseen Power Broker

Jefferson Morley, The Ghost:

The Secret Life of CIA Spymaster James Jesus Angleton

(St. Martin’s)

James Jesus Angleton served as the Central Intelligence Agency’s head of counterintelligence — its top spy and effectively the number three person in the agency — from 1954 until he was forced into retirement in 1975.  Although his name is less familiar than that of the FBI’s original director, J. Edgar Hoover, I couldn’t help thinking of Hoover as I read Jefferson Morley’s trenchant biography, The Ghost: The Secret Life of CIA Spymaster James Jesus Angleton.  Both were immensely powerful, paranoid men who repeatedly broke or skirted the law to advance their often-idiosyncratic versions of what United States national security required.  Throughout their careers, both were able to avoid almost all attempts to hold them accountable for their misdeeds.  With the passage of four decades since Hoover’s death in 1972 and Angleton’s departure from the CIA three years later, we can see that the two men embodied what has recently come to be known as the “Deep State,” a nearly independent branch of government in which officials secretly manipulate government policy, as Morley puts it, “largely beyond the view of the Madisonian government and the voting public” (p.xi).

Morley demonstrates that the notorious COINTELPRO operation, associated today with Hoover and arguably his most dubious legacy, actually began as a joint FBI-CIA undertaking that Angleton concocted.  COINTELPRO aimed to infiltrate and disrupt dissidents and included among its targets Dr. Martin Luther King, left-leaning organizations, and Vietnam anti-war protestors.  The original idea that Angleton sold to a skeptical Hoover, who considered the CIA a “nest of liberals, atheists, homosexuals, professors, and otherwise feminized men who specialized in wasting the taxpayer dollar” (p.71), was that the Bureau would target subjects within the United States while the Agency would take the lead in targeting subjects outside the United States.

From there, the CIA and FBI collaborated on LINGUAL, an elaborate and extensive program to read American citizens’ mail, which Morley terms perhaps Angleton’s “most flagrant violation of the law” (p.82); and on CHAOS, an operation designed to infiltrate the entire anti-Vietnam war movement, not just people or organizations that engaged in violence or contacted foreign governments.  Post-Watergate hearings brought the existence and extent of COINTELPRO, LINGUAL and CHAOS to light, along with numerous other chilling exercises of authority attributed to the FBI and CIA, leading to Angleton’s involuntary retirement from the agency.

Morley, a freelance journalist and former Washington Post editor, does not make the Hoover comparison explicitly.  He sees in Angleton a streak of Iago, Othello’s untrustworthy advisor: outwardly a “sympathetic counselor with his own agenda, which sometimes verged on the sinister” (p.158).  Angleton served four American presidents with “seeming loyalty and sometimes devious intent” (p.159), he writes (of course, the same could be said of Hoover, who served eight presidents over the course of a career that began in the 1920s).

Writing in icy prose that pieces together short, punchy vignettes with one-word titles, Morley undertakes to show how Angleton was able to elevate himself from a “staff functionary” at the CIA, a new agency created in 1947, to an “untouchable mandarin” who had an “all but transcendent influence on U.S. intelligence operations for two decades” (p.67).  At the height of the Cold War, Morley writes, Angleton became an “unseen broker of American power” (p.158).

But Morley’s biography might better be viewed as a compendium of the misjudgments and misdeeds that punctuated Angleton’s career from beginning to end.  Angleton’s judgment failed him repeatedly, most notoriously when his close friend and associate, British intelligence agent Kim Philby, was revealed to have been a Soviet spy from World War II onward (I reviewed Ben Macintyre’s biography of Philby here in 2016).  The Philby revelation convinced Angleton that the KGB had also planted an agent within the CIA, precipitating a disastrous “mole hunt” that paralyzed the CIA for years and damaged the careers of many innocent employees, yet discovered no one.

The book’s most explosive conjunction of questionable judgment and conduct involves Angleton’s relationship to Lee Harvey Oswald, President John F. Kennedy’s presumed assassin.  Angleton followed Oswald closely from 1959, when Oswald defected to the Soviet Union, to that fateful day in Dallas in 1963.  Thereafter, Angleton tenaciously withheld his knowledge of Oswald from the Warren Commission, charged with investigating the circumstances of the Kennedy assassination, to the point where Morley suggests that Angleton should have been indicted for obstruction of justice.  The full extent of Angleton’s knowledge of Oswald has yet to come out, leaving ample fodder for those of a conspiratorial bent who insist that Oswald was something other than the lone gunman the Warren Commission found him to be (in 2015, I reviewed Peter Savodnik’s biography of Oswald here, in which Savodnik argues forcefully for the lone-gunman view).

* * *

Born in 1917 in Boise, Idaho, Angleton was the son of a prosperous merchant father and a Mexican-American mother (hence the middle name “Jesus”).  At age 16, the young Angleton moved with his family to Milan, where his father ran the Italian-American Chamber of Commerce and was friendly with many leaders in the fascist regime of Benito Mussolini.  For the remainder of his life, James retained a fondness for Italy, Italian culture and, it could be argued, the Italian brand of fascism.

Angleton attended boarding school in England, then went on to Yale as an undergraduate.  At Yale, he demonstrated a keen interest in poetry and came under the influence of the poet Ezra Pound, who later became notorious for his fascist sympathies (after an investigation led by J. Edgar Hoover, Pound was jailed at the end of World War II).  Poetry constituted a powerful method for Angleton, Morley writes.  He would come to value “coded language, textual analysis, ambiguity, and close control as the means to illuminate the amoral arts of spying that became his job.  Literary criticism led him to the profession of secret intelligence.  Poetry gave birth to a spy” (p.8).

During World War II, Angleton found his way to the Office of Strategic Services, the CIA’s predecessor agency.  He spent the later portion of the war years in Rome, where he developed a friendship with Junio Valerio Borghese, “perhaps the most famous fascist military commander in Italy” (p.21).  Angleton helped Borghese avoid execution at the hands of the same partisan forces that captured and executed Mussolini in 1945.  Thanks to Angleton’s efforts, Borghese “survived to become titular and spiritual leader of postwar Italian fascism” (p.27), and one of the United States’ key partners in preventing a Communist takeover of postwar Italy.

Angleton prepared for his assignment in Rome at Bletchley Park in England, the center of Allied code-breaking operations during World War II.  There, Angleton learned the craft of counter-intelligence under the tutelage of Kim Philby, who taught the young American “how to run double agent operations, to intercept wireless and mail messages, and to feed false information to the enemy.  Angleton would prove to be his most trusting friend” (p.18).  After the war, Philby and Angleton both found themselves in Washington, where they became inseparable buddies, the “closest of friends, soul mates in espionage” (p.41).  Each saw in the other the qualities needed to succeed in espionage: ruthlessness, calculation, autonomy, and cleverness.

The news of Philby’s 1963 defection to Moscow was “almost incomprehensible” (p.123) to Angleton.  What he had considered a deep and warm relationship had been a sham.  Philby was “his friend, his mentor, his confidant, his boozy buddy,” Morley writes.  And “through every meeting, conference, debriefing, confidential aside, and cocktail party, his friend had played him for a fool” (p.124).  Philby’s defection does not appear to have damaged Angleton’s position within the CIA, but it set him off on a disastrous hunt for a KGB “mole” that would paralyze and divide the agency for years.

Angleton’s mole hunt hardened into a “fixed idea, which fueled an ideological crusade that more than a few of his colleagues denounced as a witch hunt” (p.86).  Angleton’s operation was multi-faceted, “consisting of dozens of different mole hunts – some targeting individuals, others focused on components within the CIA” (p.135).  Angleton’s suspicions “effectively stunted or ended the career of colleagues who were guilty of nothing” (p.198).  To this day, even after the opening of significant portions of the KGB archives in the aftermath of the fall of the Soviet Union, there is no indication that the KGB ever had a mole burrowed into the CIA.  Angleton’s mole hunt, Morley concludes, “soaked in alcohol” and permeated by “convoluted certitudes,” brought Angleton to the “brink of being a fool” (p.126).

Just as Angleton never gave up his (witch) hunt for the KGB spy within the CIA, he never relinquished his equally odd conviction that Harold Wilson, the British Labour politician and for a while Prime Minister, was a Soviet spy.  And he argued almost until the day he departed from the CIA that the diplomatic sparring and occasional direct confrontation between the Soviet Union and China were an elaborate exercise in disinformation designed to deceive the West.

While head of counterintelligence at the CIA, Angleton served simultaneously as the agency’s desk officer for Israel, the direct link between Israeli and American intelligence services.  Angleton was initially wary of the Israeli state that came into existence in 1948, in part the residue of the anti-Semitism he had entertained in his youth, in part the product of his view that too many Jews were communists.  By the mid-1950s, however, Angleton had overcome his initial wariness to become an admirer of Israel and especially of Mossad, its primary intelligence service.

But Angleton’s judgment in his relationship with Israel frequently failed him, just as it failed him in his relationship with Philby.  He did not foresee Israel’s role in the 1956 Anglo-French invasion of Suez (the subject of Ike’s Gamble, reviewed here in 2017), an intelligence failure that infuriated President Eisenhower.  After winning President Johnson’s favor by calling the Israeli first strike that ignited the June 1967 Six Day War (“accurate almost down to the day and time,” p.181), he incurred the wrath of President Nixon for missing Egypt’s strike at Israel in the October 1973 Yom Kippur War.  Nixon and his Secretary of State, Henry Kissinger, were of the view that Angleton had grown too close to Israel.

Angleton, moreover, was almost certainly involved behind the scenes in Israel’s 1968 heist of enriched uranium, lifted from a Pennsylvania nuclear fuel processing plant known as NUMEC, to supply its nuclear weapons program.  A CIA analyst later concluded that NUMEC had been a “front company deployed in an Israeli-American criminal conspiracy to evade U.S. nonproliferation laws and supply the Israeli nuclear arsenal” (p.261-62).  Angleton’s loyalty to Israel “betrayed U.S. policy on an epic scale” (p.261), Morley writes.

* * *

Morley’s treatment of Angleton’s relationship to Lee Harvey Oswald and Fidel Castro’s Cuba raises more questions than it answers.  The CIA learned of Oswald’s attempt to defect to the Soviet Union in November 1959, and began monitoring him at that point.  In this same timeframe, the CIA and FBI began jointly monitoring a pro-Castro group, the Fair Play for Cuba Committee, which would later attract Oswald.  Although Angleton was a contemporary and occasional friend of John Kennedy (the two were born the same year), when Kennedy assumed the presidency in 1961, Angleton’s view was that American policy toward Fidel Castro needed to be more aggressive.  He viewed Cuba as still another Soviet satellite state, but one just 90 miles from United States shores.

The Kennedy administration’s Cuba policy got off to a miserable start with the infamous failure of the April 1961 Bay of Pigs operation to dislodge Castro.  Kennedy was furious with the way the CIA and the military had presented the options to him and fired CIA Director Allen Dulles in the operation’s aftermath (Dulles’ demise is one of the subjects of Stephen Kinzer’s The Brothers, reviewed here in 2014).  But elements within the CIA and the military held Kennedy responsible for the failure because he refused to order air support for the operation (Kennedy had been assured prior to the invasion that no additional military assistance would be necessary).

CIA and military distrust of Kennedy heightened after the Cuban Missile Crisis of October 1962, when the United States and the Soviet Union faced off in what threatened to be a nuclear confrontation over the placement of offensive Soviet missiles on the renegade island.  Although Kennedy’s handling of that crisis was widely acclaimed as his finest moment as president, many within the military and the CIA, Angleton included, thought that Kennedy’s pledge to Soviet Premier Khrushchev of no invasion of Cuba in exchange for Soviet withdrawal of the missiles had given Castro and his Soviet allies too much.  Taking the invasion option off the table amounted in Angleton’s view to a cave-in to Soviet aggression and a betrayal of the anti-Castro Cuban community in the United States.

In the 13 months that remained of the Kennedy presidency, the administration continued to obsess over Cuba, with a variety of operations under consideration to dislodge Castro.  The CIA was also monitoring Soviet defector Oswald, who by this time had returned to the United States.  Angleton placed Oswald’s name on the LINGUAL list to track his mail.  By the fall of 1963, Oswald had become active in the Fair Play for Cuba Committee, passing out FPCC leaflets in New Orleans.  He was briefly arrested for disturbing the peace after an altercation with anti-Castro activists.  In October of that year, a mere month before the Kennedy assassination, the FBI and CIA received notice that Oswald had been in touch with the Soviet and Cuban embassies and consular sections in Mexico City.  Angleton followed Oswald’s Mexico City visits intensely, yet withheld for the rest of his life precisely what he knew about them.

From the moment Kennedy was assassinated, Angleton “always sought to give the impression that he knew very little about Oswald before November 22, 1963” (p.140).  But Angleton and his staff, Morley observes, had “monitored Oswald’s movements for four years. As the former marine moved from Moscow to Minsk to Fort Worth to New Orleans to Mexico City to Dallas,” the special group Angleton created to track defectors “received reports on him everywhere he went” (p.140-41).  Angleton clearly knew that Oswald was in Dallas in November 1963.   He hid his knowledge of Oswald from the Warren Commission, established by President Lyndon Johnson to investigate the Kennedy assassination. What was Angleton’s motivation for obfuscation?

The most plausible – and most innocent – explanation is that Angleton was protecting his own rear end in an “epic counterintelligence failure” that had “culminated on Angleton’s watch. It was bigger than the Philby affair and bloodier” (p.140).  Given this disastrous counterintelligence failure, Morley argues, Angleton “could have – and should have – lost his job after November 22 [1963].  Had the public, the Congress, and the Warren Commission known of his pre-assassination interest in Oswald or his post-assassination cover-up, he surely would have” (p.157).

But the range of possibilities Morley considers extends to speculation that Angleton may have been hiding his own involvement in a Deep State operation to assassinate the president.   Was Angleton running Oswald as an agent in an assassination plot, Morley asks:

He certainly had the knowledge and ability to do so.  Angleton and his staff had a granular knowledge of Oswald long before Kennedy was killed.  Angleton had a penchant for running operations outside of reporting channels. He articulated a vigilant anti-communism that depicted the results of JFK’s liberal policies in apocalyptic terms. He participated in discussions of political assassination. And he worked in a penumbra of cunning that excluded few possibilities (p.265).

Whether Angleton manipulated Oswald as part of an assassination plot is a question Morley is not prepared to answer.  But in Morley’s view, Angleton plainly “obstructed justice to hide interest in Oswald.  He lied to veil his use of the ex-defector in later 1963 for intelligence purposes related to the Cuban consulate in Mexico City. . . Whoever killed JFK, Angleton protected them.  He masterminded the JFK conspiracy and cover-up” (p.265).  To this day, no consensus exists as to why Angleton dodged all questions concerning his undisputed control over the CIA’s file on Oswald for four years, up to Oswald’s death in November 1963.  Angleton’s relationship to Oswald remains “shrouded in deception and perjury, theories and disinformation, lies and legends” (p.87), Morley concludes.  Even though a fuller story began to emerge when Congress ordered the declassification of long-secret JFK assassination records in the 1990s, the full story has “yet to be disclosed” (p.87).

* * *

The burglary at the Democratic National Committee headquarters in the Watergate complex in June 1972 proved to be Angleton’s professional undoing, just as it was for President Richard Nixon.  The burglary involved three ex-CIA employees, all likely well known to Angleton.  In 1973, in the middle of multiple Watergate investigations, Nixon appointed William Colby as agency director, a man determined to get to the bottom of what was flowing into the public record about the CIA and its possible involvement in Watergate-related activity.

Colby concluded that Angleton’s never-ending mole hunts were “seriously damaging the recruiting of Soviet officers and hurting CIA’s intelligence intake” (p.225).  Colby suspended LINGUAL, finding the mail-opening operation “legally questionable and operationally trivial,” having produced little “beyond vague generalities” (p.225).  At the same time, New York Times investigative reporter Seymour Hersh published a story that described in great detail Operation CHAOS, the agency’s program aimed at anti-Vietnam activists, attributing ultimate responsibility to Angleton.  Immediately after Christmas 1974, Colby moved to replace Angleton.

For the first and only time in his career, Angleton’s covert empire within the CIA stood exposed and he left the agency in 1975.  When Jimmy Carter became president in 1977, his Department of Justice elected not to prosecute Angleton, although Morley argues that it had ample basis to do so.  In retirement, Angleton expounded his views to “any and all who cared to listen” (p.256).  He took to running reporters “like he had once run agents in the field, and for the same purpose: to advance his geopolitical vision” (p.266).

* * *

Angleton, a lifelong smoker (as well as a heavy drinker), was diagnosed with lung cancer in 1986 and died in May 1987.  He was, Morley concludes, “fortunate that so much of his legacy was unknown or classified at the time of his death.”  Angleton not only “often acted outside the law and the Constitution,” but also, for the most part, “got away with it” (p.271).

Thomas H. Peebles

La Châtaigneraie, France

June 10, 2020

Reading Darwin in Abolitionist New England


Randall Fuller, The Book That Changed America:

How Darwin’s Theory of Evolution Ignited a Nation (Viking)

In mid-December 1859, the first copy of Charles Darwin’s On the Origin of Species arrived in the United States from England at a wharf in Boston harbor.  Darwin’s book explained how plants and animals had developed and evolved over multiple millennia through a process Darwin termed “natural selection,” a process which distinguished On the Origin of Species from the work of other naturalists of Darwin’s generation.  Although Darwin said little in the book about how humans fit into the natural selection process, the work promised to ignite a battle between science and religion.

In The Book That Changed America: How Darwin’s Theory of Evolution Ignited a Nation, Randall Fuller, professor of American literature at the University of Kansas, contends that what made Darwin’s insight so radical was its “reliance upon a natural mechanism to explain the development of species.  An intelligent Creator was not required for natural selection to operate.  Darwin’s vision was of a dynamic, self-generating process of material change.  That process was entirely arbitrary, governed by physical law and chance – and not leading ineluctably . . . toward progress and perfection” (p.24).  Darwin’s work challenged the notion that human beings were a “separate and extraordinary species, differing from every other animal on the planet.  Taken to its logical conclusion, it demolished the idea that people had been created in God’s image” (p.24).

On the Origin of Species arrived in the United States at a particularly fraught moment.  In October 1859, abolitionist John Brown had conducted a raid on a federal arsenal in Harpers Ferry (then part of Virginia, today West Virginia), with the intention of precipitating a rebellion that would eradicate slavery from American soil.  The raid failed spectacularly: Brown was captured, tried for treason and hanged on December 2, 1859.  The raid and its aftermath exacerbated tensions between North and South, further polarizing a country already bitterly divided over the issue of chattel slavery in its southern states.  Notwithstanding the little Darwin had written about how humans fit into the natural selection process, abolitionists seized on hints in the book that all humans were biologically related to buttress their arguments against slavery.  To the abolitionists, Darwin “seemed to refute once and for all the idea that African American slaves were a separate, inferior species” (p.x).

Asa Gray, a respected botanist at Harvard University and a friend of Darwin, received the first copy of On the Origin of Species in the United States.  He passed the copy, which he annotated heavily, to his cousin by marriage Charles Loring Brace (who was also a distant cousin of Harriet Beecher Stowe, author of the anti-slavery runaway best-seller Uncle Tom’s Cabin).  Brace in turn introduced the book to three men: Franklin Benjamin Sanborn, a part-time schoolmaster and full-time abolitionist activist; Amos Bronson Alcott, an educator and loquacious philosopher, today best remembered as the father of author Louisa May Alcott; and Henry David Thoreau, one of America’s best known philosophers and truth-seekers.  Sanborn, Alcott and Thoreau were residents of Concord, Massachusetts, roughly twenty miles northwest of Boston, the site of a famous Revolutionary War battle but in the mid-19th century both a leading literary center and a hotbed of abolitionist sentiment.

As luck would have it, Brace, Alcott and Thoreau gathered at Sanborn’s Concord home on New Year’s Day 1860.  Only Gray did not attend.  The four men almost certainly shared their initial reactions to Darwin’s work.  This get-together constitutes the starting point for Fuller’s engrossing study, centered on how Gray and the four men in Sanborn’s parlor on that New Year’s Day absorbed Darwin’s book.  Darwin himself is at best a background figure in the study.  Several familiar figures make occasional appearances, among them: Frederick Douglass, renowned orator and “easily the most famous black man in America” (p.91); Bronson Alcott’s author-daughter Louisa May; and American philosopher Ralph Waldo Emerson, Thoreau’s mentor and friend.  Emerson, like Louisa May and her father, was a Concord resident, and Fuller’s study takes place mostly there, with occasional forays to nearby Boston and Cambridge.

Fuller’s study is therefore more tightly circumscribed geographically than its title suggests.  He spends little time detailing the reaction to Darwin’s work in other parts of the United States, most conspicuously in the American South, where any work that might seem to support abolitionism and undermine slavery was anathema.   The study is also circumscribed in time; it takes place mostly in 1860, with most of the rest confined to the first half of the 1860s, up to the end of the American Civil War in 1865.  Fuller barely mentions what is sometimes called “Social Darwinism,” a notion that gained traction in the decades after the Civil War that purported to apply Darwin’s theory of natural selection to the competition between individuals in politics and economics, producing an argument for unregulated capitalism.

Rather, Fuller charts the paths each of his five main characters traversed in absorbing and assimilating into their own worldviews the scientific, religious and political ramifications of Darwin’s work, particularly during the tumultuous year 1860.  All five were fervent abolitionists.  Sanborn was a co-conspirator in John Brown’s raid.  Thoreau gave a series of eloquent, impassioned speeches in support of Brown.  All were convinced that Darwin’s notion of natural selection had provided still another argument against slavery, based on science rather than morality or economics.  But in varying degrees, all five could also be considered adherents of transcendentalism, a mid-19th century philosophical approach that posited a form of human knowledge that goes beyond, or transcends, what can be seen, heard, tasted, touched or felt.

Although transcendentalists were almost by definition highly individualistic, most believed that a special force or intelligence stood behind nature and that providential design ruled the universe.  Many subscribed to the notion that humans were the products of some sort of “special creation.”  Most saw God everywhere, and considered the human mind “resplendent with powers and insights wholly distinct from the external world” (p.54).  Transcendentalism was both an effort to invoke the divinity within man and, as Fuller puts it, a “cultural attack on a nation that had become too materialistic, too conformist, too smug about its place in history” (p.66).

Transcendentalism thus hovered in the background in 1860 as all but Sanborn wrestled with the implications of Darwinism (Sanborn spent much of the year fleeing federal authorities seeking his arrest for his role in John Brown’s raid).  Alcott never left transcendentalism, rejecting much of Darwinism.  Gray and Brace initially seemed to embrace Darwinian theories wholeheartedly, but in different ways each pulled back once he grasped the full implications of those theories.  Thoreau was the only one of the five who wholly accepted Darwinism’s most radical implications, using Darwin’s theories to “redirect his life’s work” (p.ix).

Fuller’s study thus combines a deep dive into the New England abolitionist milieu at a time when the United States was fracturing over the issue of slavery with a somewhat shallower dive into the intricacies of Darwin’s theory of natural selection.  But the story Fuller tells is anything but dry and abstract.  With an elegant writing style and an acute sense of detail, Fuller places his five men and their thinking about Darwin in their habitat, the frenetic world of 1860s New England.  In vivid passages, readers can almost feel the chilly January wind whistling through Franklin Sanborn’s parlor that New Year’s Day 1860, or envision the mud accumulating on Henry David Thoreau’s boots as he trudges through the melting snow in the woods on a March afternoon contemplating Darwin.  The result is a lively, easy-to-read narrative that nimbly mixes intellectual and everyday, ground-level history.

* * *

Bronson Alcott, described by Fuller as America’s most radical transcendentalist, never accepted the premises of On the Origin of Species.  Darwin had, in Alcott’s view, “reduced human life to chemistry, to mechanical processes, to vulgar materialism” (p.10).  To Alcott, Darwin seemed “morbidly attached to an amoral struggle of existence, which robbed humans of free will and ignored the promptings of the soul” (p.150).  Alcott could not imagine a universe “so perversely cruel as to produce life without meaning.  Nor could he bear to live in a world that was reduced to the most tangible and daily phenomena, to random change and process” (p.188).  Asa Gray, one of America’s most eminent scientists, came to the same realization, but only after thoroughly digesting Darwin and explaining his theories to a wide swath of the American public.

Gray’s initial reaction to Darwin’s work was one of unbounded enthusiasm.  Gray covered nearly every page of the book with his own annotations.  He admired the book because it “reinforced his conviction that inductive reasoning was the proper approach to science” (p.109).  He also admired the work’s “artfully modulated tone, [and] its modest voice, which softened the more audacious ideas rippling through the text” (p.17). Gray was most impressed with Darwin’s “careful judging and clear-eyed balancing of data” (p.110).  To grapple with Darwin’s ideas, Gray maintained, one had to “follow the evidence wherever it led, ignoring prior convictions and certainties or the narrative one wanted that evidence to confirm” (p.110).  Without saying so explicitly, Gray suggested that readers of Darwin’s book had to be “open to the possibility that everything they had taken for granted was in fact incorrect” (p.110).

Gray reviewed On the Origin of Species for the Atlantic Monthly in three parts, appearing in the summer and fall of 1860.  Gray’s articles served as the first encounter with Darwin for many American readers.  The articles elicited a steady stream of letters from respectful readers.  Some responded with “unalloyed enthusiasm” for a new idea which “seemed to unlock the mysteries of nature” (p.134).  Others, however, “reacted with anger toward a theory that proposed to unravel . . . their belief in a divine Being who had placed humans at the summit of creation” (p.134).  But as Gray finished the third Atlantic article, he began to realize that he himself was not entirely at ease with the diminution of humanity’s place in the universe that Darwin’s work implied.

The third Atlantic article, appearing in October 1860, revealed Gray’s increasing difficulty in “aligning Darwin’s theory with his own religious convictions” (p.213).  Gray proposed that natural selection might be “God’s chosen method of creation” (p.214).  This idea seemed to resolve the tension between scientific and religious accounts of origins, making Gray the first to develop a theological case for Darwinian theory.  But the idea that natural selection might be the process by which God had fashioned the world represented what Fuller describes as a “stunning shift for Gray.  Before now, he had always insisted that secondary causes were the only items science was qualified to address.  First, or final causes – the beginning of life, the creation of the universe – were the purview of religion: a matter of faith and metaphysics” (p.214).  Darwin responded to Gray’s conjectures by indicating that, as Fuller summarizes the written exchange, the natural world was “simply too murderous and too cruel to have been created by a just and merciful God” (p.211).

In the Atlantic articles, Fuller argues, Gray leapt “beyond his own rules of science, speculating about something that was untestable” (p.214-15).  Gray must have known that his argument “failed to adhere to his own definition of science” (p.216).  But, much like Bronson Alcott, Gray found it “impossible to live in the world Darwin had imagined: a world of chance, a world that did not require a God to operate” (p.216).  Charles Brace, a noted social reformer who founded several institutions for orphans and destitute children, greeted Darwin’s book with an initial enthusiasm that rivaled that of Gray.

Brace claimed to have read On the Origin of Species 13 times.  He was most attracted to the book for its implications for human societies, especially for American society, where nearly half the country accepted and defended human slavery.  Darwin’s book “confirmed Brace’s belief that environment played a crucial role in the moral life of humans” (p.11), and demonstrated that every person in the world, black, white, yellow, was related to everyone else.  The theory of natural selection was thus for Brace the “latest argument against chattel slavery, a scientific claim that could be used in the most important controversy of his time, a clarion call for abolition” (p.39).

Brace produced a tract entitled The Races of the Old World, modeled after Darwin’s On the Origin of Species, which Fuller describes as a “sprawling, ramshackle work” (p.199).  Its central thesis was simple enough: “There is nothing . . . to prove the negro radically different from the other families of man or even mentally inferior to them” (p.199-200).  But much of The Races of the Old World seemed to undercut Brace’s central thesis.  Although the book never defined the term “race,” Brace “apparently believed that though all humans sprang from the same source, some races had degraded over time . . . Human races were not permanent” (p.199-200).  Brace thus struggled to make Darwin’s theory fit his own ideas about race and slavery. “He increasingly bent facts to fit his own speculations” (p.197), as Fuller puts it.

The Races of the Old World revealed Brace’s hesitation in imagining a multi-racial America.  He couched in Darwinian terms the difficulty of the races cohabiting, reverting to what Fuller describes as nonsense about blacks not being conditioned to survive in the colder Northern climate.  Brace “firmly believed in the emancipation of slaves, and he was equally convinced that blacks and whites did not differ in their mental capacities” (p.202).  But he nonetheless worried that “race mixing,” or what was then termed race “amalgamation,” might imperil Anglo-Saxon America, the “apex of development. . . God’s favored nation, a place where democracy and Christianity had fused to create the world’s best hope” (p.202).  Brace joined many other leading abolitionists in opposing race “amalgamation.”  His conclusion that “black and brown-skinned people inhabited a lower rung on the ladder of civilization” was shared, Fuller indicates, by “even the most enlightened New England abolitionists” (p.57).

No such misgivings visited Thoreau, who grappled with On the Origin of Species “as thoroughly and as insightfully as any American of the period” (p.11).  As Thoreau first read his copy of the book in late January 1860, a “new universe took form on the rectangular page before him” (p.75).  Prior to his encounter with Darwin, Thoreau’s thought had often “bordered on the nostalgic.  He longed for the transcendentalist’s confidence in a natural world infused with spirit” (p.157).  But Darwin led Thoreau beyond nostalgia.

Thoreau was struck in particular by Darwin’s portrayal of the struggle among species as an engine of creation.  On the Origin of Species revealed nature as process, in constant transformation.  Darwin’s book directed Thoreau’s attention “away from fixed concepts and hierarchies toward movement instead” (p.144-45).  The idea of struggle among species “undermined transcendentalist assumptions about the essential goodness of nature, but it also corroborated many of Thoreau’s own observations” (p.137).  Thoreau had “long suspected that people were an intrinsic part of nature – neither separate nor entirely alienated from it” (p.155).  Darwin now enabled Thoreau to see how “people and the environment worked together to fashion the world,” providing a “scientific foundation for Thoreau’s belief that humans and nature were part of the same continuum” (p.155).

Darwin’s natural selection, Thoreau wrote, “implies a greater vital force in nature, because it is more flexible and accommodating, and equivalent to a sort of constant new creation” (p.246).  The phrase “constant new creation” in Fuller’s view represents an “epoch in American thought” because it “no longer relies upon divinity to explain the natural world” (p.246).  Darwin thus propelled Thoreau to a radical vision in which there was “no force or intelligence behind Nature, directing its course in a determined and purposeful manner.  Nature just was” (p.246-47).

How far Thoreau would have taken these ideas is impossible to know. He became sick in December 1860, stricken with influenza, exacerbated by tuberculosis, and died in June 1862, with Americans fighting other Americans on the battlefield over the issue of slavery.

* * *

Fuller compares Darwin’s On the Origin of Species to a Trojan horse.  It entered American culture “using the newly prestigious language of science, only to attack, once inside, the nation’s cherished beliefs. . . With special and desolating force, it combated the idea that God had placed humans at the peak of creation” (p.213).  That the book’s attack did not spare even New England’s best-known abolitionists and transcendentalists demonstrates just how unsettling the attack was.

Thomas H. Peebles

La Châtaigneraie, France

May 18, 2020

The Power of Human Rights


Samantha Power, The Education of an Idealist:

A Memoir 

By almost any measure, Samantha Power should be considered an extraordinary American success story.  An immigrant from Ireland who fled the Emerald Isle with her mother and brother at a young age to escape a turbulent family situation, Power earned degrees from Yale University and Harvard Law School, rose to prominence in her mid-20s as a journalist covering civil wars and ethnic cleansing in Bosnia and the Balkans, won a Pulitzer Prize for a book on 20th century genocides, and helped found the Carr Center for Human Rights Policy at Harvard’s Kennedy School of Government, where she served as executive director — all before age 35.  Then she met an ambitious junior Senator from Illinois, Barack Obama, and her career really took off.

Between 2009 and 2017, Power served in the Obama administration almost continually, first on the National Security Council and subsequently as Ambassador to the United Nations.  In both capacities, she became the administration’s most outspoken and influential voice for prioritizing human rights, arguing regularly for targeted United States and multi-lateral interventions to protect individuals from human rights abuses and mass atrocities, perpetrated in most cases by their own governments.  In what amounts to an autobiography, The Education of an Idealist: A Memoir, Power guides her readers through the major foreign policy crises of the Obama administration.

Her life story, Power tells her readers at the outset, is one of idealism, “where it comes from, how it gets challenged, and why it must endure” (p.xii).  She is quick to emphasize that hers is not a story of how a person with “lofty dreams” about making a difference in the world came to be “educated” by the “brutish forces” (p.xii) she encountered throughout her professional career.  So what then is the nature of the idealist’s “education” that provides the title to her memoir?  The short answer probably lies in how Power learned to make her idealistic message on human rights both heard and effective within the complex bureaucratic structures of the United States government and the United Nations.

But Power almost invariably couples this idealistic message with the view that the promotion and protection of human rights across the globe is in the United States’ own national security interests; and that the United States can often advance those interests most effectively by working multi-laterally, through international organizations and with like-minded states.  The United States, by virtue of its multi-faceted strengths – economic, military and cultural – is in a unique position to influence the actions of other states, from its traditional allies all the way to those that inflict atrocities upon their citizens.

Power acknowledges that the United States has not always used its strength as a positive force for human rights and human betterment – one immediate example is the 2003 Iraq invasion, which she opposed. Nevertheless, the United States retains a reservoir of credibility sufficient to be effective on human rights matters when it chooses to do so.  Although Power is sometimes labeled a foreign policy “hawk,” she recoils from that label.  To Power, military force is among the last of the tools that should be considered to advance America’s interests around the world.

Into this policy-rich discussion, Power weaves much detail about her personal life, beginning with her early years in Ireland, the incompatibilities between her parents that prompted her mother to take her and her brother to the United States when she was nine, and her efforts as a schoolgirl to become American in the full sense of the term. After numerous failed romances, she finally met Mr. Right, her husband, Harvard Law School professor Cass Sunstein (who also served briefly in the Obama administration). The marriage produced a boy and a girl with lovely Irish names, Declan and Rían, both born while Power was in government.  With much emphasis upon her parents, husband, children and family life, the memoir is also a case study of how professional women balance the exacting demands of high-level jobs with the formidable responsibilities attached to being a parent and spouse.  It’s a tough balancing act for any parent, but especially for women, and Power admits that she did not always strike the right balance.

Memoirs by political and public figures are frequently attempts to write one’s biography before someone else does, and Power’s whopping 550-page work seems to fit this rule.  But Power writes with a candor – a willingness to admit mistakes and share vulnerabilities – that is often missing in political memoirs. Refreshingly, she also abstains from serious score settling.  Most striking for me is the nostalgia that pervades the memoir.  Power takes her readers down memory lane, depicting a now bygone time when the United States cared about human rights and believed in bi- and multi-lateral cooperation to accomplish its goals in its dealings with the rest of the world – a time that sure seems long ago.

* * *

Samantha Jane Power was born in 1970 to Irish parents, Vera Delaney, a doctor, and Jim Power, a part-time dentist.  She spent her early years in Dublin, in a tense family environment where, she can see now, her parents’ marriage was coming unraveled.  Her father put in far more time at Hartigan’s, a local pub in the neighborhood where he was known for his musical skills and “holding court,” than he did at his dentist’s office.  Although young Samantha didn’t recognize it at the time, her father had a serious alcohol problem, serious enough to lead her mother to escape by emigrating to the United States with the couple’s two children, Samantha, then age nine, and her brother Stephen, two years younger. They settled in Pittsburgh, where Samantha at a young age set about becoming American, as she dropped her Irish accent, tried to learn the intricacies of American sports, and became a fervent Pittsburgh Pirates fan.

But the two children were required under the terms of their parents’ custody agreement to spend time with their father back in Ireland. On a trip back at Christmas 1979, Samantha’s father informed the nine-year-old that he intended to keep her and her brother with him.  When her mother, who was staying nearby, showed up to object and collect her children to return to the United States, a parental confrontation ensued that would traumatize Samantha for decades.  The nine-year-old found herself caught between the conflicting commands of her two parents and, in a split-second decision, left with her mother and returned to Pittsburgh. She never again saw her father.

When her father died unexpectedly five years later, at age 47 of alcohol-related complications, Samantha, then in high school, blamed herself for her father’s death and carried a sense of guilt with her well into her adult years. It was not until she was thirty-five, after many therapy sessions, that she came to accept that she had not been responsible for her father’s death.  Then, a few years later, she made the mistake of returning to Hartigan’s, where she encountered the bar lady who had worked there in her father’s time.   Mostly out of curiosity, Power asked her why, given that so many people drank so much at Hartigan’s, her father had been the only one who died. The bar lady’s answer was matter-of-fact: “Because you left” (p.192) — not what Power needed to hear.

Power had by then already acquired a public persona as a human rights advocate through her work as a journalist in the 1990s in Bosnia, where she called attention to the ethnic cleansing that was sweeping the country in the aftermath of the collapse of the former Yugoslavia.  Power ended up writing for a number of major publications, including The Economist, the New Republic and the Washington Post.  She was among the first to report on the fall of Srebrenica in July 1995, the largest single massacre in Europe since World War II, in which around 10,000 Muslim men and boys were taken prisoner and “seemed to have simply vanished” (p.102). Although the United States and its NATO allies had imposed a no-fly zone over Bosnia, Power hoped the Clinton administration would commit to employing ground troops to prevent further atrocities. But she did not yet enjoy the clout to have a real chance at making her case directly with the administration.

Power wrote a chronology of the conflict, Breakdown in the Balkans, which was later put into book form and attracted attention from think tanks and the diplomatic, policy and media communities.  Attracting even more attention was “A Problem from Hell”: America and the Age of Genocide, her book exploring American reluctance to take action in the face of 20th century mass atrocities and genocides.  The book appeared in 2002 and won the 2003 Pulitzer Prize for General Non-Fiction.  It also provided Power with her inroad to Senator Barack Obama.

At the recommendation of a politically well-connected friend, in late 2004 Power sent a copy of the book to the recently elected Illinois Senator who had inspired the Democratic National Convention that summer with an electrifying keynote address.  Obama’s office scheduled a dinner for her with the Senator which was supposed to last 45 minutes.  The dinner went on for four hours as the two exchanged ideas about America’s place in the world and how, why and when it should advance human rights as a component of its foreign policy.  Although Obama considered Power to be primarily an academic, he offered her a position on his Senate staff, where she started working late in 2005.

Obama and Power would then be linked professionally more or less continually until the end of the Obama presidency in January 2017.   Once Obama enters the memoir, at about the one-third point, it becomes as much his story as hers. The two did not always see the world and specific world problems in the same way, but it’s clear that Obama had great appreciation both for Power’s intelligence and her intensity. He was a man who enjoyed being challenged intellectually, and plainly valued the human rights perspective that Power brought to their policy discussions even if he wasn’t prepared to push as far as Power advocated.

After Obama threw his hat in the ring for the 2008 Democratic Party nomination, Power became one of his primary foreign policy advisors and, more generally, a political operative. It was not a role that fit Power comfortably, and it threatened to be short-lived.  In the heat of the primary campaign, with Obama and Hillary Clinton facing off in a vigorously contested battle for their party’s nomination, Power was quoted in a Scottish newspaper, the Scotsman, as describing Clinton as a “monster.” The right-wing Drudge Report picked up the quotation, whose accuracy Power does not contest, and suddenly Power found herself on the front page of major newspapers, the subject of a story she did not want.  Obama’s closest advisors were of the view that she would have to resign from the campaign.  But the candidate himself, who loved sports metaphors, told Power only that she would have to spend some time in the “penalty box” (p.187).  Obama’s relatively soft reaction was an indication of the potential he saw in her and his assessment of her prospective value to him should he succeed in the primaries and the general election.

Power’s time in the penalty box had expired when Obama, having defeated Clinton for his party’s nomination, won a resounding victory in the general election in November 2008.  Obama badly wanted Power on his team in some capacity, and the transition team placed her on the President’s National Security Council as principal deputy for international organizations, especially the United Nations.  But she was also able to carve out a concurrent position for herself as the President’s Senior Director for Human Rights.  In this portion of the memoir, Power describes learning the jargon and often-arcane skills needed to be effective on the council and within the vast foreign policy bureaucracy of the United States government.  Being solely responsible for human rights, Power found that she had some leeway in deciding which issues to concentrate on and bring to the attention of the full Council.  Her mentor Richard Holbrooke advised her that she could be most effective on subjects in which the United States had limited interest – pick “small fights,” he counseled.

Power had a hand in a string of “small victories” while on the National Security Council: coaxing the United States to rejoin a number of UN agencies from which the Bush Administration had walked away; convincing President Obama to raise his voice over atrocities perpetrated by governments in Sri Lanka and Sudan against their own citizens; being appointed White House coordinator for Iraqi refugees; helping create an inter-agency board to coordinate the United States government’s response to war crimes and atrocities; and encouraging increased emphasis upon lesbian, gay, bisexual and transgender (LGBT) issues overseas.  In pursuit of the latter, Obama delivered an address at the UN General Assembly on LGBT rights, and thereafter issued a Presidential Memorandum directing all US agencies to consider LGBT issues explicitly in crafting overseas assistance (disclosure: while with the Department of Justice, I served on the department’s portion of the inter-agency Atrocity Prevention Board, and represented the department in inter-agency coordination on the President’s LGBT memorandum; I never met Power in either capacity).

But the Arab Spring that erupted in late 2010 and early 2011 presented  anything but small issues and resulted in few victories for the Obama administration.  A “cascade of revolts that would reorder huge swaths of the Arab world,” the Arab Spring ended up “impacting the course of Obama’s presidency more than any other geopolitical development during his eight years in office” (p.288), Power writes, and the same could be said for Power’s time in government.  Power was among those at the National Security Council who pushed successfully for United States military intervention in Libya to protect Libyan citizens from the predations of their leader, Muammar Qaddafi.

The intervention, backed by a United Nations Security Council resolution and led jointly by the United States, France and Great Britain, saved civilian lives and contributed to Qaddafi’s ouster and death.  But President Obama was determined to avoid a longer-term and more open-ended United States commitment, and the mission stopped short of the follow-up needed to bring stability to the country.  With civil war in various guises continuing to this day, Power suggests that the outcome might have been different had the United States continued its engagement in the aftermath of Qaddafi’s death.

Shortly after Power became US Ambassador to the United Nations, the volatile issue of an American military commitment arose again, this time in Syria in August 2013, when irrefutable proof came to light that Syrian leader Bashar al-Assad was using chemical weapons in his effort to suppress uprisings within the country.  The revelations came 13 months after Obama had asserted that use of such weapons would constitute a “red line” that would move him to intervene militarily in Syria.  Power favored targeted US air strikes within Syria.

Obama came excruciatingly close to approving such strikes.  He not only concluded that the “costs of not responding forcefully were greater than the risks of taking military action” (p.369), but was prepared to act without UN Security Council authorization, given the certainty of a Russian veto of any Security Council resolution for concerted action.  With elevated stakes for “upholding the international norm against the use of chemical weapons,” Power writes, Obama was “prepared to operate with what White House lawyers called a ‘traditionally recognized legal basis under international law’” (p.369).

But almost overnight, Obama decided that he needed prior Congressional authorization for a military strike in Syria, a decision taken seemingly with little effort to ascertain whether there was sufficient support in Congress for such a strike.  With neither the Congress nor the American public supporting military action within Syria to save civilian lives, Obama backed down.  On no other issue did Power see Obama as torn as he was on Syria,  “convinced that even limited military action would mire the United States in another open-ended conflict, yet wracked by the human toll of the slaughter.  I don’t believe he ever stopped interrogating his choices” (p.508).

Looking back at that decision with the passage of more than five years, Power remains palpably disappointed.  The consequences of inaction in Syria, she maintains, went:

beyond unfathomable levels of death, destruction, and displacement. The spillover of the conflict into neighboring countries through massive refugee flows and the spread of ISIS’s ideology has created dangers for people in many parts of the world. . . [T]hose of us involved in helping devise Syria policy will forever carry regret over our inability to do more to stem the crisis.  And we know the consequences of the policies we did choose. For generations to come, the Syrian people and the wide world will be living with the horrific aftermath of the most diabolical atrocities carried out since the Rwanda genocide (p.513-14).

But if incomplete action in Libya and inaction in Syria constitute major disappointments for Power, she considers exemplary the response of both the United States and the United Nations to the July 2014 outbreak of the Ebola virus that occurred in three West African countries, Guinea, Liberia and Sierra Leone.  United States experts initially foresaw more than one million infections of the deadly and contagious disease by the end of 2015.  The United States devised its own plan to send supplies, doctors and nurses to the region to facilitate the training of local health workers to care for Ebola patients, along with 3,000 military personnel to assist with on-the-ground logistics.  Power was able to talk President Obama out of a travel ban to the United States from the three impacted countries, a measure favored not only by Donald Trump, then contemplating an improbable run for the presidency, but also by many members of the President’s own party.

At the United Nations, Power was charged with marshaling global assistance.   She convinced 134 fellow Ambassadors to co-sponsor a Security Council resolution declaring the Ebola outbreak a public health threat to international peace and security, the largest number of co-sponsors for any Security Council resolution in UN history and the first ever directed to a public health crisis.  Thereafter, UN Member States committed $4 billion in supplies, facilities and medical treatments.  The surge of international resources that followed meant that the three West African countries “got what they needed to conquer Ebola” (p.455).  At different times in 2015, each of the countries was declared Ebola-free.

The most deadly and dangerous Ebola outbreak in history was contained, Power observes, above all because of the “heroic efforts of the people and governments of Guinea, Liberia and Sierra Leone” (p.456). But America’s involvement was also crucial.  President Obama provided what she describes as an “awesome demonstration of US leadership and capability – and a vivid example of how a country advances its values and interests at once” (p.438).  But the multi-national, collective success further illustrated “why the world needed the United Nations, because no one country – even one as powerful as the United States – could have slayed the epidemic on its own” (p.457).

Although Russia supported the UN Ebola intervention, Power more often found herself in an adversarial posture with Russia on both geo-political and UN administrative issues.  Yet she used creative diplomatic skills to develop a more nuanced relationship with her Russian counterpart, Vitaly Churkin.  Churkin, a talented negotiator and master of the art of strategically storming out of meetings, valued US-Russia cooperation and often “pushed for compromises that Moscow was disinclined to make” (p.405).  Over time, Power writes, she and Churkin “developed something resembling genuine friendship” (p.406). But “I also spent much of my time at the UN in pitched, public battle with him” (p.408).

The most heated of these battles ensued after Russia invaded Ukraine in February 2014, a flagrant violation of international law. Later that year, troops associated with Russia shot down a Malaysian passenger jet over eastern Ukraine, killing everyone aboard.  In the UN debates on Ukraine, Power found her Russian counterpart “defending the indefensible, repeating lines sent by Moscow that he was too intelligent to believe and speaking in binary terms that belied his nuanced grasp of what was actually happening” (p.426). Yet Power and Churkin continued to meet privately to seek solutions to the Ukraine crisis, none of which bore fruit.

While at the UN, Power went out of her way to visit the offices of the ambassadors of the smaller countries represented in the General Assembly, many of whom had never received a United States Ambassador.  During her UN tenure, she managed to meet personally with the ambassadors from every country except North Korea.  Power also started a group that gathered the UN’s 37 female Ambassadors together one day a week for coffee and discussion of common issues.  Some discussions involved substantive matters before the UN, but just as often the group focused on workplace issues that affected the women ambassadors as women, issues their male colleagues did not have to confront.

* * *

Donald Trump’s surprise victory in November 2016 left Power stunned.  His nativist campaign to “Make America Great Again” seemed to her like a “repudiation of many of the central tenets of my life” (p.534).  As an  immigrant, a category Trump seemed to relish denigrating, she “felt fortunate to have experienced many countries and cultures. I saw the fate of the American people as intertwined with that of individuals elsewhere on the planet.   And I knew that if the United States retreated from the world, global crises would fester, harming US interests” (p.534-35).  As Obama passed the baton to Trump in January 2017, Power left government.

Not long after, her husband suffered a near-fatal automobile accident, from which he recovered. Today the pair team-teach courses at Harvard, while Power seems to have found the time for her family that proved so elusive when she was in government.  She is coaching her son’s baseball team and helping her daughter survey rocks and leaves in their backyard.  No one would begrudge Power her quality time with her family. But her memoir will likely leave many readers wistful, daring to hope that there may someday be room again for her and her energetic idealism in the formulation of United States foreign policy.

Thomas H. Peebles

La Châtaigneraie, France

April 26, 2020


School Girls on the Front Lines of Desegregation

 

Rachel Devlin, A Girl Stands in the Door:

The Generation of Young Women Who Desegregated America’s Schools

(Basic Books)

When World War II ended, public schools in the United States were still segregated by race throughout much of the country.  Segregated schools were mandated by state legislatures in all the states of the former Confederacy (“the Deep South”), along with Washington, D.C., Delaware and Arizona, while a handful of American states barred racial segregation in their public schools.  In the remainder, the decision whether to segregate was left to local jurisdictions.  Racial segregation of public schools found its constitutional sanction in Plessy v. Ferguson, the United States Supreme Court’s 1896 decision which held that equal protection of the law under the federal constitution did not prohibit states from maintaining public facilities that were “separate but equal.”

But “separate but equal” was a cruel joke, particularly as applied to public schools: in almost every jurisdiction that maintained segregated schools, those set aside for African-Americans were by every objective standard unequal and inferior to counterpart white schools.  In 1954, the Supreme Court, in one of its most momentous decisions, Brown v. Board of Education of Topeka, Kansas, invalidated the Plessy “separate but equal” standard as applied to public schools, holding that in the school context separate was inherently unequal.  The decision preceded by a year and a half the Montgomery, Alabama, bus boycott that made both Rosa Parks and Martin Luther King, Jr., household names.  The pathway leading to Brown arguably marked the opening phase of what we now term the modern Civil Rights Movement.

That pathway has been the subject of numerous popular and scholarly works, the best known of which is Richard Kluger’s magisterial 1975 work Simple Justice.  In Kluger’s account and most others, the National Association for the Advancement of Colored People (NAACP) and its Legal Defense Fund (LDF), which instituted Brown and several of its predecessor cases, are front and center, with future Supreme Court justice Thurgood Marshall, the LDF’s lead litigator, the undisputed lead character.  Yet, Rachel Devlin, an associate professor of history at Rutgers University, maintains that earlier studies of the school desegregation movement, including that of Kluger, overlook a critical point: the students who desegregated educational institutions – the “firsts,” to use Devlin’s phrase — were mostly girls and young women.

Devlin’s research revealed that only one of the early, post-World War II primary and secondary school desegregation cases that paved the way to the Brown decision was filed on behalf of a boy.  Looking at those who “attempted to register at white schools, testified in court, met with local white administrators and school boards, and talked with reporters from both the black and white press,” Devlin saw almost exclusively schoolgirls.  This disparity “held true in the Deep South, upper South, and Midwest” (p.x). After the Brown decision, the same pattern prevailed: “girls and young women vastly outnumbered boys as the first to attend formerly all-white schools” (p.x).

Unlike Kluger, Devlin does not focus on lawyers and lawsuits but rather on the “largely young, feminine work that brought school desegregation into the courts” (p.xi).  She begins with court challenges to state enforced segregation at the university level, some of which began before World War II.  She then proceeds to a host of post-World War II communities where families challenged racial segregation in primary and secondary schools in the late 1940s and early 1950s.  The Brown decision itself, a ruling on segregated schools in Topeka, Kansas, merits only a few pages, after which she portrays the first African-American students to enter previously all-white schools during the second half of the 1950s and into the 1960s.  The pre-Brown challenges to segregated public education that Devlin highlights took place in Washington, D.C., Kansas, Delaware, Texas and Virginia. In her post-Brown analysis, she turns to the Deep South, to communities in Louisiana, Georgia and South Carolina.

Devlin’s intensely factual and personality-driven narrative at times falls victim to a forest-and-trees problem: she focuses on a multitude of individuals — the trees — to the point that the reader  can easily lose sight of the forest — how the featured individuals fit into the overall school desegregation movement.  Yet, there are a multitude of lovely trees to behold in Devlin’s forest – heroic and endearing schoolgirls and the adults who supported them, both men and women, all willing to confront entrenched racial segregation in America’s public schools.

* * *

School desegregation, Devlin writes, differed from other civil rights battles, such as desegregation of lunch counters, public transportation, and parks, in that interacting with white people was not “fleeting or ‘fortuitous,’ but central to the project itself.  School desegregation required sustained interactions with white school officials and students. This fact called for a different approach than other forms of civil rights activism” (p.xxiv).   But Devlin also emphasizes that this different approach gave rise to controversy among affected African-Americans.

In almost every community she studied, there was a dissident African-American faction that opposed desegregation of all-white schools, favoring direct pressure and court cases designed to force school authorities to make good on the “equal” portion of “separate but equal.”  Parents who favored this less frontal approach, while “willing to protest unequal schools, simply wanted a better education for their children while they were still young enough to receive it, not a long, hard campaign against a long-standing Supreme Court precedent” (p.167).  Devlin demonstrates that this quest for equalization, however understandable, was at best quixotic. Time and time again, she shows, the white power structure in the communities she studies had no serious intention of equalizing black and white schools.

Why girls and young women predominated in school desegregation efforts is as much a part of Devlin’s story as the particulars of those efforts at the institutions and in the communities she studies.  After WWII, she notes, there was a “strong, though unstated, cultural assumption that the war to end school segregation was a girls’ war, a battle for which young women and girls were specially suited” (p.xvi).  With the example of boys and young men who had gone off to fight in World War II fresh in everyone’s minds, Devlin speculates, girls and young women may have felt an “ethical compulsion to act at a young age” (p.xvi).

Devlin was able to interview several of the female firsts for her book as they looked back on their experience in desegregating schools several decades earlier.  These women, she indicates, had been inspired as school girls “not only by a sense of obligation and individual calling but also by the opportunity to do something important and highly visible in a world and at a time when young women did not often earn much public acclaim” (p.225). The boys and young men she studied, by contrast, manifested a “desire to distance themselves from an overt, individual commitment to desegregating schools” (p.223).  Leaving was more of an option for high school age boys who felt alienated in newly desegregated schools.  They had “more mobility – and autonomy – than young women, and it allowed them to walk away from the school desegregation process when they felt it was not working for them” (p.196).   Leaving for girls “did not feel like a choice, both because they understood their parents’ expectations of them and because they had fewer alternatives” (p.196).

* * *

The pathway to Brown in Devlin’s account starts at the university level with Lucille Bluford and Ada Lois Sipuel, two lesser-known women who were denied admission because of their race to, respectively, the University of Missouri School of Journalism and the University of Oklahoma Law School.  Both saw their court cases overshadowed by those of men, Lloyd Gaines and Herman Sweatt, pursuing university-level desegregation in court at the same time.  But while the two men’s cases established major Supreme Court precedents, both proved to be disappointing plaintiffs and spokesmen for the desegregation cause, in sharp contrast to Bluford and Sipuel.

Gaines was the beneficiary of one of the Supreme Court’s first major decisions involving higher education, Gaines v. Canada, in which the Court ruled in 1938 that the State of Missouri was required either to admit Gaines to the University of Missouri Law School or create a separate facility for him.  Missouri chose the latter option, which Gaines refused.  But he thereafter went missing: he was last seen taking a train to Chicago and was never heard from again.  Bluford, then a seasoned journalist working for the African-American newspaper the Kansas City Call, not only covered the Gaines litigation but also set out to gain admission herself to the University of Missouri’s prestigious School of Journalism.

Both “hardheaded and gregarious” (p.32), Bluford doggedly pursued admission to the university’s journalism school between 1939 and 1942.  In her court case, her lawyer, the NAACP’s Charles Houston, provided the book’s title in his closing argument when he told the court: “A girl stands at the door and a generation waits outside” (p.27).  When Bluford won a victory in court in 1942, Missouri chose to close its journalism school, citing low wartime enrollment, rather than admit Bluford.  But with her uncanny ability to find “significance in small acts of decency and mutual acknowledgement in everyday encounters” (p.11), Bluford turned her energies to reporting on school desegregation cases throughout the country, including both Sipuel’s quest to enter the University of Oklahoma Law School and the Kansas desegregation cases that led to Brown.

Sipuel agreed to challenge the University of Oklahoma Law School’s refusal to admit African-Americans only after her brother Lemuel turned down the NAACP’s request to serve as plaintiff in the case.  In 1946, she refused Oklahoma’s offer to create a separate “Negro law school,” and two years later won a major Supreme Court case when the Court ruled that Oklahoma was obligated to provide her with legal education equal to that of whites.  Sipuel became the near perfect first at the law school, Devlin writes, personifying the uncommon array of skills required in that sensitive position:  “personal ambition combined with an ability to withstand public humiliation, charisma in front of the camera and self-sacrificing patience, the appearance of openness with the black and white press corps alongside an implacable determination” (p.67).

The “girl who started the fight,” as one black newspaper described Sipuel, became “something of a regional folk hero” (p.52) as a role model for future desegregation plaintiffs.  The “revelation that school desegregation was in their grasp came not from the persuasive power of NAACP officials and lawyers,” Devlin writes, but from the “‘young girl’ who would not be turned down” (p.37).  Sipuel went on to become the law school’s first African American graduate and thereafter the first African-American to pass the Oklahoma bar.

Sipuel’s engaging and exuberant public persona contrasted with that of Herman Sweatt, who sought to enter the University of Texas’s flagship law school in Austin.  In a 1950 case bearing his name, Sweatt v. Painter, the Supreme Court rejected Texas’ contention that it could satisfy the requirements of the constitution’s equal protection clause by consigning Sweatt to a “Negro law school” it had established in Houston.  The Court’s sweeping decision outlawed segregation in its entirety in graduate school education.  But although Sweatt did not go missing in action like Lloyd Gaines, he never completed his course of study at the University of Texas Law School and proved to be ill suited to the high-visibility, high-pressure role of a desegregation plaintiff.  He exuded neither Sipuel’s enthusiastic commitment to desegregated higher education, nor her grace under fire.

As the Supreme Court was rewriting the rules of university-level education, dozens of cases challenging primary and secondary school segregation were percolating in jurisdictions across America, with Washington, D.C., and Merriam, Kansas, near Kansas City, providing the book’s most memorable characters.  Rigidly segregated Washington, the nation’s capital, had several lawsuits going simultaneously, each of which featured a strong father standing behind a courageous daughter.

First out of the gate was 14-year-old Marguerite Carr.  Amidst much fanfare, in 1947 Marguerite’s father took her to enroll at a newly built white middle school two blocks from her home, where she faced off with the school principal.  When the principal told her, “you don’t want to come here,” Carr smiled, a “sign of social reciprocity, trustworthiness, a willingness to engage,” yet at the same time told the principal respectfully but firmly, “I do want to come to this school” (p.ix).  Carr’s response was pitch perfect, Devlin argues, meeting the “contradictory requirements inherent in such confrontations” (p.ix).

Marguerite’s court case coincided with that of Karla Galaza, a Mexican-American who had been attending  a black vocational school with a strong program in dress design until school authorities discovered that she was not black and barred her from the school.  Her stepfather, a Mexican-American activist, filed suit on his daughter’s behalf.  Simultaneously, Gardner Bishop surged into a leadership position during an African-American student strike challenging segregated education in Washington.  Bishop, by day a barber, was an activist who thrust his somewhat reluctant daughter Judine into the strike and subsequent litigation.  Bishop described himself as an outsider in Washington’s desegregation battle, representing the city’s African-American working class rather than its black middle class.  None of these cases culminated in a major court decision.

The NAACP later chose Spotswood Bolling as the lead plaintiff over a handful of girls in the lawsuit that accompanied Brown to the Supreme Court.  The young Bolling was another elusive male plaintiff, dodging all reporters and photographers.  His discomfort with the press “sets in high relief the performances of girl plaintiffs with reporters in the late 1940s” (p.173), Devlin argues.  Girls and young women “felt it was their special responsibility to find ways to address such inquiries. Bolling evidently did not” (p.174).   But the case bearing his name, Bolling v. Sharpe, decided at the same time as Brown, held that segregation in Washington’s public schools was unconstitutional even though, as a federal district rather than a state, Washington was not technically bound by the constitution’s equal protection clause.

In South Park, Kansas, an unincorporated section of Merriam, located outside Kansas City, Esther Brown, arguably the book’s most unforgettable character, led a student strike over segregated schools.  Brown, a 23-year-old Jewish woman, a committed radical and communist sympathizer, cast herself as merely a “housewife with a conscience” — a “deliberately humble, naïve, and conservative image” (p.108) that she invoked constantly in her dealings with the public.  Lucille Bluford covered the strike for the Kansas City Call.  Bluford and the “White Mrs. Brown,” as she was called, subsequently became friends (Esther Brown was not related to Oliver Brown, the named plaintiff in the Brown case).

During the South Park student strike, Esther Brown went out on a limb to promise that she would find a way to pay the teachers herself.  She organized a Billie Holiday concert, but most of her fundraising targeted people of modest means – farmers, laborers, and domestics.  She eventually persuaded Thurgood Marshall that the NAACP should initiate a court case, despite Marshall’s initial reservations — he was suspicious of what he described as a “one woman show” (p.125).  Although the lawsuit was filed on behalf of equal numbers of boys and girls, Patricia Black, then eight years old, was chosen to testify in court — “setting another pattern of female participation for the cases to come” (p.111).  Black, who wore a white bow in her hair when she testified, reflected years later that she had been “taught how to act,” which meant “having manners . . . sitting up straight . . . making eye contact, being erect, and [being] nice” (p.139).

The South Park lawsuit led to the NAACP’s first major desegregation victory below the university level.  Black grade school students successfully entered the white school in the fall of 1949. The South Park case also inspired the challenge to segregated schooling in Topeka that culminated in the Brown decision.  At the trial in Brown, it was a 9-year-old girl, Kathy Cape, rather than the named plaintiff, Oliver Brown, who accepted the personal risk and outsized responsibility of testifying.

With the Supreme Court’s ruling in Brown meriting barely more than a page, Devlin turns in the last third of the book to the schoolgirls who entered previously all-white schools in the aftermath of the ruling.  Here, more than in earlier portions of the book, she describes in stark terms the white opposition to desegregation which, although widespread, was especially ferocious in the Deep South, where the “vast majority of school boards angrily fought school desegregation with every resource available to them” (p.192).  Devlin notes that between 1955 and 1958, southern legislatures passed nearly five hundred laws to impede implementation of Brown.

In New Orleans, three girls, Tessie Prevost, Leona Tate and Ruby Bridges, were chosen to be firsts as eight-year-olds at Semmes Elementary School.  Years later, Tessie described to Devlin what she, Leona and Ruby had endured at Semmes.  Administrators, teachers, and fellow pupils “did everything in their power to break us” (p.213-14), Prevost recounted.  Even teachers incited violence against the girls:

The teachers were no better than the kids. They encouraged them to fight us, to do whatever it took.  Spit on us. We couldn’t even eat in the cafeteria; they’d spit on our food – we could hardly use the restrooms  . . . They’d punch you, trip you, kick you . . . They’d push you down the steps . . . I got hit by a bat . . . in the face . . . It was every day. And the teachers encouraged it . . . Every day.  Every day (p.214).

The New Orleans girls’ experience was typical of the young firsts from the other Southern communities Devlin studied, including Baton Rouge, Louisiana, Albany, Georgia and Charleston, South Carolina.  Nearly all experienced relentless abuse, “not simply violence and aggression but a systemic, all encompassing, organized form of endless oppression” (p.214). Throughout the South, black schoolgirls demonstrated an extraordinary ability to “withstand warfare within the school when others could not,” which Devlin characterizes as a “barometer of their determination, courage, ability, and strength” (p.218).

* * *

Devlin acknowledges a growing contemporary disillusionment with the Brown decision and school integration generally among legal scholars, historians and ordinary African-Americans.  But the school desegregation firsts who met with Devlin for this book uniformly believe that their actions more than a half-century earlier had “transformed the arc of American history for the better” (p.268).   Even if Brown no longer occupies quite the exalted place it once enjoyed in the iconography of the modern Civil Rights Movement, the schoolgirls and supporting adults whom Devlin portrays in this deeply researched account deserve our full admiration and gratitude.

 

Thomas H. Peebles

La Châtaigneraie, France

April 8, 2020

 


A Time for New Thinking

 

Arthur Haberman, 1930: Europe in the Shadow of the Beast

(Wilfrid Laurier University Press) 

 

            Anxiety reigned in Europe in 1930.  The Wall Street stock market crash of the previous October and the ensuing economic crisis spreading across the globe threatened to undo much of the progress Europe had made in recovering from the self-inflicted catastrophe of World War I.  A new form of government termed fascism was firmly in place in Italy, based on xenophobic nationalism, irrationality, and an all-powerful state.  Fascism seemed antithetical in just about every way to the universal, secular and cosmopolitan values of the 18th century Enlightenment.  In what was by then known as the Soviet Union, moreover, the Bolsheviks who had seized control during World War I were firmly in power in 1930 and were still threatening, as they had in the immediate post-war years, to spread anti-capitalist revolution westward across Europe.  And in Germany, Adolf Hitler and his unruly Nazi party realized previously unimaginable success in legislative elections in 1930, as they challenged the fragile Weimar democracy.  But if anti-democratic political movements and economic upheavals made average citizens across Europe anxious in 1930, few foresaw the extent of the carnage and destruction that the next 15 years would bring. Things were about to get worse — much worse.

In 1930: Europe in the Shadow of the Beast, Arthur Haberman, professor of history and humanities at York University, seeks to capture the intellectual and cultural zeitgeist of 1930. “What makes 1930 such a watershed is that rarely have so many important minds worked independently on issues so closely related,” Haberman writes. “All argued that something was seriously amiss and asked that people become aware of the dilemma” (p.1).  Haberman focuses on how a handful of familiar thinkers and artists expressed the anxiety that their fellow citizens felt; and how, in different ways, these figures foreshadowed the calamities that lay ahead for Europe.  There are separate chapters on Thomas Mann, Virginia Woolf, Aldous Huxley, Ortega y Gasset, Bertolt Brecht, and Sigmund Freud, each the subject of a short biographical sketch.  But each either published a major work or had one in progress in the 1929-31 time frame, and Haberman’s sketches revolve around these works.  He also includes the lesser-known Nardal sisters, Paulette and Jane, two Frenchwomen of African descent who promoted writing that expressed identity and solidarity among blacks in Europe, the Americas and Africa.  Another chapter treats the visual arts in 1930, with a dissection of the various schools and tendencies of the time, among them surrealism, cubism, and fauvism.

But before getting to these figures and their works, Haberman starts with a description of an unnamed, composite European middle-class couple living in a major but unidentified city in one of the World War I belligerent nations.  With all the maimed young men walking the streets using canes and crutches, the “metaphor of sickness and a need to be healed was part of everyday life” (p.7) for the couple.  The couple’s unease was “mirrored by the intellectuals they admired, as they all grappled with what Europe had become and where it was heading” (p.15).

In an extensive final chapter, “Yesterday and Today,” and an Epilogue, “Europeans Today” — together about one quarter of the book — Haberman assigns himself the herculean task of demonstrating the continued relevance of his figures in contemporary Europe.   Here, he seeks to summarize European anxiety today and the much-discussed European crisis of confidence, especially in the aftermath of the 2008 economic downturn.  It’s an overly ambitious undertaking and the least successful portion of the book.

The key figures Haberman portrays in the book’s first portions were a diverse lot, and it would be an uphill task to tie them together into a neat conceptual package. But if there is a common denominator linking them, it is the specter of World War I, the “Great War,” and the reassessment of Western civilization that it prompted.  The Great War ended the illusion that Europe was at the forefront of civilization and introduced a “deep cultural malaise” (p.6).  The “so-called most civilized people on earth committed an unprecedented carnage on themselves” (p.36).  It was thus necessary to think in new ways.

Haberman identifies a cluster of related subjects that both represented this new thinking and heightened the anxiety that average Europeans were sensing about themselves and their future in 1930. They include: the viability of secular Enlightenment values; coming to terms with a darker view of human nature; the rise of the politics of irrationality; mass culture and its dangers; fascism as a norm or aberration; identity and the Other in the midst of Western Civilization; finding ways to represent the post war world visually; and dystopian trends of thought.  The new thinking thus focused squarely on what it meant to be European and human in 1930.

* * *

            None of the figures in Haberman’s study addressed more of these subjects in a single work than the Spanish thinker Ortega y Gasset, whose Revolt of the Masses appeared in 1930.  Here, Ortega confronted the question of the viability of liberal democracy and the durability of the Enlightenment’s core values.  Ortega emphasized the potential under liberal democracy for irrationality and emotion to override reason in determining public choices.  He described a new “mass man” who behaved through “instinct and desire,” could be “violent and brutal” (p.55), and “will accept, even forward, both political and social tyranny” (p.53).  Ortega referred to Bolshevism and Fascism as “retrograde and a new kind of primitivism” (p.54).  The two ideologies, he concluded, gave legitimacy to the brutality he saw cropping up across Europe.

Although Ortega posited a dark view of human nature, it was not far from what had been apparent in the works of Sigmund Freud for decades prior to 1930.  Freud, whom Haberman ranks on a par with Einstein as the most famous and influential intellect of his time, was 74 years old in 1930.  Although ill with throat cancer that year, Freud used an extended essay, Civilization and its Discontents, to reflect upon the conscious and unconscious, on sanity, insanity, and madness, and on the contradictions we live with.  His reflections became “central to how humans understood themselves as individuals and social beings” (p.143).

Culture and civilization are more fragile than we had thought, Freud contended. We must constantly reinforce those things that keep civilization going: “the limitations on our sexual life, the rule of law, the restrictions on our aggressive nature, and the hopeless commandment to love our neighbors, even if we don’t like them” (p.150).  The insights from Civilization and its Discontents and Freud’s other works were taken up in literature, art and the study of religion, along with philosophy, politics and history.  These Freudian insights opened for discussion “matters that had been sealed” (p.162), changing the way we think about ourselves and our nature.  Freud “tried to be a healer in a difficult time,” Haberman writes, one who “changed the discourse about humans and society forever” (p.162).

Virginia Woolf claimed she had not read Freud when she worked on The Waves, an experimental novel, throughout 1930.  The Waves nonetheless seemed to echo Freud, especially in its idea that the unconscious is a “layer of our personality, perhaps the main layer.  All of her characters attempt to deal with their inner lives, their perceptions” (p.44). In The Waves, Woolf adopted the idea that human nature is “very complex, that we are sometimes defined by our consciousness of things, events, people and ourselves, and that there are layers of personality” (p.43).  There are six narrative voices in The Waves, and the characters sometimes seem to meld into one another.

Woolf had already distinguished herself as a writer heavily invested in the women’s suffrage movement and had addressed in earlier writings how women can achieve freedom independently of men.  Haberman sees Woolf as part of a group of thinkers who “set the stage for the more formal introduction of existentialism after the Second World War . . . She belongs not only to literature but to modern philosophy” (p.46).

With Mario and the Magician, completed in 1930, novelist Thomas Mann made his first explicit foray into political matters.  Mann, as famous in Germany as Woolf was in Britain, suggested in his novel that culture and politics were intertwined in 1930 as never before.  By that year, Mann had become an outspoken opponent of the Nazi party, which he described as a “wave of anomalous barbarism, of primitive popular vulgarity” (p.29).  Mario and the Magician, involving a German family visiting Italy, addressed the implications of fascism for Italy and Europe generally.

Like Ortega, Mann in his novel examined the “abandonment of personality and individual responsibility on the part of the person who joins the crowd” (p.24).  Like Freud, Mann saw humanity as far more irrational and complicated than liberal democracy assumed.  The deified fascist leader in Mann’s view goes beyond offering simply policy solutions to “appeal to feelings deep in our unconscious and [tries] to give them an outlet” (p.24).  Mann was in Switzerland when the Nazis assumed power in 1933.  His children advised him not to return to Germany, and he did not do so until 1949.  He was stripped of his German citizenship in 1936 as a traitor to the Reich.

Still another consequential novel, Aldous Huxley’s Brave New World, in progress during this period though not published until 1932, was one of the 20th century’s first overtly dystopian works of fiction, along with Yevgeny Zamiatin’s We (both influenced George Orwell’s 1984, as detailed in Dorian Lynskey’s study of Orwell’s novel, reviewed here last month).   Brave New World used “both science and psychology to create a future world where all are happy, there is stability, and conflict is ended” (p.132).  The dystopian novel opened the question of the ethics of genetic engineering.   In 1930, eugenics was considered a legitimate branch of science, a way governments sought to deal with the undesirables in their population, especially those they regarded as unfit.  Although bioethics was not yet a field in 1930, Huxley’s Brave New World made a contribution to its founding.  Huxley’s dystopian work is a “cautionary tale that asks what might happen next.  It is science fiction, political philosophy, ethics, and a reflection on human nature all at once” (p.132).

Haberman’s least familiar figures, and for that reason perhaps the most intriguing, are the Nardal sisters, Paulette and Jane, French citizens of African descent, born in Martinique and living in 1930 in Paris.  The sisters published no major works equivalent to Civilization and Its Discontents or Revolt of the Masses.  But they founded the highly consequential La Revue du Monde Noir, a bilingual French and English publication that featured contributions from African-American writers associated with the Harlem Renaissance, along with French-language intellectuals.   Writings in La Revue challenged head-on the notions underlying French colonialism.

Although France in 1930 was far more welcoming to blacks than the United States, the French vision of what it meant to be black was, as Haberman puts it, a “colonialist construction seen through the eyes of mainly white, wealthy elites” (p.89) that failed to acknowledge the richness and variety of black cultures around the world.  Educated blacks in France were perceived as being “in the process of becoming cosmopolitan, cultured people in the French tradition, a process they [the French] called their mission civilisatrice” (p.89).  Like many blacks in France, Paulette and Jane Nardal “refused to accept this formulation and decided that their identity was more varied and complex than anything the French understood” (p.89).

The Nardal sisters advanced the notion of multiple identities, arguing that the black spirit could be “informed and aided by the association with the West, without losing its own core” (p.92).   Blacks have an “alternative history from that of anyone who was white and born in France. Hence, they needed to attempt to get to a far more complex concept of self, one deeper and richer than those in the majority and the mainstream” (p.100).   The Nardals also came to understand the connection between black culture in Europe and gender.  Black women, “like many females, are a double Other, and this makes them different not only from whites but from Black men as well” (p.101; but conspicuously missing in this work is any sustained discussion of the Jew as the Other, even though anti-Semitism was rising alarmingly in Germany and elsewhere in Europe in 1930).

Between 1927 and 1933, Bertolt Brecht collaborated with Kurt Weill to rethink theatre and opera.  Brecht, alone among the thinkers Haberman portrays, brought an explicit Marxist perspective to his work.  Brecht supplied both the lyrics and dialogue to the pair’s plays, while Weill composed the music.   The Threepenny Opera, their joint work first performed in Berlin in 1928, was a decidedly non-traditional opera that proved to be a spectacular success in Weimar Germany.

In 1930, Brecht and Weill produced The Rise and Fall of the City of Mahagonny, an even less traditional production.  Brecht termed Mahagonny “epic theatre,” whose purpose was “not to entertain or provide the audience with an imitation of their lives” (p.70), but rather to engage the audience in issues of social justice.  Epic theatre was designed to “force the spectator to be active, to query his own assumptions” (p.78).

Haberman describes Mahagonny as an angry anti-capitalist production, a strange sort of “utopia of desire,” where money rules.  Its lesson: in a capitalist society, all is “commoditized, no relationship is authentic . . . [M]oney cannot satisfy human needs” (p.81-82).  The Nazis, who enjoyed increased popular support throughout 1930, regularly demonstrated against Mahagonny performances. Both Brecht and Weill fled Germany when the Nazis came to power in early 1933.  Neither The Threepenny Opera nor Mahagonny was performed again in Germany until after World War II.

Haberman sees Brecht and Weill as stage and musical companions to surrealist painters such as René Magritte and Salvador Dali, who were also juxtaposing traditional elements to force audiences to ask what was really going on.  Magritte’s The Key to Dreams, a name that is a direct reference to Freud, was a painting about painting and how we construct reality.  Words are not the objects themselves, Magritte seemed to be saying.  Paintings can refer to an object but are not the object itself.   Salvador Dali was the rising star of surrealism in 1930.  His paintings were at once “provocative, mythic, and phallic, while also using juxtaposition to great effect” (p.115).  As with Magritte, the code of understanding in Dali paintings is “closer to Freudian psychology than it is to ‘reason’” (p.115).

The most transformative shift in the visual arts by 1930 was the abandonment of mimesis, the idea that a work of art should represent external reality.  Artists from the many varying schools regarded external reality as “just appearance, not reality at all.  Now it was necessary to go through or beyond appearance to investigate what was real” (p.107).  Artists like Pablo Picasso, Georges Braque and Henri Matisse “wanted a painting to be seen holistically before being analyzed in its parts” (p.118). Like Woolf in literature, these artists by 1930 were depicting “multiple realities,” with the “whole, deep world of the unconscious guiding us” (p.108).

In the end, Haberman concludes, the perspective of the major artists of 1930 was in line with that of the writers he portrays. All in their own way:

feared where humanity was headed, in some cases they feared what they discovered about human nature. They wrote and created art. They did so in order to both help us know about ourselves and offer some redemption for a hard time. They did so because, in spite of their fears, and in spite of their pessimism, they had hope that our better nature would triumph.   Their works are as relevant today as they were in 1930 (p.212).

* * *

                        Articulating their contemporary relevance is the purpose of Haberman’s extensive final chapter and epilogue, where he also seeks to summarize contemporary Europe’s zeitgeist.  The Enlightenment faith in the idea and inevitability of progress has now “more or less ended,” he argues, and the world “no longer seems as automatically better as time moves on” (p.171) – the core insight which World War I provided to the generation of 1930.  The politics of irrationality of the type that so worried Ortega seems again resurgent in today’s Europe.  Nationalism – in Haberman’s view, the most influential of the modern ideologies born in the 19th century – “persists and appears to be growing in Europe in a more frightening manner, in the rise of racist neo-fascist and quasi-fascist parties in many countries. What was once thought impossible after the defeat of Hitlerian Germany is now coming into being” (p.168).

Despite the rise of European social democracy in the aftermath of World War II, there is a trend toward concentration of wealth in fewer and fewer hands, with the gap between the rich and poor widening.   Traditional religion has less hold on Europeans today than it did in 1930 — although it had no apparent hold on any of the writers and artists Haberman features.  The question of the place for the Other – marginalized groups like the blacks of the Nardal sisters’ project – has come to the fore in today’s Europe.  Haberman frames the question as whether today’s Europe, theoretically open, liberal, tolerant and egalitarian, is so “only for those who conform to the norm – who are white, indigenous to whatever place they live, nominally or deeply Christian, and identifying strongly with the nation.”  Or is there something “built into European culture as it is taught and practiced that automatically marginalizes women, Blacks, Jews, Roma, and Muslims?” (p.185).

After posing this unanswerable question, Haberman finishes by returning to his composite couple, explaining how their lives were changed by events between 1930 and 1945.  They lost a son in battle in World War II and some civilian relatives were also killed.  Haberman then fast-forwards to the couple’s granddaughter, born in 1982, who married at age 30 and is now pregnant.   She and her husband are ambivalent about their future.  Peace is taken for granted in the way it was not in 1930.  But there is pessimism in the economic sphere.  The couple sees the tacit social contract between generations fraying. The issues that move the couple most deeply are the environment and concerns about climate change.

* * *

               Through his individual portraits, Haberman provides a creative elaboration upon the ideas which leading thinkers and artists wrestled with in the anxious year of 1930.  Describing contemporary applications of these ideas, as he attempts to do in the latter portion of his work, would be a notable accomplishment for an entire book, and his attempt to do so here falls flat.

Thomas H. Peebles

La Châtaigneraie, France

March 15, 2020

Filed under European History, Intellectual History

A Defense of Truth

Dorian Lynskey, The Ministry of Truth:

The Biography of George Orwell’s 1984 

                           George Orwell’s name, like that of William Shakespeare, Charles Dickens and Franz Kafka, has given rise to an adjective.  “Orwellian” connotes official deception, secret surveillance, misleading terminology, and the manipulation of history.   Several terms used in Orwell’s best-known novel, Nineteen Eighty-Four, have entered into common usage, including “doublethink,” “thoughtcrime,” “Newspeak,” “memory hole,” and “Big Brother.”  First published in June 1949, a little over half a year before Orwell’s death in January 1950, Nineteen Eighty-Four is consistently described as a “dystopian” novel – a genre of fiction which, according to Merriam-Webster, pictures “an imagined world or society in which people lead wretched, dehumanized, fearful lives.”

This definition fits neatly the world that Orwell depicted in Nineteen Eighty-Four, a world divided among three intercontinental superstates perpetually at war, Oceania, Eurasia and Eastasia, with Britain reduced to a province of Oceania bearing the sardonic name “Airstrip One.”  Airstrip One is ruled by The Party under the ideology Ingsoc, a shortening of “English socialism.”  The Party’s leader, Big Brother, is the object of an intense cult of personality — even though there is no hard proof he actually exists.  Surveillance through two-way telescreens and propaganda are omnipresent.  The protagonist, Winston Smith, is a diligent lower-level Party member who works at the Ministry of Truth, where he rewrites historical records to conform to the state’s ever-changing version of history.  Smith enters into a forbidden relationship with his co-worker, Julia, a relationship that terminates in mutual betrayal.

In his intriguing study, The Ministry of Truth: The Biography of George Orwell’s 1984, British journalist and music critic Dorian Lynskey seeks to explain what Nineteen Eighty-Four “actually is, how it came to be written, and how it has shaped the world, in its author’s absence, over the past seventy years” (p.xiv). Although there are biographies of Orwell and academic studies of Nineteen Eighty-Four’s intellectual context, Lynskey contends that his is the first to “merge the two streams into one narrative, while also exploring the book’s afterlife” (p.xv; I reviewed Thomas Ricks’ book on Orwell and Winston Churchill here in November 2017).   Lynskey’s work is organized in a “Before/After” format.  Part I, about two-thirds of the book, looks at the works and thinkers who influenced Orwell and his novel, juxtaposed with basic Orwell biographical background.  Part II, roughly the last third, examines the novel’s afterlife.

But Lynskey begins in a surprising place, Washington, D.C., in January 2017, where a spokesman for President Donald Trump told the White House press corps that the recently elected president had taken his oath of office before the “largest audience to ever witness an inauguration – period – both in person and around the globe.”  A presidential adviser subsequently justified this “preposterous lie” by characterizing the statement as “alternative facts” (p.xiii).   Sales of Orwell’s book shot up immediately thereafter.  The incident constitutes a reminder, Lynskey contends, of the “painful lessons that the world appears to have unlearned since Orwell’s lifetime, especially those concerning the fragility of truth in the face of power” (p.xix).

How Orwell came to see the consequences of mutilating truth and gave them expression in Nineteen Eighty-Four is the focus of Part I.  Orwell’s brief participation in the Spanish Civil War, from December 1936 through mid-1937, was paramount among his personal experiences in shaping the novel’s worldview. Spain was the “great rupture in his life; his zero hour” (p.4), the experience that led Orwell to the conclusion that Soviet communism was as antithetical as fascism and Nazism to the values he held dear (Lynskey’s list of Orwell’s values: “honesty, decency, fairness, memory, history, clarity, privacy, common sense, sanity, England, and love” (p.xv)).  While no single work provided an intellectual foundation for Nineteen Eighty-Four in the way that the Spanish Civil War provided the personal and practical foundation, Lynskey discusses numerous writers whose works contributed to the worldview on display in Orwell’s novel.

Lynskey dives deeply into the novels and writings of Edward Bellamy, H.G. Wells and the Russian writer Yevgeny Zamyatin.  Orwell’s friend Arthur Koestler set out what Lynskey terms the “mental landscape” for Nineteen Eighty-Four in his 1940 classic Darkness at Noon, while the American conservative James Burnham provided the novel’s “geo-political superstructure” (p.126).  Lynskey discusses a host of other writers whose works in one way or another contributed to Nineteen Eighty-Four’s worldview, among them Jack London, Aldous Huxley, Friedrich Hayek, and the late 17th and early 18th century satirist Jonathan Swift.

In Part II, Lynskey treats some of the dystopian novels and novelists that have appeared since Nineteen Eighty-Four.  He provides surprising detail on David Bowie, who alluded to Orwell in his songs and wrote material that reflected the outlook of Nineteen Eighty-Four.  He notes that Margaret Atwood termed her celebrated The Handmaid’s Tale a “speculative fiction of the George Orwell variety” (p.241).  But the crux of Part II lies in Lynskey’s discussion of the evolving interpretations of the novel since its publication, and why it still matters today.  He argues that Nineteen Eighty-Four has become both a “vessel into which anyone could pour their own version of the future” (p.228), and an “all-purpose shorthand” for an “uncertain present” (p.213).

In the immediate aftermath of its publication, when the Cold War was at its height, the novel was seen by many as a lesson on totalitarianism and the dangers that the Soviet Union and Communist China posed to the West (Eurasia, Eastasia and Oceania in the novel correspond roughly to the Soviet Union, China and the West, respectively).  When the Cold War ended with the fall of the Soviet Union in 1991, the novel morphed into a warning about the invasive technologies spawned by the Internet and their potential for surveillance of individual lives.  In the Age of Trump and Brexit, the novel has become “most of all a defense of truth . . . Orwell’s fear that ‘the very concept of objective truth is fading out of the world’ is the dark heart of Nineteen Eighty-Four. It gripped him long before he came up with Big Brother, Oceania, Newspeak or the telescreen, and it’s more important than any of them” (p.265-66).

* * *

                            Orwell was born as Eric Blair in 1903 in India, where his father was a mid-level civil servant. His mother was half-French and a committed suffragette.  In 1933, prior to publication of his first major book, Down and Out in Paris and London, which recounts his life in voluntary poverty in the two cities, the fledgling author took the pen name Orwell from a river in Suffolk.  He changed names purportedly to spare his parents the embarrassment he assumed his forthcoming work would cause.  He was at best a mid-level journalist and writer when he went to Spain in late 1936, with a handful of novels and lengthy essays to his credit – “barely George Orwell” (p.4), as Lynskey puts it.

The Spanish Civil War erupted after Spain’s Republican government, known as the Popular Front, a coalition of liberal democrats, socialists and communists, narrowly won a parliamentary majority in 1936, only to face a rebellion from the Nationalist forces of General Francisco Franco, representing Spain’s military, business elites, large landowners and the Catholic Church.  Nazi Germany and Fascist Italy furnished arms and other assistance for the Nationalists’ assault on Spain’s democratic institutions, while the Soviet Union assisted the Republicans (the leading democracies of the period, Great Britain, France and the United States, remained officially neutral; I reviewed Adam Hochschild’s work on the Spanish Civil War here in August 2017).   Spain provided Orwell with his first and only personal exposure to the “nightmare atmosphere” (p.17) that would envelop the novel he wrote a decade later.

Fighting with the Workers’ Party of Marxist Unification (Spanish acronym: POUM), a renegade working class party that opposed Stalin, Orwell quickly found himself in the middle of what amounted to a mini-civil war among the disparate left-wing factions on the Republican side, all within the larger civil war with the Nationalists.  Orwell saw first-hand the dogmatism and authoritarianism of the Stalinist left at work in Spain, nurtured by a level of deliberate deceit that appalled him.  He read newspaper accounts that bore no relationship to what had actually happened. For Orwell previously, Lynskey writes:

people were guilty of deliberate deceit or unconscious bias, but at least they believed in the existence of facts and the distinction between true and false. Totalitarian regimes, however, lied on such a grand scale that they made Orwell feel that ‘the very concept of objective truth is fading out of the world’ (p.99).

Orwell saw totalitarianism in all its manifestations as dangerous not primarily because of secret police or constant surveillance but because “there is no solid ground from which to mount a rebellion – no corner of the mind that has not been infected and warped by the state.  It is power that removes the possibility of challenging power” (p.99).

Orwell narrowly escaped death when he was hit by a bullet in the spring of 1937.  He was hospitalized in Barcelona for three weeks, after which he and his wife Eileen escaped across the border to France.  Driven to Spain by his hatred of fascism, Orwell left with a “second enemy. The fascists had behaved just as appallingly as he had expected they would, but the ruthlessness and dishonesty of the communists had shocked him” (p.18).  From that point onward, Orwell criticized communism more energetically than fascism because he had seen communism “up close, and because its appeal was more treacherous. Both ideologies reached the same totalitarian destination but communism began with nobler aims and therefore required more lies to sustain it” (p.22).   After his time in Spain, Orwell knew that he stood against totalitarianism of all stripes, and for democratic socialism as its counterpoint.

The term “dystopia” was not used frequently in Orwell’s time, and Orwell distinguished between “favorable” and “pessimistic” utopias.   Orwell developed what he termed a “pitying fondness” (p.38) for nineteenth-century visions of a better world, particularly the American Edward Bellamy’s 1888 novel Looking Backward.  This highly popular novel contained a “seductive political argument” (p.33) for the nationalization of all industry, and the use of an “industrial army” to organize production and distribution.  Bellamy had what Lynskey terms a “thoroughly pre-totalitarian mind,” with an “unwavering faith in human nature and common sense” that failed to see the “dystopian implications of unanimous obedience to a one-party state that will last forever” (p.38).

Bellamy was a direct inspiration for the works of H.G. Wells, one of the most prolific writers of his age. Wells exerted enormous influence on the young Eric Blair, looming over the boy’s childhood “like a planet – awe inspiring, oppressive, impossible to ignore – and Orwell never got over it” (p.60).  Often called the English Jules Verne, Wells foresaw space travel, tanks, electric trains, wind and water power, identity cards, poison gas, the Channel tunnel and atom bombs.  His fiction imagined time travel, Martian invasions, invisibility and genetic engineering.  The word Wellsian came to mean “belief in an orderly scientific utopia,” but his early works are “cautionary tales of progress thwarted, science abused and complacency punished” (p.63).

Wells was himself a direct influence upon Yevgeny Zamyatin’s We, which, in Lynskey’s interpretation, constitutes the most direct antecedent to Nineteen Eighty-Four.  Finished in 1920 at the height of the civil war that followed the 1917 Bolshevik Revolution (but not published in the Soviet Union until 1988), We is set in an undefined future, a time when people are referred to only by numbers. The protagonist, D-503, a spacecraft engineer, lives in the One State, where mass surveillance is omnipresent and all aspects of life are scientifically managed.  It is an open question whether We was intended to satirize the Bolshevik regime, in 1920 already a one-party state with an extensive secret police.

Zamyatin died in exile in Paris in 1937, at age 53.   Orwell did not read We until sometime after its author’s death.  Whether Orwell “took ideas straight from Zamyatin or was simply thinking along similar lines” is “difficult to say” (p.108), Lynskey writes.  Nonetheless, it is “impossible to read Zamyatin’s bizarre and visionary novel without being strongly reminded of stories that were written afterwards, Orwell’s included” (p.102).

Koestler’s Darkness at Noon offered a solution to the central riddle of the Moscow show trials of the 1930s: “why did so many Communist party members sign confessions of crimes against the state, and thus their death warrants?” Koestler argued that their “years of unbending loyalty had dissolved their belief in objective truth: if the Party required them to be guilty, then guilty they must be” (p.127).  To Orwell this meant that one is punished in totalitarian states not for “what one does but for what one is, or more exactly, for what one is suspected of being” (p.128).

The ideas contained in James Burnham’s 1941 book, The Managerial Revolution, “seized Orwell’s imagination even as his intellect rejected them” (p.122).  A Trotskyite in his youth who in the 1950s helped William F. Buckley found the conservative magazine National Review, Burnham saw the future belonging to a huge, centralized bureaucratic state run by a class of managers and technocrats.  Orwell made a “crucial connection between Burnham’s super-state hypothesis and his own long-standing obsession with organized lying” (p.121-22).

Orwell’s chronic lung problems precluded him from serving in the military during World War II.  From August 1941 to November 1943, he worked for the Indian Section of the BBC’s Eastern Service, where he found himself “reluctantly writing for the state . . . Day to day, the job introduced him to the mechanics of propaganda, bureaucracy, censorship and mass media, informing Winston Smith’s job at the Ministry of Truth” (p.83; Orwell’s boss at the BBC was notorious Cambridge spy Guy Burgess, whose biography I reviewed here in December 2017).   Orwell left the BBC in 1943 to become literary editor of the Tribune, an anti-Stalinist weekly.

While at the Tribune, Orwell found time to produce Animal Farm, a “scrupulous allegory of Russian history from the revolution to the Tehran conference” (p.138), with each animal representing an individual – Stalin, Trotsky, Hitler, and so on.  Animal Farm shared with Nineteen Eighty-Four an “obsession with the erosion and corruption of memory” (p.139).  Memories in the two works are gradually erased, first, by the falsification of evidence; second, by the infallibility of the leader; third, by language; and fourth, by time.  Published in August 1945, Animal Farm quickly became a best seller.  The fable’s unmistakable anti-Soviet message forced Orwell to remind readers that he remained a socialist.  “I belong to the Left and must work inside it,” he wrote, “much as I hate Russian totalitarianism and its poisonous influence in this country” (p.141).

Earlier in 1945, Orwell’s wife Eileen died suddenly after being hospitalized for a hysterectomy, less than a year after the couple had adopted a son, whom they named Richard Horatio Blair.  Orwell grieved the loss of his wife by burying himself in the work that culminated in Nineteen Eighty-Four.   But Orwell became ever sicker with tuberculosis as he worked over the next four years on the novel, which was titled The Last Man in Europe until almost immediately prior to publication (Lynskey gives no credence to the theory that Orwell selected 1984 as an inversion of the last two digits of 1948).

Yet Lynskey rejects the notion that Nineteen Eighty-Four was the “anguished last testament of a dying man” (p.160).  Orwell “never really believed he was dying, or at least no more than usual. He had suffered from lung problems since childhood and had been ill, off and on, for so long that he had no reason to think that this time would be the last” (p.160).  His novel was published in June 1949.  Orwell died 227 days later, in January 1950, when a blood vessel in his lung ruptured.

* * *

                                    Nineteen Eighty-Four had an immediate positive reception. The book was variously compared to an earthquake, a bundle of dynamite, and the label on a bottle of poison.  It was made into a movie, a play, and a BBC television series.  Yet, Lynskey writes, “people seemed determined to misunderstand it” (p.170).  During the Cold War of the early 1950s, conservatives and hard-line leftists both saw the book as a condemnation of socialism in all its forms.  The more astute critics, Lynskey argues, were those who “understood Orwell’s message that the germs of totalitarianism existed in Us as well as Them” (p.182).  The Soviet invasion of Hungary in 1956 constituted a turning point in interpretations of Nineteen Eighty-Four.  After the invasion, many of Orwell’s critics on the left “had to accept that they had been wrong about the nature of Soviet communism and that he [Orwell] had been infuriatingly right” (p.210).

The hoopla that accompanied the actual year 1984, Lynskey notes wryly, came about only because “one man decided, late in the day, to change the title of his novel” (p.234).   By that time, the book was being read less as an anti-communist tract and more as a reminder of the abuses exposed in the Watergate affair of the previous decade, the excesses of the FBI and CIA, and the potential for mischief that personal computers, then in their infancy, posed.  With the fall of the Berlin wall and the end of communism between 1989 and 1991, focus on the power of technology intensified.

But today the focus is on Orwell’s depiction of the demise of objective truth in Nineteen Eighty-Four, and appropriately so, Lynskey argues, noting how President Trump masterfully “creates his own reality and measures his power by the number of people who subscribe to it: the cruder the lie, the more power its success demonstrates” (p.264).  It is truly Orwellian, Lynskey contends, that the phrase “fake news” has been “turned on its head by Trump and his fellow authoritarians to describe real news that is not to their liking, while flagrant lies become ‘alternative facts’” (p.264).

* * *

                                 While resisting the temptation to term Nineteen Eighty-Four more relevant now than ever, Lynskey asserts that the novel today is nonetheless “a damn sight more relevant than it should be” (p.xix).   An era “plagued by far-right populism, authoritarian nationalism, rampant disinformation and waning faith in liberal democracy,” he concludes, is “not one in which the message of Nineteen Eighty-Four can be easily dismissed” (p.265).

Thomas H. Peebles

La Châtaigneraie, France

February 25, 2020


Filed under Biography, British History, European History, Language, Literature, Political Theory, Politics, Soviet Union