
Reading Darwin in Abolitionist New England

 

Randall Fuller, The Book That Changed America:

How Darwin’s Theory of Evolution Ignited a Nation (Viking)

In mid-December 1859, the first copy of Charles Darwin’s On the Origin of Species arrived in the United States from England at a wharf in Boston harbor.  Darwin’s book explained how plants and animals had developed and evolved over vast stretches of time through a process Darwin termed “natural selection,” a process that distinguished On the Origin of Species from the work of other naturalists of Darwin’s generation.  Although Darwin said little in the book about how humans fit into the natural selection process, the work promised to ignite a battle between science and religion.

In The Book That Changed America: How Darwin’s Theory of Evolution Ignited a Nation, Randall Fuller, professor of American literature at the University of Kansas, contends that what made Darwin’s insight so radical was its “reliance upon a natural mechanism to explain the development of species.  An intelligent Creator was not required for natural selection to operate.  Darwin’s vision was of a dynamic, self-generating process of material change.  That process was entirely arbitrary, governed by physical law and chance – and not leading ineluctably . . . toward progress and perfection” (p.24).  Darwin’s work challenged the notion that human beings were a “separate and extraordinary species, differing from every other animal on the planet. Taken to its logical conclusion, it demolished the idea that people had been created in God’s image” (p.24).

On the Origin of Species arrived in the United States at a particularly fraught moment.  In October 1859, abolitionist John Brown had conducted a raid on a federal arsenal in Harper’s Ferry (then part of Virginia, today West Virginia), with the intention of precipitating a rebellion that would eradicate slavery from American soil.  The raid failed spectacularly: Brown was captured, tried for treason and hanged on December 2, 1859.  The raid and its aftermath exacerbated tensions between North and South, further polarizing a country already bitterly divided over the issue of chattel slavery in its southern states.  Notwithstanding the little Darwin had written about how humans fit into the natural selection process, abolitionists seized on hints in the book that all humans were biologically related to buttress their arguments against slavery.  To the abolitionists, Darwin “seemed to refute once and for all the idea that African American slaves were a separate, inferior species” (p.x).

Asa Gray, a respected botanist at Harvard University and a friend of Darwin, received the first copy of On the Origin of Species in the United States.  He passed the copy, which he annotated heavily, to his cousin by marriage, Charles Loring Brace (who was also a distant cousin of Harriet Beecher Stowe, author of the anti-slavery runaway best-seller Uncle Tom’s Cabin).  Brace in turn introduced the book to three men: Franklin Benjamin Sanborn, a part-time school master and full-time abolitionist activist; Amos Bronson Alcott, an educator and loquacious philosopher, today best remembered as the father of author Louisa May Alcott; and Henry David Thoreau, one of America’s best known philosophers and truth-seekers.  Sanborn, Alcott and Thoreau were residents of Concord, Massachusetts, roughly twenty miles northwest of Boston – the site of a famous Revolutionary War battle but in the mid-19th century both a leading literary center and a hotbed of abolitionist sentiment.

As luck would have it, Brace, Alcott and Thoreau gathered at Sanborn’s Concord home on New Year’s Day 1860; of Fuller’s five principals, only Gray was absent.  The four men almost certainly shared their initial reactions to Darwin’s work.  This get-together constitutes the starting point for Fuller’s engrossing study, centered on how Gray and the four men in Sanborn’s parlor that New Year’s Day absorbed Darwin’s book.  Darwin himself is at best a background figure in the study.  Several familiar figures make occasional appearances, among them:  Frederick Douglass, renowned orator and “easily the most famous black man in America” (p.91); Bronson Alcott’s author-daughter Louisa May; and American philosopher Ralph Waldo Emerson, Thoreau’s mentor and friend.  Emerson, like Louisa May and her father, was a Concord resident, and Fuller’s study takes place mostly there, with occasional forays to nearby Boston and Cambridge.

Fuller’s study is therefore more tightly circumscribed geographically than its title suggests.  He spends little time detailing the reaction to Darwin’s work in other parts of the United States, most conspicuously the American South, where any work that might seem to support abolitionism and undermine slavery was anathema.  The study is also circumscribed in time: it takes place mostly in 1860, with most of the remainder confined to the first half of the 1860s, up to the end of the American Civil War in 1865.  Fuller barely mentions what is sometimes called “Social Darwinism,” a notion that gained traction in the decades after the Civil War and purported to apply Darwin’s theory of natural selection to competition among individuals in politics and economics, producing an argument for unregulated capitalism.

Rather, Fuller charts the paths each of his five main characters traversed in absorbing Darwin’s work and assimilating its scientific, religious and political ramifications into their own worldviews, particularly during the tumultuous year 1860.  All five were fervent abolitionists.  Sanborn was a co-conspirator in John Brown’s raid.  Thoreau gave a series of eloquent, impassioned speeches in support of Brown.  All were convinced that Darwin’s notion of natural selection provided still another argument against slavery, based on science rather than morality or economics.  But in varying degrees, all five could also be considered adherents of transcendentalism, a mid-19th century philosophical approach that posited a form of human knowledge that goes beyond, or transcends, what can be seen, heard, tasted, touched or felt.

Although transcendentalists were almost by definition highly individualistic, most believed that a special force or intelligence stood behind nature and that prudential design ruled the universe.  Many subscribed to the notion that humans were the products of some sort of “special creation.”  Most saw God everywhere, and considered the human mind “resplendent with powers and insights wholly distinct from the external world” (p.54).  Transcendentalism was both an effort to invoke the divinity within man and, as Fuller puts it, a “cultural attack on a nation that had become too materialistic, too conformist, too smug about its place in history” (p.66).

Transcendentalism thus hovered in the background in 1860 as all but Sanborn wrestled with the implications of Darwinism (Sanborn spent much of the year fleeing federal authorities seeking his arrest for his role in John Brown’s raid).  Alcott never left transcendentalism, rejecting much of Darwinism.  Gray and Brace initially seemed to embrace Darwinian theories wholeheartedly, but in different ways each pulled back once he grasped the full implications of those theories.  Thoreau was the only one of the five who wholly accepted Darwinism’s most radical implications, using Darwin’s theories to “redirect his life’s work” (p.ix).

Fuller’s study thus combines a deep dive into the New England abolitionist milieu at a time when the United States was fracturing over the issue of slavery with a medium-level dive into the intricacies of Darwin’s theory of natural selection.  But the story Fuller tells is anything but dry and abstract.  With an elegant writing style and an acute sense of detail, Fuller places his five men and their thinking about Darwin in their habitat, the frenetic world of 1860s New England.  In vivid passages, readers can almost feel the chilly January wind whistling through Franklin Sanborn’s parlor that New Year’s Day 1860, or envision the mud accumulating on Henry David Thoreau’s boots as he trudges through the melting snow in the woods on a March afternoon contemplating Darwin.  The result is a lively, easy-to-read narrative that nimbly mixes intellectual and everyday, ground-level history.

* * *

Bronson Alcott, described by Fuller as America’s most radical transcendentalist, never accepted the premises of On the Origin of Species.  Darwin had, in Alcott’s view, “reduced human life to chemistry, to mechanical processes, to vulgar materialism” (p.10).  To Alcott, Darwin seemed “morbidly attached to an amoral struggle of existence, which robbed humans of free will and ignored the promptings of the soul” (p.150). Alcott could not imagine a universe “so perversely cruel as to produce life without meaning.  Nor could he bear to live in a world that was reduced to the most tangible and daily phenomena, to random change and process” (p.188).  Asa Gray, one of America’s most eminent scientists, came to the same realization, but only after thoroughly digesting Darwin and explaining his theories to a wide swath of the American public.

Gray’s initial reaction to Darwin’s work was one of unbounded enthusiasm.  Gray covered nearly every page of the book with his own annotations.  He admired the book because it “reinforced his conviction that inductive reasoning was the proper approach to science” (p.109).  He also admired the work’s “artfully modulated tone, [and] its modest voice, which softened the more audacious ideas rippling through the text” (p.17). Gray was most impressed with Darwin’s “careful judging and clear-eyed balancing of data” (p.110).  To grapple with Darwin’s ideas, Gray maintained, one had to “follow the evidence wherever it led, ignoring prior convictions and certainties or the narrative one wanted that evidence to confirm” (p.110).  Without saying so explicitly, Gray suggested that readers of Darwin’s book had to be “open to the possibility that everything they had taken for granted was in fact incorrect” (p.110).

Gray reviewed On the Origin of Species for the Atlantic Monthly in three parts, appearing in the summer and fall of 1860.  Gray’s articles served as the first encounter with Darwin for many American readers.  The articles elicited a steady stream of letters from respectful readers.  Some responded with “unalloyed enthusiasm” for a new idea which “seemed to unlock the mysteries of nature” (p.134).  Others, however, “reacted with anger toward a theory that proposed to unravel . . . their belief in a divine Being who had placed humans at the summit of creation” (p.134).  But as Gray finished the third Atlantic article, he began to realize that he himself was not entirely at ease with the diminution of humanity’s place in the universe that Darwin’s work implied.

The third Atlantic article, appearing in October 1860, revealed Gray’s increasing difficulty in “aligning Darwin’s theory with his own religious convictions” (p.213).  Gray proposed that natural selection might be “God’s chosen method of creation” (p.214).  This idea seemed to resolve the tension between scientific and religious accounts of origins, making Gray the first to develop a theological case for Darwinian theory.  But the idea that natural selection might be the process by which God had fashioned the world represented what Fuller describes as a “stunning shift for Gray. Before now, he had always insisted that secondary causes were the only items science was qualified to address.  First, or final causes – the beginning of life, the creation of the universe – were the purview of religion: a matter of faith and metaphysics” (p.214).  Darwin responded to Gray’s conjectures by indicating that, as Fuller summarizes the written exchange, the natural world was “simply too murderous and too cruel to have been created by a just and merciful God” (p.211).

In the Atlantic articles, Fuller argues, Gray leapt “beyond his own rules of science, speculating about something that was untestable” (p.214-15).  Gray must have known that his argument “failed to adhere to his own definition of science” (p.216).  But, much like Bronson Alcott, Gray found it “impossible to live in the world Darwin had imagined: a world of chance, a world that did not require a God to operate” (p.216).  Charles Brace, a noted social reformer who founded several institutions for orphans and destitute children, greeted Darwin’s book with an initial enthusiasm that rivaled Gray’s.

Brace claimed to have read On the Origin of Species 13 times.  He was most attracted to the book for its implications for human societies, especially for American society, where nearly half the country accepted and defended human slavery.  Darwin’s book “confirmed Brace’s belief that environment played a crucial role in the moral life of humans” (p.11), and demonstrated that every person in the world – black, white, or yellow – was related to everyone else.  The theory of natural selection was thus for Brace the “latest argument against chattel slavery, a scientific claim that could be used in the most important controversy of his time, a clarion call for abolition” (p.39).

Brace produced a tract entitled The Races of the Old World, modeled after Darwin’s On the Origin of Species, which Fuller describes as a “sprawling, ramshackle work” (p.199).  Its central thesis was simple enough: “There is nothing . . . to prove the negro radically different from the other families of man or even mentally inferior to them” (p.199-200).  But much of The Races of the Old World seemed to undercut Brace’s central thesis.  Although the book never defined the term “race,” Brace “apparently believed that though all humans sprang from the same source, some races had degraded over time . . . Human races were not permanent” (p.199-200).  Brace thus struggled to make Darwin’s theory fit his own ideas about race and slavery. “He increasingly bent facts to fit his own speculations” (p.197), as Fuller puts it.

The Races of the Old World revealed Brace’s hesitation in imagining a multi-racial America. He couched in Darwinian terms the difficulty of the races cohabiting, reverting to what Fuller describes as nonsense about blacks not being conditioned to survive in the colder Northern climate.  Brace “firmly believed in the emancipation of slaves, and he was equally convinced that blacks and whites did not differ in their mental capacities” (p.202).  But he nonetheless worried that “race mixing,” or what was then termed race “amalgamation,” might imperil Anglo-Saxon America, the “apex of development. . . God’s favored nation, a place where democracy and Christianity had fused to create the world’s best hope” (p.202).  Brace joined many other leading abolitionists in opposing race “amalgamation.”  His conclusion that “black and brown-skinned people inhabited a lower rung on the ladder of civilization” was shared, Fuller indicates, by “even the most enlightened New England abolitionists” (p.57).

No such misgivings visited Thoreau, who grappled with On the Origin of Species “as thoroughly and as insightfully as any American of the period” (p.11).  As Thoreau first read his copy of the book in late January 1860, a “new universe took form on the rectangular page before him” (p.75).  Prior to his encounter with Darwin, Thoreau’s thought had often “bordered on the nostalgic.  He longed for the transcendentalist’s confidence in a natural world infused with spirit” (p.157).  But Darwin led Thoreau beyond nostalgia.

Thoreau was struck in particular by Darwin’s portrayal of the struggle among species as an engine of creation.  The Origin of Species revealed nature as process, in constant transformation.  Darwin’s book directed Thoreau’s attention “away from fixed concepts and hierarchies toward movement instead” (p.144-45).  The idea of struggle among species “undermined transcendentalist assumptions about the essential goodness of nature, but it also corroborated many of Thoreau’s own observations” (p.137).  Thoreau had “long suspected that people were an intrinsic part of nature – neither separate nor entirely alienated from it” (p.155).  Darwin now enabled Thoreau to see how “people and the environment worked together to fashion the world,” providing a “scientific foundation for Thoreau’s belief that humans and nature were part of the same continuum” (p.155).

Darwin’s natural selection, Thoreau wrote, “implies a greater vital force in nature, because it is more flexible and accommodating, and equivalent to a sort of constant new creation” (p.246).  The phrase “constant new creation” in Fuller’s view represents an “epoch in American thought” because it “no longer relies upon divinity to explain the natural world” (p.246).  Darwin thus propelled Thoreau to a radical vision in which there was “no force or intelligence behind Nature, directing its course in a determined and purposeful manner.  Nature just was” (p.246-47).

How far Thoreau would have taken these ideas is impossible to know. He fell ill in December 1860, stricken with influenza exacerbated by tuberculosis, and died in June 1862, as Americans fought one another on the battlefield over the issue of slavery.

* * *

Fuller compares Darwin’s On the Origin of Species to a Trojan horse.  It entered American culture “using the newly prestigious language of science, only to attack, once inside, the nation’s cherished beliefs. . . With special and desolating force, it combated the idea that God had placed humans at the peak of creation” (p.213).  That the book’s attack did not spare even New England’s best-known abolitionists and transcendentalists demonstrates just how unsettling it was.

Thomas H. Peebles

La Châtaigneraie, France

May 18, 2020

 



The Power of Human Rights

 

Samantha Power, The Education of an Idealist:

A Memoir 

By almost any measure, Samantha Power should be considered an extraordinary American success story. An immigrant from Ireland who fled the Emerald Isle with her mother and brother at a young age to escape a turbulent family situation, Power earned degrees from Yale University and Harvard Law School, rose to prominence in her mid-20s as a journalist covering civil wars and ethnic cleansing in Bosnia and the Balkans, won a Pulitzer Prize for a book on 20th century genocides, and helped found the Carr Center for Human Rights Policy at Harvard’s Kennedy School of Government, where she served as its executive director — all before age 35.  Then she met an ambitious junior Senator from Illinois, Barack Obama, and her career really took off.

Between 2009 and 2017, Power served in the Obama administration almost continually, first on the National Security Council and subsequently as Ambassador to the United Nations.  In both capacities, she became the administration’s most outspoken and influential voice for prioritizing human rights, arguing regularly for targeted United States and multi-lateral interventions to protect individuals from human rights abuses and mass atrocities, perpetrated in most cases by their own governments.  In what amounts to an autobiography, The Education of an Idealist: A Memoir, Power guides her readers through  the major foreign policy crises of the Obama administration.

Her life story, Power tells her readers at the outset, is one of idealism, “where it comes from, how it gets challenged, and why it must endure” (p.xii).  She is quick to emphasize that hers is not a story of how a person with “lofty dreams” about making a difference in the world came to be “educated” by the “brutish forces” (p.xii) she encountered throughout her professional career.  So what then is the nature of the idealist’s “education” that provides the title to her memoir?  The short answer probably lies in how Power learned to make her idealistic message on human rights both heard and effective within the complex bureaucratic structures of the United States government and the United Nations.

But Power almost invariably couples this idealistic message with the view that the promotion and protection of human rights across the globe is in the United States’ own national security interests; and that the United States can often advance those interests most effectively by working multi-laterally, through international organizations and with like-minded states.  The United States, by virtue of its multi-faceted strengths – economic, military and cultural – is in a unique position to influence the actions of other states, from its traditional allies all the way to those that inflict atrocities upon their citizens.

Power acknowledges that the United States has not always used its strength as a positive force for human rights and human betterment – one immediate example is the 2003 Iraq invasion, which she opposed. Nevertheless, the United States retains a reservoir of credibility sufficient to be effective on human rights matters when it chooses to do so.  Although Power is sometimes labeled a foreign policy “hawk,” she recoils from that label.  To Power, the military is among the last of the tools that should be considered to advance America’s interests around the world.

Into this policy-rich discussion, Power weaves much detail about her personal life, beginning with her early years in Ireland, the incompatibilities between her parents that prompted her mother to take her and her brother to the United States when Samantha was nine, and her efforts as a schoolgirl to become American in the full sense of the term. After numerous failed romances, she finally met Mr. Right: her husband, Harvard Law School professor Cass Sunstein (who also served briefly in the Obama administration). The marriage produced a boy and a girl with lovely Irish names, Declan and Rían, both born while Power was in government.  With much emphasis upon her parents, husband, children and family life, the memoir is also a case study of how professional women balance the exacting demands of high-level jobs with the formidable responsibilities attached to being a parent and spouse.  It’s a tough balancing act for any parent, but especially for women, and Power admits that she did not always strike the right balance.

Memoirs by political and public figures are frequently attempts to write one’s biography before someone else does, and Power’s whopping 550-page work seems to fit this rule.  But Power displays a candor – a willingness to admit mistakes and share vulnerabilities – that is often missing in political memoirs. Refreshingly, she also abstains from serious score-settling.  Most striking for me is the nostalgia that pervades the memoir.  Power takes her readers down memory lane, depicting a now bygone time when the United States cared about human rights and believed in bi- and multi-lateral cooperation to accomplish its goals in its dealings with the rest of the world – a time that sure seems long ago.

* * *

Samantha Jane Power was born in 1970 to Irish parents, Vera Delaney, a doctor, and Jim Power, a part-time dentist.  She spent her early years in Dublin, in a tense family environment where, she can see now, her parents’ marriage was coming unraveled.  Her father put in far more time at Hartigan’s, a local pub in the neighborhood where he was known for his musical skills and “holding court,” than he did at his dentist’s office.  Although young Samantha didn’t recognize it at the time, her father had a serious alcohol problem, serious enough to lead her mother to escape by immigrating to the United States with the couple’s two children, Samantha, then age nine, and her brother Stephen, two years younger. They settled in Pittsburgh, where Samantha at a young age set about to become American, as she dropped her Irish accent, tried to learn the intricacies of American sports, and became a fervent Pittsburgh Pirates fan.

But under the terms of their parents’ custody agreement, the two children were required to spend time with their father back in Ireland. On her trip back at Christmas 1979, Samantha’s father informed the nine-year-old that he intended to keep her and her brother with him.  When her mother, who was staying nearby, showed up to object and collect her children to return to the United States, a parental confrontation ensued that would traumatize Samantha for decades.  The nine-year-old found herself caught between the conflicting commands of her two parents and, in a split-second decision, left with her mother and returned to Pittsburgh. She never again saw her father.

When her father died unexpectedly five years later, at age 47 of alcohol-related complications, Samantha, then in high school, blamed herself for her father’s death and carried a sense of guilt with her well into her adult years. It was not until she was thirty-five, after many therapy sessions, that she came to accept that she had not been responsible for her father’s death.  Then, a few years later, she made the mistake of returning to Hartigan’s, where she encountered the bar lady who had worked there in her father’s time.   Mostly out of curiosity, Power asked her why, given that so many people drank so much at Hartigan’s, her father had been the only one who died. The bar lady’s answer was matter-of-fact: “Because you left” (p.192) — not what Power needed to hear.

Power had by then already acquired a public persona as a human rights advocate through her work as a journalist in the 1990s in Bosnia, where she called attention to the ethnic cleansing that was sweeping the country in the aftermath of the collapse of the former Yugoslavia.  Power ended up writing for a number of major publications, including The Economist, the New Republic and the Washington Post.  She was among the first to report on the fall of Srebrenica in July 1995, the largest single massacre in Europe since World War II, in which around 10,000 Muslim men and boys were taken prisoner and “seemed to have simply vanished” (p.102). Although the United States and its NATO allies had imposed a no-fly zone over Bosnia, Power hoped the Clinton administration would commit ground troops to prevent further atrocities. But she did not yet enjoy the clout to have a real chance at making her case directly with the administration.

Power wrote a chronology of the conflict, Breakdown in the Balkans, which was later put into book form and attracted attention from think tanks and the diplomatic, policy and media communities.  Attracting even more attention was A Problem from Hell: America and the Age of Genocide, her book exploring American reluctance to take action in the face of 20th century mass atrocities and genocides.  The book appeared in 2002 and won the 2003 Pulitzer Prize for General Non-Fiction.  It also provided Power with her inroad to Senator Barack Obama.

At the recommendation of a politically well-connected friend, in late 2004 Power sent a copy of the book to the recently elected Illinois Senator who had inspired the Democratic National Convention that summer with an electrifying keynote address.  Obama’s office scheduled a dinner for her with the Senator which was supposed to last 45 minutes.  The dinner went on for four hours as the two exchanged ideas about America’s place in the world and how, why and when it should advance human rights as a component of its foreign policy.  Although Obama considered Power to be primarily an academic, he offered her a position on his Senate staff, where she started working late in 2005.

Obama and Power would then be linked professionally more or less continually until the end of the Obama presidency in January 2017.   Once Obama enters the memoir, at about the one-third point, it becomes as much his story as hers. The two did not always see the world and specific world problems in the same way, but it’s clear that Obama had great appreciation both for Power’s intelligence and her intensity. He was a man who enjoyed being challenged intellectually, and plainly valued the human rights perspective that Power brought to their policy discussions even if he wasn’t prepared to push as far as Power advocated.

After Obama threw his hat in the ring for the 2008 Democratic Party nomination, Power became one of his primary foreign policy advisors and, more generally, a political operative. It was not a role that fit Power comfortably, and it threatened to be short-lived.  In the heat of the primary campaign, with Obama and Hillary Clinton facing off in a vigorously contested battle for their party’s nomination, Power was quoted in an obscure British publication, the Scotsman, as describing Clinton as a “monster.” The right-wing Drudge Report picked up the quotation, whose accuracy Power does not contest, and suddenly Power found herself on the front page of major newspapers, the subject of a story she did not want.  Obama’s closest advisors were of the view that she would have to resign from the campaign.  But the candidate himself, who loved sports metaphors, told Power only that she would have to spend some time in the “penalty box” (p.187).  Obama’s relatively soft reaction was an indication of the potential he saw in her and of her prospective value to him if he succeeded in the primaries and the general election.

Power’s time in the penalty box had expired when Obama, having defeated Clinton for his party’s nomination, won a resounding victory in the general election in November 2008.  Obama badly wanted Power on his team in some capacity, and the transition team placed her on the President’s National Security Council as principal deputy for international organizations, especially the United Nations.  But she was also able to carve out a concurrent position for herself as the President’s Senior Director for Human Rights.  In this portion of the memoir, Power describes learning the jargon and often-arcane skills needed to be effective on the council and within the vast foreign policy bureaucracy of the United States government.  As the official solely responsible for human rights, Power found that she had some leeway in deciding which issues to concentrate on and bring to the attention of the full Council.  Her mentor Richard Holbrooke counseled that she could be most effective on subjects in which the United States had limited interest – pick “small fights,” Holbrooke advised.

Power had a hand in a string of “small victories” while on the National Security Council: coaxing the United States to rejoin a number of UN agencies from which the Bush Administration had walked away; convincing President Obama to raise his voice over atrocities perpetrated by governments in Sri Lanka and Sudan against their own citizens; being appointed White House coordinator for Iraqi refugees; helping create an inter-agency board to coordinate the United States government’s response to war crimes and atrocities; and encouraging increased emphasis upon lesbian, gay, bisexual and transgender (LGBT) issues overseas.  In pursuit of the latter, Obama delivered an address at the UN General Assembly on LGBT rights, and thereafter issued a Presidential Memorandum directing all US agencies to consider LGBT issues explicitly in crafting overseas assistance (disclosure: while with the Department of Justice, I served on the department’s portion of the inter-agency Atrocity Prevention Board, and represented the department in inter-agency coordination on the President’s LGBT memorandum; I never met Power in either capacity).

But the Arab Spring that erupted in late 2010 and early 2011 presented  anything but small issues and resulted in few victories for the Obama administration.  A “cascade of revolts that would reorder huge swaths of the Arab world,” the Arab Spring ended up “impacting the course of Obama’s presidency more than any other geopolitical development during his eight years in office” (p.288), Power writes, and the same could be said for Power’s time in government.  Power was among those at the National Security Council who pushed successfully for United States military intervention in Libya to protect Libyan citizens from the predations of their leader, Muammar Qaddafi.

The intervention, backed by a United Nations Security Council resolution and led jointly by the United States, France and the United Kingdom, saved civilian lives and contributed to Qaddafi’s ouster and death.  But President Obama was determined to avoid a longer-term and more open-ended United States commitment, and the mission stopped short of the follow-up needed to bring stability to the country.  With civil war in various guises continuing to this day, Power suggests that the outcome might have been different had the United States continued its engagement in the aftermath of Qaddafi’s death.

Shortly after Power became US Ambassador to the United Nations, the volatile issue of an American military commitment arose again, this time in Syria in August 2013, when irrefutable proof came to light that Syrian leader Bashar al-Assad was using chemical weapons in his effort to suppress uprisings within the country.  The revelations came 13 months after Obama had asserted that use of such weapons would constitute a “red line” that would move him to intervene militarily in Syria.  Power favored targeted US air strikes within Syria.

Obama came excruciatingly close to approving such strikes.  He not only concluded that the “costs of not responding forcefully were greater than the risks of taking military action” (p.369), but was prepared to act without UN Security Council authorization, given the certainty of a Russian veto of any Security Council resolution for concerted action.  With elevated stakes for “upholding the international norm against the use of chemical weapons,” Power writes, Obama was “prepared to operate with what White House lawyers called a ‘traditionally recognized legal basis under international law’” (p.369).

But almost overnight, Obama decided that he needed prior Congressional authorization for a military strike in Syria, a decision taken seemingly with little effort to ascertain whether there was sufficient support in Congress for such a strike.  With neither the Congress nor the American public supporting military action within Syria to save civilian lives, Obama backed down.  On no other issue did Power see Obama as torn as he was on Syria,  “convinced that even limited military action would mire the United States in another open-ended conflict, yet wracked by the human toll of the slaughter.  I don’t believe he ever stopped interrogating his choices” (p.508).

Looking back at that decision with the passage of more than five years, Power’s disappointment remains palpable.  The consequences of inaction in Syria, she maintains, went:

beyond unfathomable levels of death, destruction, and displacement. The spillover of the conflict into neighboring countries through massive refugee flows and the spread of ISIS’s ideology has created dangers for people in many parts of the world. . . [T]hose of us involved in helping devise Syria policy will forever carry regret over our inability to do more to stem the crisis.  And we know the consequences of the policies we did choose. For generations to come, the Syrian people and the wide world will be living with the horrific aftermath of the most diabolical atrocities carried out since the Rwanda genocide (p.513-14).

But if incomplete action in Libya and inaction in Syria constitute major disappointments for Power, she considers exemplary the response of both the United States and the United Nations to the July 2014 outbreak of the Ebola virus that occurred in three West African countries, Guinea, Liberia and Sierra Leone.  United States experts initially foresaw more than one million infections of the deadly and contagious disease by the end of 2015.  The United States devised its own plan to send supplies, doctors and nurses to the region to facilitate the training of local health workers to care for Ebola patients, along with 3,000 military personnel to assist with on-the-ground logistics.  Power was able to talk President Obama out of a travel ban to the United States from the three impacted countries, a measure favored not only by Donald Trump, then contemplating an improbable run for the presidency, but also by many members of the President’s own party.

At the United Nations, Power was charged with marshaling global assistance.   She convinced 134 fellow Ambassadors to co-sponsor a Security Council resolution declaring the Ebola outbreak a public health threat to international peace and security, the largest number of co-sponsors for any Security Council resolution in UN history and the first ever directed to a public health crisis.  Thereafter, UN Member States committed $4 billion in supplies, facilities and medical treatments.  The surge of international resources that followed meant that the three West African countries “got what they needed to conquer Ebola” (p.455).  At different times in 2015, each of the countries was declared Ebola-free.

The most deadly and dangerous Ebola outbreak in history was contained, Power observes, above all because of the “heroic efforts of the people and governments of Guinea, Liberia and Sierra Leone” (p.456). But America’s involvement was also crucial.  President Obama provided what she describes as an “awesome demonstration of US leadership and capability – and a vivid example of how a country advances its values and interests at once” (p.438).  But the multi-national, collective success further illustrated “why the world needed the United Nations, because no one country – even one as powerful as the United States – could have slayed the epidemic on its own” (p.457).

Although Russia supported the UN Ebola intervention, Power more often found herself in an adversarial posture with Russia on both geo-political and UN administrative issues.  Yet she used creative diplomatic skills to develop a more nuanced relationship with her Russian counterpart, Vitaly Churkin.  Churkin, a talented negotiator and master of the art of strategically storming out of meetings, valued US-Russia cooperation and often “pushed for compromises that Moscow was disinclined to make” (p.405).  Over time, Power writes, she and Churkin “developed something resembling genuine friendship” (p.406). But “I also spent much of my time at the UN in pitched, public battle with him” (p.408).

The most heated of these battles ensued after Russia invaded Ukraine in February 2014, a flagrant violation of international law. Later that year, troops associated with Russia shot down a Malaysian passenger jet over eastern Ukraine, killing everyone aboard.  In the UN debates on Ukraine, Power found her Russian counterpart “defending the indefensible, repeating lines sent by Moscow that he was too intelligent to believe and speaking in binary terms that belied his nuanced grasp of what was actually happening” (p.426). Yet Power and Churkin continued to meet privately to seek solutions to the Ukraine crisis, none of which bore fruit.

While at the UN, Power went out of her way to visit the offices of the ambassadors of the smaller countries represented in the General Assembly, many of whom had never received a United States Ambassador.  During her UN tenure, she managed to meet personally with the ambassadors from every country except North Korea.  Power also started a group that gathered the UN’s 37 female Ambassadors one day a week for coffee and discussion of common issues.  Some discussions involved substantive matters before the UN, but just as often the group focused on workplace issues that affected the women ambassadors as women – issues their male colleagues did not have to deal with.

* * *

Donald Trump’s surprise victory in November 2016 left Power stunned.  His nativist campaign to “Make America Great Again” seemed to her like a “repudiation of many of the central tenets of my life” (p.534).  As an  immigrant, a category Trump seemed to relish denigrating, she “felt fortunate to have experienced many countries and cultures. I saw the fate of the American people as intertwined with that of individuals elsewhere on the planet.   And I knew that if the United States retreated from the world, global crises would fester, harming US interests” (p.534-35).  As Obama passed the baton to Trump in January 2017, Power left government.

Not long after, her husband suffered a near-fatal automobile accident, from which he recovered. Today the pair team-teach courses at Harvard, while Power seems to have found the time for her family that proved so elusive when she was in government.  She is coaching her son’s baseball team and helping her daughter survey rocks and leaves in their backyard.  No one would begrudge Power this time with her family. But her memoir will likely leave many readers wistful, daring to hope that there may someday be room again for her and her energetic idealism in the formulation of United States foreign policy.

Thomas H. Peebles

La Châtaigneraie, France

April 26, 2020



A Defense of Truth

 

Dorian Lynskey, The Ministry of Truth:

The Biography of George Orwell’s 1984 

George Orwell’s name, like those of William Shakespeare, Charles Dickens and Franz Kafka, has given rise to an adjective.  “Orwellian” connotes official deception, secret surveillance, misleading terminology, and the manipulation of history.  Several terms used in Orwell’s best-known novel, Nineteen Eighty-Four, have entered into common usage, including “doublethink,” “thoughtcrime,” “newspeak,” “memory hole,” and “Big Brother.”  First published in June 1949, a little over a half year prior to Orwell’s death in January 1950, Nineteen Eighty-Four is consistently described as a “dystopian” novel – a genre of fiction which, according to Merriam-Webster, pictures “an imagined world or society in which people lead wretched, dehumanized, fearful lives.”

This definition fits neatly the world that Orwell depicted in Nineteen Eighty-Four, a world divided among three intercontinental superstates perpetually at war – Oceania, Eurasia and Eastasia – with Britain reduced to a province of Oceania bearing the sardonic name “Airstrip One.”  Airstrip One is ruled by the Party under the ideology Ingsoc, a shortening of “English socialism.”  The Party’s leader, Big Brother, is the object of an intense cult of personality — even though there is no hard proof he actually exists.  Surveillance through two-way telescreens and propaganda are omnipresent.  The protagonist, Winston Smith, is a diligent lower-level Party member who works at the Ministry of Truth, where he rewrites historical records to conform to the state’s ever-changing version of history.  Smith enters into a forbidden relationship with his co-worker, Julia, a relationship that terminates in mutual betrayal.

In his intriguing study, The Ministry of Truth: The Biography of George Orwell’s 1984, British journalist and music critic Dorian Lynskey seeks to explain what Nineteen Eighty-Four “actually is, how it came to be written, and how it has shaped the world, in its author’s absence, over the past seventy years” (p.xiv). Although there are biographies of Orwell and academic studies of Nineteen Eighty-Four’s intellectual context, Lynskey contends that his is the first to “merge the two streams into one narrative, while also exploring the book’s afterlife” (p.xv; I reviewed Thomas Ricks’ book on Orwell and Winston Churchill here in November 2017).  Lynskey’s work is organized in a “Before/After” format.  Part I, about two-thirds of the book, looks at the works and thinkers who influenced Orwell and his novel, juxtaposed with basic biographical background on Orwell.  Part II, roughly the last third, examines the novel’s afterlife.

But Lynskey begins in a surprising place, Washington, D.C., in January 2017, where a spokesman for President Donald Trump told the White House press corps that the recently-elected president had taken his oath of office before the “largest audience to ever witness an inauguration – period – both in person and around the globe.”  A presidential adviser subsequently justified this “preposterous lie” by characterizing the statement as “alternative facts” (p.xiii).   Sales of Orwell’s book shot up immediately thereafter.  The incident constitutes a reminder, Lynskey contends, of the “painful lessons that the world appears to have unlearned since Orwell’s lifetime, especially those concerning the fragility of truth in the face of power” (p.xix).

How Orwell came to see the consequences of mutilating truth and gave them expression in Nineteen Eighty-Four is the focus of Part I.  Orwell’s brief participation in the Spanish Civil War, from December 1936 through mid-1937, was paramount among his personal experiences in shaping the novel’s worldview. Spain was the “great rupture in his life; his zero hour” (p.4), the experience that led Orwell to the conclusion that Soviet communism was as antithetical as fascism and Nazism to the values he held dear (Lynskey’s list of Orwell’s values: “honesty, decency, fairness, memory, history, clarity, privacy, common sense, sanity, England, and love” (p.xv)).  While no single work provided an intellectual foundation for Nineteen Eighty-Four in the way that the Spanish Civil War provided the personal and practical foundation, Lynskey discusses numerous writers whose works contributed to the worldview on display in Orwell’s novel.

Lynskey dives deeply into the novels and writings of Edward Bellamy, H.G. Wells and the Russian writer Yevgeny Zamyatin.  Orwell’s friend Arthur Koestler set out what Lynskey terms the “mental landscape” for Nineteen Eighty-Four in his 1940 classic Darkness at Noon, while the American conservative James Burnham provided the novel’s “geo-political superstructure” (p.126).  Lynskey discusses a host of other writers whose works in one way or another contributed to Nineteen Eighty-Four’s world view, among them Jack London, Aldous Huxley, Friedrich Hayek, and the late 17th and early 18th century satirist Jonathan Swift.

In Part II, Lynskey treats some of the dystopian novels and novelists that have appeared since Nineteen Eighty-Four.  He provides surprising detail on David Bowie, who alluded to Orwell in his songs and wrote material that reflected the outlook of Nineteen Eighty-Four.  He notes that Margaret Atwood termed her celebrated The Handmaid’s Tale a “speculative fiction of the George Orwell variety” (p.241).  But the crux of Part II lies in Lynskey’s discussion of the evolving interpretations of the novel since its publication, and why it still matters today.  He argues that Nineteen Eighty-Four has become both a “vessel into which anyone could pour their own version of the future” (p.228) and an “all-purpose shorthand” for an “uncertain present” (p.213).

In the immediate aftermath of its publication, when the Cold War was at its height, the novel was seen by many as a lesson on totalitarianism and the dangers that the Soviet Union and Communist China posed to the West (Eurasia, Eastasia and Oceania in the novel correspond roughly to the Soviet Union, China and the West, respectively).  When the Cold War ended with the fall of the Soviet Union in 1991, the novel morphed into a warning about the invasive technologies spawned by the Internet and their potential for surveillance of individual lives.  In the Age of Trump and Brexit, the novel has become “most of all a defense of truth . . . Orwell’s fear that ‘the very concept of objective truth is fading out of the world’ is the dark heart of Nineteen Eighty-Four. It gripped him long before he came up with Big Brother, Oceania, Newspeak or the telescreen, and it’s more important than any of them” (p.265-66).

* * *

Orwell was born Eric Blair in 1903 in India, where his father was a mid-level civil servant. His mother was half-French and a committed suffragette.  In 1933, prior to publication of his first major book, Down and Out in Paris and London, which recounts his life in voluntary poverty in the two cities, the fledgling author took the pen name Orwell from a river in Suffolk.  He changed his name purportedly to spare his parents the embarrassment he assumed his forthcoming work would cause.  He was at best a mid-level journalist and writer when he went to Spain in late 1936, with a handful of novels and lengthy essays to his credit – “barely George Orwell” (p.4), as Lynskey puts it.

The Spanish Civil War erupted after Spain’s Republican government, known as the Popular Front, a coalition of liberal democrats, socialists and communists, narrowly won a parliamentary majority in 1936, only to face a rebellion from the Nationalist forces of General Francisco Franco, representing Spain’s military, business elites, large landowners and the Catholic Church.  Nazi Germany and Fascist Italy furnished arms and other assistance for the Nationalists’ assault on Spain’s democratic institutions, while the Soviet Union assisted the Republicans (the leading democracies of the period, Great Britain, France and the United States, remained officially neutral; I reviewed Adam Hochschild’s work on the Spanish Civil War here in August 2017).  Spain provided Orwell with his first and only personal exposure to the “nightmare atmosphere” (p.17) that would envelop the novel he wrote a decade later.

Fighting with the Workers’ Party of Marxist Unification (Spanish acronym: POUM), a renegade working class party that opposed Stalin, Orwell quickly found himself in the middle of what amounted to a mini-civil war among the disparate left-wing factions on the Republican side, all within the larger civil war with the Nationalists.  Orwell saw first-hand the dogmatism and authoritarianism of the Stalinist left at work in Spain, nurtured by a level of deliberate deceit that appalled him.  He read newspaper accounts that did not even purport to bear any relationship to what had actually happened. For Orwell previously, Lynskey writes:

people were guilty of deliberate deceit or unconscious bias, but at least they believed in the existence of facts and the distinction between true and false. Totalitarian regimes, however, lied on such a grand scale that they made Orwell feel that ‘the very concept of objective truth is fading out of the world’ (p.99).

Orwell saw totalitarianism in all its manifestations as dangerous not primarily because of secret police or constant surveillance but because “there is no solid ground from which to mount a rebellion – no corner of the mind that has not been infected and warped by the state.  It is power that removes the possibility of challenging power” (p.99).

Orwell narrowly escaped death when he was hit by a bullet in the spring of 1937.  He was hospitalized in Barcelona for three weeks, after which he and his wife Eileen escaped across the border to France.  Driven to Spain by his hatred of fascism, Orwell left with a “second enemy. The fascists had behaved just as appallingly as he had expected they would, but the ruthlessness and dishonesty of the communists had shocked him” (p.18).  From that point onward, Orwell criticized communism more energetically than fascism because he had seen communism “up close, and because its appeal was more treacherous. Both ideologies reached the same totalitarian destination but communism began with nobler aims and therefore required more lies to sustain it” (p.22).   After his time in Spain, Orwell knew that he stood against totalitarianism of all stripes, and for democratic socialism as its counterpoint.

The term “dystopia” was not used frequently in Orwell’s time, and Orwell distinguished between “favorable” and “pessimistic” utopias.   Orwell developed what he termed a “pitying fondness” (p.38) for nineteenth-century visions of a better world, particularly the American Edward Bellamy’s 1888 novel Looking Backward.  This highly popular novel contained a “seductive political argument” (p.33) for the nationalization of all industry, and the use of an “industrial army” to organize production and distribution.  Bellamy had what Lynskey terms a “thoroughly pre-totalitarian mind,” with an “unwavering faith in human nature and common sense” that failed to see the “dystopian implications of unanimous obedience to a one-party state that will last forever” (p.38).

Bellamy was a direct inspiration for the works of H.G. Wells, one of the most prolific writers of his age. Wells exerted enormous influence on the young Eric Blair, looming over the boy’s childhood “like a planet – awe inspiring, oppressive, impossible to ignore – and Orwell never got over it” (p.60).  Often called the English Jules Verne, Wells foresaw space travel, tanks, electric trains, wind and water power, identity cards, poison gas, the Channel tunnel and atom bombs.  His fiction imagined time travel, Martian invasions, invisibility and genetic engineering.  The word Wellsian came to mean “belief in an orderly scientific utopia,” but his early works are “cautionary tales of progress thwarted, science abused and complacency punished” (p.63).

Wells was himself a direct influence upon Yevgeny Zamyatin’s We which, in Lynskey’s interpretation, constitutes the most direct antecedent to Nineteen Eighty-Four.  Finished in 1920 at the height of the civil war that followed the 1917 Bolshevik Revolution (but not published in the Soviet Union until 1988), We is set in the undefined future, a time when people are referred to only by numbers. The protagonist, D-503, a spacecraft engineer, lives in the One State, where mass surveillance is omnipresent and all aspects of life are scientifically managed.  It is an open question whether We was intended to satirize the Bolshevik regime, in 1920 already a one-party state with extensive secret police.

Zamyatin died in exile in Paris in 1937, at age 53.   Orwell did not read We until sometime after its author’s death.  Whether Orwell “took ideas straight from Zamyatin or was simply thinking along similar lines” is “difficult to say” (p.108), Lynskey writes.  Nonetheless, it is “impossible to read Zamyatin’s bizarre and visionary novel without being strongly reminded of stories that were written afterwards, Orwell’s included” (p.102).

Koestler’s Darkness at Noon offered a solution to the central riddle of the Moscow show trials of the 1930s: “why did so many Communist party members sign confessions of crimes against the state, and thus their death warrants?” Koestler argued that their “years of unbending loyalty had dissolved their belief in objective truth: if the Party required them to be guilty, then guilty they must be” (p.127).  To Orwell this meant that one is punished in totalitarian states not for “what one does but for what one is, or more exactly, for what one is suspected of being” (p.128).

The ideas contained in James Burnham’s 1941 book, The Managerial Revolution, “seized Orwell’s imagination even as his intellect rejected them” (p.122).  A Trotskyite in his youth who in the 1950s helped William F. Buckley found the conservative weekly National Review, Burnham saw the future belonging to a huge, centralized bureaucratic state run by a class of managers and technocrats.  Orwell made a “crucial connection between Burnham’s super-state hypothesis and his own long-standing obsession with organized lying” (p.121-22).

Orwell’s chronic lung problems precluded him from serving in the military during World War II.  From August 1941 to November 1943, he worked for the Indian Section of the BBC’s Eastern Service, where he found himself “reluctantly writing for the state . . . Day to day, the job introduced him to the mechanics of propaganda, bureaucracy, censorship and mass media, informing Winston Smith’s job at the Ministry of Truth” (p.83; Orwell’s boss at the BBC was notorious Cambridge spy Guy Burgess, whose biography I reviewed here in December 2017).   Orwell left the BBC in 1943 to become literary editor of the Tribune, an anti-Stalinist weekly.

While at the Tribune, Orwell found time to produce Animal Farm, a “scrupulous allegory of Russian history from the revolution to the Tehran conference” (p.138), with each animal representing an individual – Stalin, Trotsky, Hitler, and so on.  Animal Farm shared with Nineteen Eighty-Four an “obsession with the erosion and corruption of memory” (p.139).  Memories in the two works are gradually erased: first, by the falsification of evidence; second, by the infallibility of the leader; third, by language; and fourth, by time.  Published in August 1945, Animal Farm quickly became a best seller.  The fable’s unmistakable anti-Soviet message forced Orwell to remind readers that he remained a socialist.  “I belong to the Left and must work inside it,” he wrote, “much as I hate Russian totalitarianism and its poisonous influence in this country” (p.141).

Earlier in 1945, Orwell's wife Eileen died suddenly after being hospitalized for a hysterectomy, less than a year after the couple had adopted a son, whom they named Richard Horatio Blair.  Orwell grieved the loss of his wife by burying himself in the work that culminated in Nineteen Eighty-Four.  But Orwell became ever sicker with tuberculosis as he worked over the next four years on the novel, which was titled The Last Man in Europe until almost immediately prior to publication (Lynskey gives no credence to the theory that Orwell selected 1984 as an inversion of the last two digits of 1948).

Yet Lynskey rejects the notion that Nineteen Eighty-Four was the "anguished last testament of a dying man" (p.160).  Orwell "never really believed he was dying, or at least no more than usual.  He had suffered from lung problems since childhood and had been ill, off and on, for so long that he had no reason to think that this time would be the last" (p.160).  His novel was published in June 1949.  Orwell died 227 days later, in January 1950, when a blood vessel in his lung ruptured.

* * *

                                    Nineteen Eighty-Four had an immediate positive reception.  The book was variously compared to an earthquake, a bundle of dynamite, and the label on a bottle of poison.  It was made into a movie, a play, and a BBC television series.  Yet, Lynskey writes, "people seemed determined to misunderstand it" (p.170).  During the Cold War of the early 1950s, conservatives and hard-line leftists alike saw the book as a condemnation of socialism in all its forms.  The more astute critics, Lynskey argues, were those who "understood Orwell's message that the germs of totalitarianism existed in Us as well as Them" (p.182).  The Soviet invasion of Hungary in 1956 constituted a turning point in interpretations of Nineteen Eighty-Four.  After the invasion, many of Orwell's critics on the left "had to accept that they had been wrong about the nature of Soviet communism and that he [Orwell] had been infuriatingly right" (p.210).

The hoopla that accompanied the actual year 1984, Lynskey notes wryly, came about only because "one man decided, late in the day, to change the title of his novel" (p.234).  By that time, the book was being read less as an anti-communist tract and more as a reminder of the abuses exposed in the Watergate affair of the previous decade, the excesses of the FBI and CIA, and the potential for mischief posed by personal computers, then in their infancy.  With the fall of the Berlin Wall and the end of communism between 1989 and 1991, focus on the power of technology intensified.

But today the focus is on Orwell’s depiction of the demise of objective truth in Nineteen Eighty-Four, and appropriately so, Lynskey argues, noting how President Trump masterfully “creates his own reality and measures his power by the number of people who subscribe to it: the cruder the lie, the more power its success demonstrates” (p.264).  It is truly Orwellian, Lynskey contends, that the phrase “fake news” has been “turned on its head by Trump and his fellow authoritarians to describe real news that is not to their liking, while flagrant lies become ‘alternative facts’” (p.264).

* * *

                                 While resisting the temptation to term Nineteen Eighty-Four more relevant now than ever, Lynskey asserts that the novel today is nonetheless "a damn sight more relevant than it should be" (p.xix).  An era "plagued by far-right populism, authoritarian nationalism, rampant disinformation and waning faith in liberal democracy," he concludes, is "not one in which the message of Nineteen Eighty-Four can be easily dismissed" (p.265).

Thomas H. Peebles

La Châtaigneraie, France

February 25, 2020


Filed under Biography, British History, European History, Language, Literature, Political Theory, Politics, Soviet Union

Stirring Rise and Crushing Fall of a Renaissance Man


Jeff Sparrow, No Way But This:

In Search of Paul Robeson (Scribe)

            If you are among those who think the term "Renaissance Man" seems fuzzy and even frivolous when applied to anyone born after roughly 1600, consider the case of Paul Robeson (1898-1976), a man whose talents and genius extended across an impossibly wide range of activities.  In the 1920s and 1930s, Robeson, the son of a former slave, thrilled audiences worldwide with both his singing and his acting.  In a mellifluous baritone voice, Robeson gave new vitality to African-American songs that dated to slave plantations.  On the stage, his lead role as Othello in the play of that name gave a distinctly 20th century cast to one of Shakespeare's most enigmatic characters.  He also appeared in a handful of films in the 1930s.  Before becoming a singing and acting superstar, Robeson had been one of the outstanding athletes of his generation, on par with the legendary Jim Thorpe.  Robeson further earned a degree from Columbia Law School and reportedly was conversant in upwards of 15 languages.

Robeson put his multiple talents to use as an advocate for racial and economic justice internationally.  He was among the minority of Americans in the 1930s who linked European Fascism and Nazism to the omnipresent racism he had confronted in America since childhood.  But Robeson's political activism during the Cold War that followed World War II ensnared the world-class Shakespearean actor in a tragedy of Shakespearean dimension, providing a painful denouement to his uplifting life story.

Although Robeson never joined a communist party, he perceived a commitment to full equality in the Soviet Union that was missing in the West.  While many Westerners came to see that their admiration for the Soviet experiment had been misplaced, Robeson never publicly criticized the Soviet Union, and he paid an unconscionably heavy price for his stubborn consistency during the Cold War.  The State Department refused to renew his passport, precluding him from traveling abroad for eight years.  He was hounded by the FBI and shunned professionally.  Robeson had suffered from depression throughout his adult life, but his mental health issues intensified in the Cold War era and included a handful of suicide attempts.  He spent his final years in limbo, silenced, isolated and increasingly despairing, up to his death in 1976.

In No Way But This: In Search of Paul Robeson, Jeff Sparrow, an Australian journalist, seeks to capture Robeson's stirring rise and crushing fall.  The book's subtitle – "In Search of Paul Robeson" – may sound like any number of biographical works, but in this case encapsulates precisely the book's unique quality.  In nearly equal doses, Sparrow's work consists of the major elements of Robeson's life and Sparrow's account of how he set about to learn the details of that life – an example of biography and memoir melding together.  Sparrow visited many of the places where Robeson lived, including Princeton, New Jersey, where he was born in 1898; Harlem in New York City; London and Wales in Great Britain; and Moscow and other locations in today's Russia.

In each location, Sparrow was able to find knowledgeable people, such as archivists and local historians, who knew about Robeson and were able to provide helpful insights into the man's relationship to the particular location.  We learn, for instance, from Sparrow's guides how the Harlem that Robeson knew is rapidly gentrifying today and how the economy of contemporary Wales functions long after closure of the mines which Robeson once visited.  Sparrow's travels to the former Soviet Union take him to several locations where Robeson never set foot, including Siberia, all in an effort to understand the legacy of Soviet terror which Robeson refused to acknowledge.  Sparrow's account of his travels to these diverse places and his interactions with his guides reads at times like a travelogue.  Readers looking to plunge into the vicissitudes of Robeson's life may find these portions of the book distracting.  The more compelling portions are those that treat Robeson's extraordinary life itself.

* * *

            That life began in Princeton, New Jersey, world famous for its university of that name.  The Robeson family lived in a small African-American community rarely visited by those whose businesses and lives depended upon the university.  Princeton was then considered, as Sparrow puts it, a "northern outpost of the white supremacist South: a place ‘spiritually located in Dixie’" (p.29).  William Robeson, Paul's father, had escaped slavery, earned a degree from Lincoln University, and become an ordained Presbyterian minister.  His mother Maria, who came from an abolitionist Quaker family and was of mixed ancestry, died in a house fire when Paul was six years old.  Thereafter, William raised Paul and his three older brothers and one older sister on his own.  William played a formidable role in shaping young Paul, who later described his father as the "glory of my boyhood years . . . I loved him like no one in all the world" (p.19).

William abandoned Presbyterianism for the African Methodist Episcopal Zion Church, one of the oldest black denominations in the country, and took on a much larger congregation in Somerville, New Jersey, where Paul attended high school.  One of a handful of African-American students in a sea of whites, Robeson excelled academically and played baseball, basketball and football.  He also edited the school paper, acted with the drama group, sang with the glee club, and participated in the debating society.  When his father was ill or absent, he sometimes preached at his father’s church.  Robeson’s high school accomplishments earned him a scholarship to nearby Rutgers University.

At Rutgers, Robeson again excelled academically.  He became a member of the Phi Beta Kappa honor society and was selected as class valedictorian.  As in high school, he was also an outstanding athlete, earning varsity letters in football, basketball and track.  A standout in football, Robeson was "one of the greatest American footballers of a generation," so much so that his coach "designed Rutgers' game-plan tactics specifically to exploit his star's manifold talents" (p.49).  Playing in the backfield, Robeson could both run and throw, and his sheer size made him almost impossible to stop.  On defense, his tackling "took down opponents with emphatic finality" (p.49).  Twice named to the All-American Football Team, Robeson was not inducted into the College Football Hall of Fame until 1995, 19 years after his death.

After graduation from Rutgers in 1919, Robeson spent the next several years in New York City.  He enrolled in New York University Law School, then transferred to Columbia and moved to Harlem.  There, Robeson absorbed the heady atmosphere of the Harlem Renaissance, a flourishing of African-American culture, thinking and resistance in the 1920s.  While at Columbia, Robeson met chemistry student Eslanda Goode, known as "Essie."  The couple married in 1921.

Robeson received his law degree from Columbia in 1923 and worked for a short time in a New York law firm.  But he left the firm abruptly when a secretary told him that she would not take dictation from an African-American.  Given his talents, one wonders what Robeson could have achieved had he continued in the legal profession.  It is not difficult to imagine Robeson the lawyer becoming the black Clarence Darrow of his age, the "attorney for the damned," or a colleague of future Supreme Court Justice Thurgood Marshall in the 20th century's legal battles for full African-American rights.  Instead, Robeson gravitated toward singing and acting, while briefly playing semi-professional football and basketball.

Robeson made his mark as a singer by bringing new dignity to African-American songs such as "Sometimes I Feel Like a Motherless Child" and "Swing Low Sweet Chariot" that had originated on the plantations – "sorrow songs" that "voiced the anguish of slavery" (p.81), as Sparrow puts it.  After acting in amateur plays, Robeson won the lead role in Eugene O'Neill's All God's Chillun Got Wings, a play about inter-racial sexual attraction that established Robeson as an "actor to watch" (p.69).  Many of the leading lights of the Harlem Renaissance criticized Robeson's role in the play as reinforcing racial stereotypes, while white reviewers "blasted the play as an insult to the white race" (p.70).  An opportunity to star in O'Neill's Emperor Jones on the London stage led the Robesons to Britain in 1925, where they lived for several years.  The couple's only child, Paul Jr., whom they called "Pauli," was born in London in 1927.

Robeson delighted London audiences with his role in the musical Show Boat, which proved to be as big a hit in Drury Lane as it had been on Broadway.  He famously changed the lines of "Old Man River" from the meek "I'm tired of livin'" and "feared of dyin'" to a declaration of resistance: "I must keep fightin'/Until I'm dyin'".  His rendition of "Old Man River," Sparrow writes, transported the audience "beyond the silly narrative to an almost visceral experience of oppression and pain."  Robeson used his huge frame, "bent and twisted as he staggered beneath a bale, to convey the agony of black history while revealing the tremendous strength forged by centuries of resistance" (p.103).

The Robesons in their London years prospered financially and moved easily in the inner circles of respectable society.  The man who couldn't rent a room in many American cities lived as an English gentleman in London, Sparrow notes.  But by the early 1930s, Robeson had learned to see respectable England as "disconcertingly similar" to the United States, "albeit with its prejudices expressed through nicely graduated hierarchies of social class.  To friends, he spoke of his dismay at how the British upper orders related to those below them" (p.131).

In London, as in New York, the “limited roles that playwrights offered to black actors left Paul with precious few opportunities to display any range. He was invariably cast as the same kind of character, and as a result even his admirers ascribed his success to instinct rather than intellect, as a demonstration not so much of theatrical mastery but of an innate African talent for make-believe, within certain narrow parameters” (p.107). Then, in 1930, Robeson received a fateful invitation to play Othello in a London production, a role that usually went to an actor of Arab background.

Robeson's portrayal of Othello proved triumphant, with the initial performance receiving an amazing 20 curtain calls.  In that production, which ran for six weeks, Robeson transformed Shakespeare's tragedy into an "affirmation of black achievement, while hinting at the rage that racism might yet engender" (p.113).  Thereafter, Othello "became central to Paul's public persona" (p.114), providing a role that seemed ideal for Robeson: a "valiant high-ranking figure of color, an African neither to be pitied nor ridiculed" (p.109).

While in London, Robeson developed sensitivity to the realities of colonial Africa through friendships with men such as Nnamdi Azikiwe, Jomo Kenyatta, and Kwame Nkrumah, future leaders of independence movements in Nigeria, Kenya and Ghana, respectively.  Robeson retained a keen interest in African history and politics for the remainder of his life.  But Robeson's commitment to political activism seems to have crystallized through his frequent visits to Wales, where he befriended striking miners and sang for them.

Robeson supported the Welsh labor movement because of the “collectivity it represented. In Wales, in the pit villages and union lodges and little chapels, he’d found solidarity” (p.149).  Robeson compared Welsh churches to the African-American churches he knew in the United States, places where a “weary and oppressed people drew succor from prayer and song” (p.133).  More than anywhere else, Robeson’s experiences in Wales made him aware of the injustices which capitalism can inflict upon those at the bottom of the economic ladder, regardless of color.  Heightened class-consciousness proved to be a powerful complement to Robeson’s acute sense of racial injustice developed through the endless humiliations encountered in his lifetime in the United States.

Robeson's sensitivity to economic and racial injustice led him in the 1930s to the Soviet Union, which he visited many times and where he and his family lived for a short time.  But a stopover in Berlin on his initial trip to Moscow in 1934 opened Robeson's eyes to the Nazis' undisguised racism.  Nazism to Robeson was a "close cousin of the white supremacy prevailing in the United States," representing a "lethal menace" to black people.  For Robeson, the suffering of African Americans in their own country was no justification for staying aloof from international politics, but rather a "reason to oppose fascism everywhere" (p.153).

With the outbreak of the Spanish Civil War in 1936, Spain became the key battleground to oppose fascism, the place where “revolution and reaction contested openly” and “Europe’s fate would be settled” (p.160).  After speaking and raising money on behalf of the Spanish Republican cause in the United States and Britain, Robeson traveled to Barcelona, where he sang frequently.  Robeson’s brief experience in Spain transformed him into a “fervent anti-fascist, committed to an international Popular Front: a global movement uniting democrats and radicals against Hitler, Mussolini, and their allies” that would also extend democracy within the United States, end colonialism abroad, and “abolish racism everywhere” (p.196-97).

Along with many progressives of the 1930s, Robeson looked to the Soviet Union to lead the global fight against racism and fascism.  Robeson once said in Moscow, “I feel like a human being for the first time since I grew up.  Here I am not a Negro but a human being” (p.198).  Robeson’s conviction that the Soviet Union was a place where  a non-racist society was possible “sustained him for the rest of his political life” (p.202).   Although he never joined a communist party, from the 1930s onward Robeson accepted most of the party’s ideas and “loyally followed its doctrinal twists and turns” (p.215).  It is easy, Sparrow indicates, to see Robeson’s enthusiasm for the Soviet Union as the “drearily familiar tale of a gullible celebrity flattered by the attentions of a dictatorship” (p.199).

Sparrow wrestles with the question of the extent to which Robeson was aware of the Stalinist terror campaigns that by the late 1930s were taking the lives of millions of innocent Soviet citizens.  He provides no definitive answer to this question, but Robeson never wavered publicly in his support for the Soviet Union.  Had he acknowledged Soviet atrocities, Sparrow writes, he would have besmirched the “vision that had inspired him and all the people like him – the conviction that a better society was an immediate possibility” (p.264).

Robeson devoted himself to the Allied cause when the United States and the Soviet Union found themselves on the same side fighting Nazi aggression during World War II, “doing whatever he could to help the American government win what he considered an anti-fascist crusade” (p.190).  His passion for Soviet Russia “suddenly seemed patriotic rather than subversive” (p.196-97).  But that quickly changed during the intense anti-Soviet Cold War that followed the defeat of Nazi Germany.  Almost overnight in the United States, communist party members and their sympathizers became associated “not only with a radical political agenda but also with a hostile state.  An accusation of communist sympathies thus implied disloyalty – and possibly treason and espionage” (p.215).

The FBI, which had been monitoring Robeson for years, intensified its scrutiny in 1948.   It warned concert organizers and venue owners not to allow Robeson to perform “communist songs.”  If a planned tour went ahead, Sparrow writes, proprietors were told that they would be:

judged Red sympathizers themselves. The same operation was conducted in all the art forms in which Paul excelled.  All at once, Paul could no longer record music, and the radio would not play his songs.  Cinemas would not screen his movies. The film industry had already recognized that Paul was too dangerous; major theatres arrived at the same conclusion. The mere rumor that an opera company was thinking about casting him led to cries for a boycott.  With remarkable speed, Paul’s career within the country of his birth came to an end (p.216).

In 1950, the US State Department revoked Robeson's passport after he declined to sign an affidavit denying membership in the Communist Party.  When Robeson testified before the House Un-American Activities Committee (HUAC) in 1956, a Committee member asked Robeson why he didn't go back to the Soviet Union if he liked it so much.  Robeson replied: "Because my father was a slave . . . and my people died to build this country, and I am going to stay here, and have a part of it just like you.  And no fascist-minded people will drive me from it. Is that clear?" (p.228).  Needless to say, this was not what Committee members wanted to hear, and Robeson's remarks "brought the moral weight of the African-American struggle crashing down upon the session" (p.228-29).

Robeson was forced to stay on the sidelines in early 1956 when the leadership of the fledgling Montgomery bus boycott movement (which included a young Dr. Martin Luther King, Jr.) concluded that his presence would undermine the movement's fragile political credibility.  On the other side of the Cold War divide, Soviet leader Nikita Khrushchev delivered a not-so-secret speech that winter to party loyalists in which he denounced the Stalinist purges.  Sparrow hints but doesn't quite say that Robeson's exclusion from the bus boycott and Khrushchev's acknowledgment of the crimes committed in the name of the USSR had a deleterious effect on Robeson's mental well-being.  He had suffered from bouts of depression throughout his adult life, most notably when a love affair with an English actress in the 1930s ended badly (one of several Robeson extra-marital affairs).  But his mental health deteriorated during the 1950s, with "periods of mania alternating with debilitating lassitude" (p.225).

Even after Robeson’s passport was restored in 1958 as a result of a Supreme Court decision, he never fully regained his former zest.  A broken man, he spent his final decade nearly invisible, living in his sister’s care before dying of a stroke in 1976.

* * *

                     Sparrow describes his book as something other than a conventional biography, more of a "ghost story" in which particular associations in the places he visited form an "eerie bridge" (p.5) between Robeson's time and our own.  But his travels to the places where Robeson once lived and his interactions with his local guides have the effect of obscuring the full majesty and tragedy of Robeson's life.  With too much attention given to the search for what remains of Robeson's legacy on our side of the bridge, his part biography, part travel memoir comes up short in helping readers discover Robeson himself on the other side.


Thomas H. Peebles

Paris, France

October 21, 2019


Filed under American Society, Biography, European History, History, Politics, United States History

The Contrarian’s Disconcerting Dualism


Fintan O’Toole, Judging Shaw:

The Radicalism of GBS (Royal Irish Academy, $40.00) 

            By 1920, theatergoers throughout the world recognized the three letters "GBS" as a shorthand reference to George Bernard Shaw, not only the era's most prolific and successful English-language playwright but also a prominent social and political commentator with radical left-wing views.  GBS in 1920 was Shaw's self-created brand, which he cultivated carefully and marketed shamelessly.  In Judging Shaw: The Radicalism of GBS, prominent Irish journalist and cultural critic Fintan O'Toole explores how the brand GBS interacted with Shaw the man and evolved over the years.  O'Toole does so through eight thematic essays, each a section on a separate aspect of Shaw's long life (1856-1950), but without adhering to a strict chronology.  His work is more appraisal than biography.

Author of over sixty plays, among them Man and Superman (1902), Pygmalion (1912) and Saint Joan (1923), Shaw was also a prodigious writer of letters, pamphlets, and speeches.  By one estimate, O'Toole notes, Shaw wrote at least a quarter of a million letters and postcards.  Although he analyzes Shaw's plays, O'Toole also draws liberally upon them and other writings to cast light upon Shaw's social and political thought – upon the "Radicalism of GBS," to use the book's subtitle.  At the book's heart lies Shaw's disconcerting dualism: in the post-World War I era, the outspoken political progressive became an apologist for the totalitarian regimes in Italy, Germany and Soviet Russia, as well as an ostensible proponent of eugenics.  It is primarily in Shaw's capacity as a social and political thinker that O'Toole engages his readers in an exercise in "Judging Shaw," the book's title.

Although not a conventional biography, the book contains a detailed and helpful chronology at the outset, with year-by-year highlights of Shaw's life.  It also contains an impressive series of visual memorabilia between sections, including relevant photos as well as vivid photocopies of letters, drafts of published writings, and other reminders of Shaw's contrarian career.

* * *

                O’Toole’s initial section, “The Invention of GBS,” describes  Shaw as “among the first private citizens in world history to create for themselves a personal brand with global resonance.  GBS was an almost universal signifier” (p.20).  None of Shaw’s predecessors created a brand that was “as deliberate, as resonant, as widespread and as sustained as GBS. He shattered cultural boundaries in ways that still seem breathtakingly bold, confounding the apparently obvious differences between seriousness and showmanship, personality and politics, art and propaganda, the mainstream and the outré, the voice in the wilderness and the voice on the radio, moral purpose and charlatanism” (p.23).  GBS, the “invention of a single, obscure impoverished Irishman,” was “one of the great achievements of the history of advertising” which produced a “unique form of celebrity: a vast popularity that depended on a reputation for insisting on unpopular ideas and causes, for pleasing the public by provoking it to the point of distraction” (p.21-22).  Quite simply, GBS was “Shaw’s greatest character” (p.22).

O’Toole’s initial section also looks at Shaw’s early years growing up in a Protestant family in Dublin.  Shaw’s ancestors on the side of his father had been quite prosperous, but his grandfather lost the family money and his alcoholic father, George Carr Shaw, struggled to earn a living sufficient for Shaw and his two older sisters.  The realization that George Carr was a “drunk,” O’Toole writes, “introduced him to reality in a way that permanently shaped his consciousness” (p.26).   Shaw’s career might be seen as a “backhanded compliment to his family.  His teetotalism and vegetarianism were reactions against the toxicity of alcoholic addiction. His ferocious, almost manic work ethic was surely driven by the fecklessness and failure of his Papa” (p.30-31).

Shaw acquired his artistic sensibility mostly from his mother, Bessie Gurly.  O'Toole recounts how Bessie invited George John Vandeleur Lee, her piano teacher, to live with the family.  Lee became a substitute father for Shaw, from whom the young man derived his lifelong affinity for classical music, along with a "studied individuality of ideas about food and health" (p.37).  Lee had a certain flamboyance about him that presaged the GBS brand.  Shaw's relationship to Lee involved a process of "mentally killing off his real father and replacing him, for a time at least, with Lee" (p.36-37), O'Toole writes.  There was some speculation that Lee might have been Shaw's actual father.  This is surely wrong, O'Toole argues, but if the young Shaw looked like Lee, the reason was "not genetic but mimetic.  Consciously or not, he imitated the man who had displaced his father.  Shaw never explicitly acknowledged Lee's influence on him, but it is stamped on one of his most successful plays, Pygmalion. . . [where] Henry Higgins is a mélange of GBS and Lee" (p.38).

Shaw left Dublin for London in April 1876, three months before his 20th birthday, the "culmination of an imaginative process of slow disengagement from Dublin and thus from the physical realities of his youth" (p.47).  With Shaw's arrival in London, where he lived for most of the rest of his years, O'Toole abandons any pretense at chronological biography in favor of his thematic essays.  One essay, "GBS versus England," addresses Shaw's general relationship to England, where he always retained a sense of himself as an exile.  It is followed by "GBS versus Ireland," in which O'Toole explains Shaw's relationship to Ireland and the Irish independence movement during his adult years.  Shaw "always saw an independent Ireland remaining voluntarily as an active member of a democratized Commonwealth.  But he never deviated from a passionate insistence that Ireland was and must be its own country and that British rule was an illegitimate imposition.  He insisted that aggressive Irish nationalism was a fever that could be cured only by freedom" (p.113).

In the next section, “The Thinking Cap and the Jester’s Bells,” O’Toole turns specifically to Shaw’s plays and how he used the stage to shatter multiple norms.  Shaw wrote in a society and a culture “deeply committed to notions of human difference – that the upper class was vastly different from the lower, the imperial power from its subjects, the superior races from the inferior.”  Shaw’s dramaturgy was a “conscious revolt against these notions” (p.153).  Shaw used the stage to suggest that “how we behave is a function not of our characters, but of social roles and circumstance” (p.162-63).  O’Toole compares Shaw’s characters to a set of Russian dolls: “we never know whether, if enough layers were exposed, we would actually find a ‘real’ self. . . [T]he haunting thought is that the real self may not exist” (p.170).

Unlike most playwrights of his day, Shaw took great care in preparing prefaces to his plays.  The prefaces helped Shaw's readers and viewers see him "not as a famous playwright but as a famous man who wrote plays and used his celebrity to generate an audience for them" (p.95).  Shaw's plays were democratic in their themes but also in their targeted audiences and readership: persons of modest income and education, the first generation of mass readers.  Shaw's plays appealed to:

the millions who devoured newspapers and haunted public libraries, who joined trade unions and feminist organizations, social clubs and socialist societies, who hungered for ideas about the world. . . The history of the cheap paperback book is intertwined with the history of GBS. And not for nothing – they both belonged in the hands of working men and women (p.308-09).

In two sections, “GBS’s War on Poverty” and “The Lethal Chamber: The Dark Side of GBS,” O’Toole draws heavily on Shaw’s plays as well as his other writings to set out the contours of Shaw’s political and social thought.  At least until the 1960s, Shaw was “by far the most widely read socialist thinker in the English language.  And at the heart of his thought was that visceral hatred of poverty he breathed in with the fetid air of the Dublin slums” (p.197).  More than any other factor, Shaw’s deep hatred for economic oppression and inequality shaped his social thought.

Shaw challenged the perception of poverty as a “product of personal failure or mere bad luck, or as a necessary and inevitable corollary of economic progress” (p.198).  For Shaw, poverty was “not the cause of crime – it is the crime” (p.204).  Moralizing constructs like the “deserving poor” were only “self-serving cant” (p.310).  Shaw began to write in an era like ours, O’Toole observes, when wealth was expanding rapidly but distributed ever more unequally, giving his thought “renewed relevance in the twenty-first century” (p.198).

Shaw was one of the first intellectuals to suggest that children have rights independent of their parents.  He became a fierce fighter for women's suffrage and advocated for repeal of laws against consensual adult homosexual activity.  Almost alone among public figures, Shaw stood by and defended Oscar Wilde when Wilde was released from prison after serving nearly two years for "gross indecency," i.e., homosexual acts (the subject of a review here earlier this year).

But Shaw’s progressive heroism was more than tempered for me by O’Toole’s section “The Lethal Chamber: The Dark Side of GBS,” in which the task of “judging Shaw” considers his embrace of some of the 20th century’s darkest moments: Fascism, Nazism and Communism.  Shaw also appeared to embrace the now discredited notion of eugenics, the use of selective breeding to “ensure that ‘bad’ human traits, ranging from physical and mental disabilities to moral delinquency, were ‘bred out’ of the human race” (p.267).  O’Toole provides startling quotations in which Shaw seems to support not just determining who should be allowed to give birth but also a massive increase in capital punishment for those inclined to criminality or what was considered deviant behavior.  “A part of eugenic politics,” Shaw told an audience in 1910, “would finally land us in an extensive use of the lethal chamber.  A great many people would have to be put out of existence simply because it wastes other people’s time to look after them” (p.268).  Shaw’s critics jumped on this and similar statements as evidence of the extremes to which his socialism invariably led.

Here, O’Toole turns lawyer for Shaw’s defense.  Shaw’s critics were willfully missing the irony behind his provocative suggestions, O’Toole argues.  Shaw was using the device of “pushing an idea to a grotesque conclusion in order to highlight an absurdity or an injustice” (p.269).  O’Toole compares Shaw to the Anglo-Irish satirist Jonathan Swift (1667-1745), who argued in a deadpan tone that the rich should be allowed to eat the children of the poor.  But when O’Toole comes to Shaw’s attraction to Nazism and Fascism in the 1930s, he admits that he cannot serve effectively as Shaw’s lawyer.

Shaw imagined fascism as an "incomplete and underdeveloped version of his own communism" (p.277), O'Toole writes.  He saw Mussolini's persecution of left-wing parties "not as part of the essence of fascism, but merely as a mistake" (p.277).  After a 1927 lunch with famed socialists Sidney and Beatrice Webb, Beatrice recorded that Shaw had "gabbled" on the subject of Mussolini, demonstrating that he had "lost touch with political reality" and "could no longer be taken seriously as a political thinker" (p.276).  Webb blamed Shaw's enthusiasm for Mussolini on his intellectual isolation and weakness for flattery, the result of his "living a luxurious life in the midst of a worthless multitude of idle admirers" (p.277; Webb's notes on this lunch appear as one of the between-section visuals, at p.294-95).

The Webbs must have been even more aghast at Shaw a few years later as Hitler rose to power in Germany.  Shaw had presciently seen the folly of the Versailles Treaty and, like John Maynard Keynes, had argued that it was little more than an invitation to another war.  Shaw's early lack of objections to Hitler may have stemmed in part from his view of Hitler's rise as a natural reaction to Versailles.  "His sympathy for Hitler was driven in part by a sense that the rise of the Nazi leader was proving GBS's warnings correct," O'Toole writes (p.281).  Shaw supported Hitler's withdrawal from the League of Nations, repudiation of the Treaty of Versailles, and rapid rebuilding of Germany's armed forces.

Throughout the 1930s, Shaw maintained a “hopeless inability to understand what Nazism was about” (p.279).  Although Shaw despised Nazi racial theories, as he despised all racial theories, his “great delusion” was to think that the problem with anti-Semitism was an “excrescence of the ‘great Nazi movement’ that must be capable of something nobler. . . What Shaw seemed incapable of grasping was that anti-Semitism was not a stain on the otherwise pure cloth of Nazism. It was Hitler’s primary color” (p.279-80).  Shaw “blinded himself to the murderousness implicit in Nazism and choreographed his own ridiculous dance around one of the central realities of the 1930s” (p.282).  It was only after Germany invaded the Soviet Union that Shaw admitted he had been wrong about Hitler’s intentions.  But here, too, his apology was couched in terms that were neither “gracious” nor a “searching self-reflection – Shaw essentially apologized for Hitler not being as intelligent as GBS” (p.288).

Shaw’s infatuation with Communism is easier to square with his left-wing political outlook.   Shaw was hardly the only Westerner of a leftist bent who saw a potential “socialist paradise” in the Soviet Union of the 1930s and applauded its apparent rapid modernization while the Western democracies remained mired in a worldwide economic depression.  O’Toole recounts an interview with Stalin that Shaw and Nancy Astor conducted when the pair traveled to Moscow in 1931.  Astor, Britain’s first female parliamentarian although an American by birth, asked Stalin why he slaughtered so many people.  Shaw seemed to have been satisfied with Stalin’s “bland assurance that ‘the need for dealing with political prisoners drastically would soon cease’” (p.279).  Thereafter, O’Toole indicates, Shaw’s view of Stalin “approached hero-worship: a photograph of Stalin was beside his deathbed, though with characteristic perversity it was balanced by one of Mahatma Gandhi” (p.278-79).

As he considers Shaw’s embrace of these totalitarian regimes as part of the task of “judging Shaw,” O’Toole sounds more like a prosecutor delivering an impassioned closing argument:

The great seer failed to see the true nature of fascism, Nazism and Stalinism. The great skeptic allowed himself to believe just what he wanted to believe, that the totalitarian regimes of Mussolini, Hitler and Stalin were rough harbingers of real progress and true democracy.  GBS was by no means the only artist or intellectual to be deluded by the promises of regimes that ‘got things done’ while democracies struggled to end the Great Depression.  But no other artist or intellectual had his standing as a global sage.  His sagacity proved to be useless when it mattered most (p.275).

After wearing both a defense lawyer's hat and that of a prosecutor, O'Toole seems to find a judicial robe when he reminds his readers that Shaw's dark phase coincided with a nearly barren period for him as a playwright and writer: from the late 1920s through World War II, Shaw's output came almost to a halt.  In O'Toole's view, the Great War marked the death of GBS, depriving Shaw of his most potent message.  Shaw had used mockery, paradox and comic absurdity to remind his readers and viewers that what was termed "civilization" was merely a "veneer on cruelty and hypocrisy.  But the Great War swatted aside the gadfly.  It revealed, through the scale of its horror, all the hidden truths that GBS had delighted in exposing" (p.240).

The great failure of GBS the sage in the post-World War I era, O'Toole contends, "cannot be divorced from the waning of the powers of GBS the dramatist.  It was in his art that Shaw tested and contradicted and argued with himself.  But that ability dried up" (p.289).  Unlike artistic creators as varied as Beethoven, Titian, Goya and W.B. Yeats, all of whom found newborn creativity late in life, Shaw was "unable to develop a successful late style" (p.289).  His last great play, Saint Joan, came in 1923, when he was 67.  He "long outlived the GBS who could spin ideas and contradictions on the end of his fingertips" (p.290).

* * *

                The GBS brand may have died in the wake of World War I, and Shaw the social and political commentator remains tainted by his dalliances with the totalitarian ideologies of the 1930s.  Yet, in closing out this erudite and elegantly written exercise in judging Shaw, O'Toole concludes that nearly three quarters of a century after his death, Shaw's status as playwright and artist – and contrarian – seems "more secure now than might have been predicted even a few decades ago" (p.305-06).  Shaw's revolutionary impact continues to lie in his insistence that the "right to question everything, to hold nothing sacred" belongs to the "common man and woman.  And that it was not just a right – it was a duty" (p.306).

Thomas H. Peebles

La Châtaigneraie, France

September 29, 2019


Filed under British History, English History, History, Literature, Political Theory, Politics

Relighting The City of Light


Agnès Poirier, Left Bank:

Art, Passion, and the Rebirth of Paris, 1940-1950 

(Henry Holt & Co., $30)

             Agnès Poirier, a Paris-born and London-educated journalist, takes on two weighty subjects in Left Bank: Art, Passion, and the Rebirth of Paris, 1940-1950: Parisian artistic, cultural and intellectual life during what was surely Paris's darkest 20th century period, the four years of German occupation, 1940-44; and the efforts to restore the City of Light to its former eminence in all things artistic, cultural and intellectual in the remaining years of the turbulent decade.  Her book consists primarily of short anecdotes or vignettes – what she terms a "collage of images" (p.4) – about some of the leading artistic and intellectual personalities in 1940s Paris.  With much emphasis upon the shifting romantic attachments among these personalities, the book has a gossipy flavor.

             The grim occupation years constitute only about a third of Poirier’s narrative.  The book begins to gather momentum when she turns to the second of her two weighty subjects, how artists and intellectuals sought to regain their footing in the second half of the decade.  One of the primary tasks Poirier sets for herself is to capture the euphoria that accompanied the liberation of Paris in 1944 and the end of hostilities the following year.

            Simone de Beauvoir and her life-long partner Jean-Paul Sartre are undoubtedly the book's lead characters, the personalities Poirier returns to consistently in this collection of anecdotal portraits.  In an appropriate rebalancing from other works on philosophy's ultimate power couple, Beauvoir receives more attention than Sartre; she is the book's star.  Behind Sartre in supporting roles are novelists Albert Camus and Arthur Koestler (both Camus and Koestler have been subjects of books reviewed on this blog, here and here).  In addition to these four, Poirier provides glimpses of a dizzying number of luminaries who congregated at the Café de Flore and the other Left Bank cafés along or near the Boulevard St. Germain where Beauvoir and Sartre hung out.

            At the outset, Poirier provides a list of 32 individuals who make up the book's "Cast of Characters" — her "band of brothers and sisters" (p.2), as she terms them, the men and women who appear in the book's vignettes as we glance at their personalities and interactions in their professional and personal lives.  Among the continentals likely to be familiar to readers are Raymond Aron, Henri Cartier-Bresson, Jean Cocteau, Maurice Merleau-Ponty, Pablo Picasso, Jean Paulhan, and Simone Signoret.  Irish playwright Samuel Beckett makes several appearances.  There is also a heavy contingent of Americans, including James Baldwin, Sylvia Beach, Saul Bellow, Art Buchwald, Alexander Calder, Miles Davis, Janet Flanner, Ernest Hemingway, Norman Mailer, Henry Miller, Theodore White, and Richard Wright.  They were "budding novelists, philosophers, painters, composers, anthropologists, theorists, artists, photographers, poets, editors, publishers and playwrights" (p.1).  Most were under 40 when the war ended and many either came to Paris from abroad in the post-war period or returned to Paris after taking refuge elsewhere during the war years.

            Together, Poirier’s band of brothers and sisters:

founded the New Journalism, which . . . forever blurred the lines between literature and reportage.  Poets and playwrights slowly buried Surrealism and invented the Theater of the Absurd; budding painters transcended Socialist Realism, pushed Geometric Abstraction to its limits, and fostered Action Painting.  Philosophers founded new schools of thought such as Existentialism . . . Aspiring writers found their voices in Paris's gutters and the decrepit student rooms of Saint-Germain-des-Prés, while others invented the nouveau roman.  Photographers reclaimed their authorship through photojournalism agencies . . . black [American] jazz musicians, fleeing segregation at home, found consecration in the concert halls and jazz clubs of Paris, where New Orleans jazz received its long-overdue appreciation while bebop was bubbling up (p.2).

            Despite this range of creativity and artistic output, Poirier argues that after 1944 “everything was political” in Paris (p.2).  Among Left Bank denizens, there was a near-universal conviction that France desperately needed an independent, social democratic political movement.  Such a movement would challenge not only the Communist Party that dominated the post-war French political landscape but also the hyper-nationalist Gaullist movement that was the Communists’ main rival and the American free-market capitalist model that hovered in the background.

            The elusive quest for an independent political movement constitutes one of two threads that tie together the book’s vignettes and elevate the text to something more than a series of gossipy anecdotes.  The second is Beauvoir herself, especially the development of her seminal work, The Second Sex, and how she became a model of female emancipation, breaking social conventions by combining intellectual ambition, financial independence and sexual freedom.

* * *

            Poirier’s vignettes begin before the German occupation and follow a rough chronological order.  She provides some sense of how Parisian artists and intellectuals survived the war years.  A deeper and more satisfying study of Parisian artistic and cultural life during the occupation may be found in Alan Riding’s And the Show Went On, reviewed here in September 2012.  As in Riding’s book, Poirier delves into France’s pained efforts to decide how to come to terms with these dark years, and how writers, artists, and intellectuals fared after the war in a process termed épuration, best translated as “purification.”

            Épuration in Poirier's view amounted to little more than an entire nation adopting the fiction that almost everyone had been part of the resistance during the occupation, with Charles de Gaulle willing to go along with and even encourage this fiction as the most effective means to reunite a divided and demoralized nation.  But she characterizes the épuration process as a "murky affair, and the discrepancy in punishments opened up a national debate on the nature of revenge and justice.  Never more public than among writers and journalists, the debate tore friends apart" (p.74).

            One writer, Robert Brasillach, who had collaborated with the Nazi occupiers and the Vichy government, was executed, notwithstanding petitions signed by Sartre, Camus and other writers who abhorred Brasillach's political convictions yet pleaded to spare his life.  Every other writer with collaborationist tendencies avoided this fate.  Poirier wryly notes the generosity underlying the épuration process, which seemed designed to demonstrate that it was "never too late to be a patriot" (p.71).  Those French men and women who had lived through the four years of the occupation "with the shame of the armistice but had not had the courage to join the Resistance or go to London seized the opportunity to clear their conscience" (p.71).  As Beauvoir observed, the joie de vivre which liberation generated was "tempered by the shame of having survived" (p.76).

            Poirier earnestly seeks to capture that tempered joie de vivre.  As Paris's galleries, boulevards, jazz clubs, cafés, bistros, and bookshops returned to life, the city's intellectuals and artists were, she writes, eager to unleash their "unquenchable thirst for freedom in every aspect of their lives.  Whether born into the working class or the bourgeoisie, they wanted little to do with their caste's traditions and conventions or with propriety.  Family was an institution to be banished, children a plague to avoid at all costs" (p.116).  It was, Poirier writes, a time to "stare at the reality with lucidity in order to change it" (p.95), a time to "experiment with life, love, and ideas, to throw away conventions, to reinvent oneself, and to reenchant the world" (p.95).

            It was also a time of sexual liberation, for women as much as men, Poirier emphasizes, with a high percentage of assertive bisexual women and "female Don Juans" (p.3) among the book's cast of characters.  In post-war Paris, where such traditional categories as race and religion were deemed inconsequential, gender still counted.  "Only the strongest of women survived," Poirier writes, but those who did "shook the old male order."  Using their "wizardry with words, images, and concepts," Left Bank women "revolutionized not only philosophy and literature but also film and modern art" (p.288-89).  The romantic interactions among the band of brothers and sisters, a theme to which Poirier returns consistently, were multiple and complicated, none more so than those involving Beauvoir herself.

            By 1944, Beauvoir and Sartre were more like business partners, writing and generating ideas together while pursuing romantic interests independently (not unlike Franklin and Eleanor Roosevelt; see Joseph Lelyveld's His Final Battle: The Last Six Months of Franklin Roosevelt, reviewed here in March 2017).  Beauvoir's interests included both Koestler and Camus.  Beauvoir was attracted to Koestler upon reading the initial proofs of his signature novel, Darkness at Noon.  Poirier describes in some detail one intensive soirée the couple spent together, perhaps their only one, after which Beauvoir concluded that Koestler was a "violent and impulsive man, a world-weary seducer" (p.122).

            The electricity between Beauvoir and Camus lasted longer although, we learn, Camus was frightened by Beauvoir's intelligence.  Beauvoir, by contrast, had "no reservations about [Camus].  Nothing in him could turn her off, except perhaps his moralizing" (p.120).  Poirier also pays close attention to Beauvoir's most sustained and deepest heterosexual relationship in the post-war period, with American writer Nelson Algren.  During her affair with Algren, Beauvoir appears to have suspended her liaisons with younger women, many of whom Sartre also pursued.

            Both Sartre and Camus "often fell for pretty students and groupies."  It was easy, Poirier writes: "Those young lovers were enthusiastic, malleable, a little naïve perhaps, and would recover in no time once the affair with the great man had ended" (p.120).  The two men, at different times, also pursued Koestler's girlfriend and wife-to-be Mamaine Paget, while Koestler was ranging widely among the Left Bank female contingent.  One of Koestler's liaisons was with Dominique Aury, who translated some of Koestler's articles into French.

            Poirier describes Aury as a "seductress, a conqueror of both men and women [who] knew how to snare her prey" (p.125).  Aury's most intriguing relationship was not with Koestler but with Édith Thomas, a relationship Aury struck up after leaving an unidentified man whom Poirier speculates might have been Camus.  Both Aury and Thomas had been part of the resistance during the war.  Aury was from a traditional, right-wing Catholic family; Thomas, a published novelist, was a member of the Communist Party.  Despite these differences, they shared a "passion that would transport Édith to a state of delirious ecstasy" (p.127).  But Aury, a "born huntress" who after passion subsided "always chased other prey" (p.125), ended the affair with Thomas for another with a married man, famed publisher and literary critic Jean Paulhan, leaving Thomas with a sense of betrayal she never got over.  And so it went on the Left Bank.

            Amidst the musical beds, Sartre and Camus endeavored to establish a social democratic alternative to the Communist party.  By 1946, the first full year after the war, France’s leading political parties were the Communists and the Gaullists. The non-Communist Socialist Party was too splintered to be a force attractive to Parisian artists or intellectuals.  Nor were they attracted to the bourgeois nationalism and cautious reformism of the Gaullists. The French Communist Party’s association with wartime resistance, moreover, gave it a huge advantage with the public.

            The “terrible and deep guilt felt by so many French people who did not take part in active résistance against the Nazi occupiers meant that they could not and would not criticize the Communists, the most active members of the French Resistance” (p.158).  With a “kind of spiritual power over the youth and the intelligentsia,” the Communist party was for many a “conscience, and a magnet” (p.138).  The party was “intent on invading every nook and cranny of public life” (p.138).  It received a jolt of positive energy when Picasso officially joined in 1944.

            Few Left Bank artists or intellectuals followed Picasso's example, although in Poirier's view most saw Communism, on balance and in theory, as preferable to capitalism.  Beauvoir and Sartre certainly felt the tug of the party's appeal, but were not ready to "bargain their freedom, esprit critique, and independence for the sake of acceptance into the Communist fold" (p.143).  Although Camus and Koestler had both flirted with Communism in their younger days, by 1946 they were among the Left Bank's most vocal anti-Communists.

            Camus had credibility as having been an active member of the resistance (Sartre, by contrast, was an "armchair" resistant, although he had been jailed during the war, p.67).  Koestler, even more firmly anti-Communist, leaned toward the Gaullists and tried with limited success to convince his Left Bank associates that there was "no greater threat in Europe than Communism" (p.153).  Although the Left Bank denizens may have been clashing among themselves at the Café de Flore and elsewhere over arcane philosophical and historical implications of Marxism and Communism, the Party, hardly into nuance, "put them all in the same bag" (p.160) as traitors.  The influential French Communist press "used all their might to promote their ideology and to attack all those who did not think like them, including Camus, Sartre, and Beauvoir" (p.77).

            Poirier credits Camus with having a vision for a political alternative. Rather than promote a “tired compromise between left and right,” Camus “dreamt of a humanist socialism, of new boldness in politics, a fresh, harsh and pure new elite coming from the Resistance to rule over an old country.   He dreamt of social justice and of individual freedoms” (p.136-37).   But it fell to Sartre to take concrete steps toward building a new political party, the Rassemblement Démocratique et Revolutionnaire (the Democratic and Revolutionary Alliance or RDR), which he co-founded in 1948 with novelist David Rousset.

            The RDR’s aim was to unite the non-Communist Left and promote an independent Europe as a bridge between the Soviet and American blocs. The RDR opened to a big splash, with the public support of Camus.   But the party never really got off the ground.  The cantankerous Left Bank intellectuals “proved too heterogeneous a mélange to speak with one strong voice” (p.157) on the notion of a third force that would be hostile to both the Gaullists and the Communists. The inglorious end of the RDR prompted Sartre to withdraw from political activism.  “No more party membership. From now on, there would be only literature” (p.265).

            Beauvoir lent little support to Sartre’s RDR project. She was feverishly at work on what had started as a short essay and grew into the voluminous The Second Sex (Le deuxième sexe in French), as well as deeply involved with Nelson Algren.  The Second Sex, a meticulously researched and easily accessible work, sought to demonstrate “how men oppress women” (p.285).  Looking at biology, psychoanalysis, and history, Beauvoir found “numerous examples of women’s ‘inferiority’ taken for granted but never had she found a convincing justification for it” (p.285).  Poirier summarizes Beauvoir’s composite portrait of women as:

conditioned by society into accepting a passive, dependent, objectified existence, deprived as they are of subjectivity and the ambition to emancipate themselves through financial independence and work. Whether daughter, wife, mother, or prostitute, women are made to conform to stereotypes imposed by men . . . [O]nly through work, and thus economic independence, can women obtain autonomy and freedom (p.286)

             The Second Sex’s brilliance lay in Beauvoir’s “rigorous intellectual approach combined with a cool and superbly concise style” (p.132). Beauvoir “at last” was “considered worldwide an equal to Sartre and Camus” (p.285).

* * *

            Poirier indicates at the outset that Left Bank is not an “academic analysis” (p.4), to which she could have added that the book is not a conventional history either.  Herein lies the primary reason why, despite its weighty subject matter, the book fell short of my expectations.  While Poirier’s anecdotal portraits of the leading lights in Paris during a fraught decade make for entertaining reading, they don’t add up to a portrait of the city itself during those years.

Thomas H. Peebles

La Châtaigneraie, France

February 26, 2019

Filed under French History, History, Intellectual History, Politics

Just How Machiavellian Was He?

Erica Benner, Be Like the Fox:

Machiavelli’s Lifelong Quest for Freedom 

            Niccolò Machiavelli (1469-1527), the Florentine writer, civil servant, diplomat and political philosopher, continues to confound historians, philosophers and those interested in the genealogy of political thinking.  His name has become a well-known adjective, “Machiavellian,” referring to principles and methods of expediency, craftiness, and duplicity in politics.  Common synonyms for “Machiavellian” include “scheming,” “cynical,” “shrewd” and “cunning.”  For some, Machiavellian politics constitute nothing less than a prescription for maintaining power at any cost, in which dishonesty is exalted and the killing of innocents authorized if necessary.  Machiavelli earned this dubious reputation primarily through his best known work, The Prince, published in 1532, five years after his death, in which he purported to advise political leaders in Florence and elsewhere – “princes” – on how to maintain power, particularly in a republic, where political leadership is not based on monarchy or titles of nobility and citizens are supposed to be on equal footing.

            But to this day there is no consensus as to whether the adjective “Machiavellian” fairly captures the Florentine’s objectives and outlook.  Many see in Machiavelli an early proponent of republican government and consider his thinking a precursor to modern democratic ideas.  Erica Benner, author of two other books on Machiavelli, falls squarely into this camp.  In Be Like the Fox: Machiavelli’s Lifelong Quest for Freedom, Benner portrays Machiavelli as a “thorough-going republican” and a “eulogist of democracy” who “sought to uphold high moral standards” and “defend the rule of law against corrupt popes and tyrants” (p.xvi).  Benner discounts the shocking advice of The Prince as bait for tyrants.

            Machiavelli wore the mask of helpful advisor, Benner writes, “all the while knowing the folly of his advice, hoping to ensnare rulers and drag them to their ruin” (p.xv).  As a “master ironist” and a “dissimulator who offers advice that he knows to be imprudent” (p.xvi), Machiavelli’s hidden intent was to “show how far princes will go to hold on to power” and to “warn people who live in free republics about the risks they face if they entrust their welfare to one man” (p. xvi-xvii).   A deeper look at Machiavelli’s major writings, particularly The Prince and his Discourses on Livy, nominally a discussion of politics in ancient Rome, reveals Machiavelli’s insights on several key questions about republican governance, among them: how can leaders in a republic sustain power over the long term; how can a republic best protect itself from threats to its existence, internal and external; and how can a republic avoid lapsing into tyranny.

            Benner advances her view of Machiavelli as a forerunner of modern liberal democracy by placing the Florentine “squarely in his world, among his family, friends, colleagues and compatriots” (p.xix).  Her work has some of the indicia of biography, yet is unusual in that it is written almost entirely in the present tense.  Rather than setting out Machiavelli’s ideas on governance as abstractions, she has taken his writings and integrated them into dialogues, using italics to indicate verbatim quotations – a method which, she admits, “transgresses the usual biographical conventions” but nonetheless constitutes a “natural way to show [her] protagonist in his element” (p.xx).  Benner’s title alludes to Machiavelli’s observation that a fox has a particular kind of cunning that can recognize traps and avoid snares.  Humans need to emulate a fox by being “armed with mental agility rather than physical weapons” and developing a kind of cunning that “sees through ruses, decent words or sacred oaths” (p.151).

            Machiavelli’s world in this “real time” account is almost Shakespearean, turning on intrigue and foible in the pursuit and exercise of power, and on the shortsightedness not only of princes and those who worked for them and curried their favor, but also of those who worked against them and plotted their overthrow.  But Benner’s story is not always an easy one to follow.  Readers unfamiliar with late 15th and early 16th century Florentine politics may experience difficulty in constructing the big picture amidst the continual conspiring, scheming and back-stabbing.  At the outset, in a section termed “Dramatis Personae,” she lists the story’s numerous major characters by category (e.g., family, friends, popes), and readers will want to consult this helpful list liberally as they work their way through her rendering of Machiavelli. The book would also have benefitted from a chronology setting out in bullet form the major events in Machiavelli’s lifetime.

* * *

               Florence in Machiavelli’s time was already at its height as the center of the artistic and cultural flourishing known as the Renaissance.  But Benner’s story lies elsewhere, focused on the city’s cutthroat political life, dominated as it was by the Medici family.  Bankers to the popes, patrons of Renaissance art, and masters of political cronyism, the Medici exercised close to outright control of Florence from the early 15th century until thrown out of power in 1494, with the assistance of French king Charles VIII, at the outset of Machiavelli’s career. They recaptured control in 1512, but were expelled again in 1527, months before Machiavelli’s death, this time with the assistance of Hapsburg Emperor Charles V.  Lurking behind the Medici family were the popes in Rome, linked to the family through intertwined and sometimes familial relationships.  In a time of rapidly shifting alliances, the popes competed with rulers from France, Spain and the mostly German-speaking Holy Roman Empire for worldly control over Florence and Italy’s other city-states, duchies and mini-kingdoms, all at a time when ominous challenges to papal authority had begun to gather momentum in other parts of Europe.

           The 1494 plot that threw Piero de’ Medici out of power was an exhilarating moment for the young Machiavelli.  Although Florence under the Medici had nominally been a republic — Medici leaders insisted they were simply “First Citizens” — Machiavelli and other Florentines of his generation welcomed the new regime as an opportunity to “build a republic in deed, not just in name, stronger and freer than all previous Florence governments” (p.63).  With the Medici outside the portals of power, worthy men of all stripes, and not just Medici cronies, would be “free to hold office, speak their minds, and play their part in the great, messy, shared business of civil self-government” (p.63).

              Machiavelli stepped onto the Florentine political stage at this optimistic time.  He went on to serve as a diplomat for the city of Florence and held several high-level civil service positions, including secretary – administrator – for Florence’s war committee.   In this position, Machiavelli promoted the idea that Florence should abandon its reliance upon mercenaries with no fixed loyalties to fight its wars and cultivate its own homegrown fighting force, a “citizens’ militia.”

         Machiavelli’s civil service career came to an abrupt halt in 1513, shortly after Giuliano de’ Medici, with the assistance of Pope Julius II and Spanish troops, wrested back control over Florence’s government. The new regime accused Machiavelli of participating in an anti-Medici coup.  He was imprisoned, tortured, and banished from government, spending most of the ensuing seven years on the family farm outside Florence. Ironically, he had reconciled with the Medici and re-established a role for himself in Florence’s government by the time of the successful 1527 anti-Medici coup, two months prior to his death.   Machiavelli thus spent his final weeks as an outcast in a new government that he in all likelihood supported.

         The Prince and the Discourses on Livy took shape between 1513 and 1520, Machiavelli’s period of forced exile from political and public life, during which he drew upon his long experience in government to formulate his guidance to princes on how to secure and maintain political power. Although both works were published after his death in 1527, Benner uses passages from them — always in italics — to illuminate particular events of Machiavelli’s life.  Extracting from these passages and Benner’s exegesis upon them, we can parse out a framework for Machiavelli’s ideal republic.  That framework begins with Machiavelli’s consistent excoriation of the shortsightedness of the ruling princes and political leaders of his day, in terms that seem equally apt in ours.

                To maintain power over the long term, leaders need to eschew short-term gains and benefits and demonstrate, as Benner puts it, a “willingness to play the long game, to pit patience against self-centered impetuosity” (p.8). As Machiavelli wrote in the Discourses, “for a prince it is necessary to have the people friendly; otherwise he has no remedy in adversity” (p.167).  A prince who thinks he can rule without taking popular interests seriously “will soon lose his state . . . [E]ven the greatest princes need to deal transparently with their allies and share power with their people if they want to maintain their state” (p.250).  Governments that seek to satisfy the popular desire are “firmer and last longer than those that let a few command the rest” (p.260).   Machiavelli’s long game thus hints at the modern notion that the most effective government is one that has the consent of the governed.

            Machiavelli’s ideal republic was not a democracy based upon direct rule by the people, but rather one founded upon what we today would term the “rule of law.”  In his Discourses, Machiavelli argued that long-lasting republics “have had need of being regulated by the laws” (p.261).  It is the “rule of laws that stand above the entire demos and regulate the relations between ‘its parts,’ as he calls them,” Benner explains, “so that no class or part can dominate the others” (p.275).  Upright leaders should put public laws above their own or other people’s private feelings.  They should resist emotional appeals to ties of family or friendship, and punish severely when the laws and the republic’s survival so demand.  Arms and justice together are the foundation of Machiavelli’s ideal republic.

            Several high-profile executions of accused traitors and subversives convinced Machiavelli to reject the idea that when a republic is faced with internal threats, “one cannot worry too much about ordinary legal procedures or the rights of defendants” (p.121).  No matter how serious the offense, exceptional punishments outside the confines of the law “set a corrupting precedent” (p.121).  Machiavelli’s lifelong dream that Florence should cultivate its own fighting force rather than rely upon mercenaries to fight its wars with external enemies arose out of similar convictions.

             In The Prince and the Discourses, Machiavelli admonished princes that the only sure way to maintain power over time is to “arm your own people and keep them satisfied” (p.49).  Cities whose people are “free, secure in their livelihood, respected and self-respecting, are harder to attack than those that lack such robust arms” (p.186). Florence hired mercenaries because its leaders didn’t believe their own people could be trusted with arms. But mercenaries, whose only motivation for fighting is a salary, can just as easily turn upon their employers’ state, hardly a propitious outcome for long-term sustainability.

               During Machiavelli’s time in exile, the disputatious monk Martin Luther posted his Ninety-Five Theses onto a church door in German-speaking Wittenberg, challenging a wide range of papal practices.  Luther’s provocation set in motion the Protestant Reformation and, with it, more than a century of bloody conflict in Europe between Protestants and Catholics.  The Prince became an instrument in the propaganda wars stirred up by the Reformation, Benner contends, with Machiavelli demonized “mostly by men of religion, both Catholic and Protestant” (p.xv), who saw in the Florentine’s thinking a challenge to traditional relations between church and state.

              These men of religion rightly perceived that the church would have little role to play in Machiavelli’s ideal republic.  In the Discourses, Benner explains, Machiavelli argued that the Christian “sect,” as he called it, had “always declared war on ideas and writings that it could not control – and especially on those that presented ordinary human reasoning, not priestly authority, as the best source of guidance in private and political life” (p.317).  Men flirt with disaster when they purport to know the unknowable under the guise of religious “knowledge.”  For Machiavelli, unchanging, universal moral truths can be worked out only through a close study of human interactions and reflections on human nature.  Instead of praying for some new holy man to save you, Machiavelli advised, “learn the way to Hell in order to steer clear of it yourself” (p.282).   These views earned all of Machiavelli’s works a place on the Catholic Church’s 1557 Index of Prohibited Books, one of the Church’s solutions to the heresies encouraged by the Reformation, where they remained until 1890.

* * *

              The ruthlessly duplicitous Machiavelli – his “evil double” (p.xiv), as Benner puts it — is barely present in Benner’s account.  Her Machiavelli, an “altogether human, and humane” (p.xvi) commentator and operative on the political stage of his time, exudes few of the qualities associated with the adjective that bears his name.

Thomas H. Peebles

La Châtaigneraie, France

October 25, 2018

Filed under Biography, European History, History, Italian History, Political Theory, Rule of Law

Solitary Confrontations

Glenn Frankel, High Noon:

The Hollywood Blacklist and the Making of An American Classic 

            High Noon remains one of Hollywood’s most enduringly popular movies. The term “High Noon” is now part of our everyday language, meaning a “time of a decisive confrontation or contest,” usually between good and evil, in which good is often embodied in a solitary person.  High Noon is a fairly simple story, yet filled with tension.  The film takes place in the small western town of Hadleyville.  Former marshal Will Kane, played by Gary Cooper, is preparing to leave town with his new bride, Amy Fowler, played by Grace Kelly, when he learns that notorious criminal Frank Miller, whom Kane had helped send to jail, has been set free and is arriving with his cronies on the noon train to take revenge on the marshal.  Amy, a devout Quaker and a pacifist, urges her husband to leave town before Miller arrives, but Kane’s sense of duty and honor compels him to stay. As he seeks deputies and assistance among the townspeople, Kane is rebuffed at each turn, leaving him alone to face Miller and his gang in a fatal gunfight at the film’s end.

          High Noon came to the screen in 1952 at the height of Hollywood’s anti-communist campaign, best known for its practice of blacklisting, by which actors, writers, directors, producers, and others in the film industry could be denied employment based upon past or present membership in or sympathy for the American Communist Party.  Developed and administered by film industry organizations and luminaries, among them Cecil B. DeMille, John Wayne and future American president Ronald Reagan, blacklisting arose during the early Cold War years as Hollywood’s response to the work of the United States House of Representatives’ Committee on Un-American Activities, better known as HUAC.

            Until surpassed by Senator Joseph McCarthy, HUAC was the driving force in post-World War II America’s campaign to uproot communists and communist sympathizers from all aspects of public life.  The Committee exerted pressure on Hollywood personnel with suspected communist ties or sympathies to avoid the blacklist by “cooperating” with the Committee, which entailed in particular “naming names” – identifying other party members or sympathizers.  Hollywood blacklisting had all the indicia of what we might today call a “witch hunt.” Blacklisting also came close to curtailing High Noon altogether.

         Glenn Frankel’s engrossing, thoroughly researched High Noon: The Hollywood Blacklist and the Making of An American Classic captures the link between the film classic and Hollywood’s efforts to purge its ranks of present and former communists and sympathizers. Frankel places the anti-communist HUAC investigations and the Hollywood blacklisting campaign within the larger context of a resurgence of American political conservatism after World War II – a “right wing backlash” (p.45) — with the political right struggling to regain the upper hand after twelve years of New Deal politics at home and an alliance with the Soviet Union to defeat Nazi Germany during World War II.  There was a feeling then, as today, Frankel explains, that usurpers had stolen the country: “outsiders had taken control of the nation’s civil institutions and culture and were plotting to subvert its security and values” (p.x).   The usurpers of the post-World War II era were liberals, Jews and communists, and “self-appointed guardians of American values were determined to claw it back” (p.x).

          Hollywood, with its “extraordinarily high profile” and “abiding role in our national culture and fantasies” (p.xi), was uniquely placed to shape American values and, to many, communists and Jews seemed to be doing an inordinate amount of the shaping.  In an industry that employed nearly 30,000 persons, genuine communists in Hollywood probably never exceeded 350, roughly half of them screenwriters.  But 175 screenwriters, unless thwarted, could freely produce what right-wing politicians termed “propaganda pictures” designed to undermine American values.  Communists constituted a particularly insidious threat because they looked and sounded indistinguishable from others in the industry, yet were “agents of a ruthless foreign power whose declared goal was to destroy the American way of life” (p.x).  That a high proportion of Hollywood’s communists were Jewish heightened suspicion of the Jews who, from Hollywood’s earliest days as the center of the film industry, had played an outsized role as studio heads, screenwriters, and agents.  Jews in Hollywood were at once “uniquely powerful” and “uniquely vulnerable” to the attacks of anti-Semites who accused them of “using the movies to undermine traditional American values” (p.13).

            Frankel’s account of this struggle over security and values involves a multitude of individuals, primarily in Hollywood and secondarily in Washington, but centers upon the interaction among three: Gary Cooper, Stanley Kramer, and Carl Foreman.  Cooper was the star of High Noon and Kramer its producer.   Foreman wrote the script and was the film’s associate producer until his refusal in September 1951 to name names before HUAC forced him to leave High Noon before its completion.  Foreman and Kramer, leftist-leaning politically, were “fast-talking urban intellectuals from the Jewish ghettos of Chicago [Foreman] and New York [Kramer]” (p.xvi).  Foreman had been a member of the American Communist Party as a young adult in the 1930s until sometime in the immediate post-war years; Kramer’s relationship to the party is unclear in Frankel’s account.  Cooper was a distinct contrast to Foreman and Kramer in about every respect, a “tall, elegant, and reticent” (p.xvi) Anglo-Saxon Protestant from rural Montana, the product of conservative Republican stock who liked to keep a low profile when it came to politics.

            Although Cooper was the star of High Noon, Foreman emerges as the star in Frankel’s examination of HUAC investigations and blacklisting. Foreman saw his encounter with HUAC in terms similar to those Cooper, as Will Kane, faced in Hadleyville: he was the marshal, HUAC seemed like the gunmen coming to kill the marshal, and the “hypocritical and cowardly citizens of Hadleyville” found their counterparts in the “denizens of Hollywood who stood by passively or betrayed him as the forces of repression bore down” (p.xiii).  The filming of High Noon had begun a few days prior to Foreman’s testimony before HUAC and was completed in just 32 days, on what amounted to a shoestring budget of $790,000.  How the 84-minute black-and-white film survived Foreman’s departure constitutes a mini-drama within Frankel’s often gripping narrative.

* * *

         In most accounts, Hollywood’s blacklisting practices began in 1947, when ten writers and directors — the “Hollywood Ten” — appeared before HUAC and refused to answer the committee’s questions about their membership in the Communist Party.  They were cited for contempt of Congress and served time in prison.  After their testimony, a group of studio executives, acting under the aegis of the Association of Motion Picture Producers, fired the ten and issued what came to be known as the Waldorf Statement, which committed the studios to firing anyone with ties to the Communist Party, whether called to testify before HUAC or not.  This commitment in practice extended well beyond party members to anyone who refused to “cooperate” with HUAC.

           Neither Foreman nor Kramer was within HUAC’s sights in 1947.  At the time, the two had banded together in the small, independent Stanley Kramer Production Company, specializing in socially relevant films that aimed to attract “war-hardened young audiences who were tired of the slick, superficial entertainments the big Hollywood studios specialized in and [were] hungry for something more meaningful” (p.59).  In March 1951, Kramer Production became a subsidiary of Columbia Pictures, one of Hollywood’s major studios.   In June of that year, while finishing the script for High Noon, Foreman received his subpoena to testify before HUAC.  The subpoena was an “invitation to an inquisition” (p.xii), as Frankel puts it.

           HUAC, in the words of writer David Halberstam, was a collection of “bigots, racists, reactionaries and sheer buffoons” (p.76). The Committee acted as judge, jury and prosecutor, with little concern for basic civil liberties such as the right of the accused to call witnesses or cross-examine the accuser.  Witnesses willing to cooperate with the Committee were required to undergo a “ritual of humiliation and purification” (p.xii), renouncing their membership in the Communist Party and praising the Committee for its devotion to combating the Red plot to destroy America.  A “defining part of the process” (p.xiii) entailed identifying other party members or sympathizers – the infamous “naming of names” — which became an end in itself for HUAC, not merely a means to obtain more information, since the Committee already had the names of most party members and sympathizers working in Hollywood.  Foreman was brought to the Committee’s attention by Martin Berkeley, an obscure screenwriter and ex-Communist who emerges as one of the book’s more villainous characters — Hollywood’s “champion namer of names” (p.241).

           Loath to name names, Foreman had few good options.  The primary alternative was to invoke the Fifth Amendment against self-incrimination and refuse to answer questions. But such witnesses appeared to have something to hide, and often were blacklisted for failure to cooperate with the Committee.  When he testified before HUAC in September 1951, Foreman stressed that he loved his country as much as anyone on the Committee and used his military service during World War II to demonstrate his commitment to the United States.  But he would go no further, refusing to name names.  Foreman conceded for the record that he “wasn’t a Communist now, and hadn’t been one in 1950 when he signed the Screen Writers Guild loyalty oath” (p.201).  The Committee did not hold Foreman in contempt, as it had done with the Hollywood Ten.  But it didn’t take Foreman long to feel the consequences of his refusal to “cooperate.”

           Kramer, who had initially been supportive of Foreman, perhaps out of concern that Foreman might name him as one with communist ties, ended by acceding to Columbia Pictures’ position that Foreman was too tainted to continue to work for its subsidiary.  Foreman left Kramer Production with a lucrative separation package, more than any other blacklisted screenwriter. His attempt to start his own film production company went nowhere when it became clear that anyone working for the company would be blacklisted.  Foreman, a “committed movie guy” who “passionately believed in [films] as the most successful and popular art form ever invented” (p.218), was finished in Hollywood.  He and Kramer never spoke again.

* * *

            Kramer had had little direct involvement with the early shooting of High Noon. But after Foreman’s departure, he reviewed the film and was deeply dismayed by what he saw.  He responded by making substantial cuts, which he later claimed had “saved” the film.  But in Frankel’s account, Cooper rather than Kramer saved High Noon, making the film an unexpected success.  Prior to his departure, Foreman had suggested to Cooper, who was working for a fraction of his normal fee, that he consider withdrawing from High Noon to preserve his reputation.  Cooper refused. “You know how I feel about Communism,” Frankel quotes Cooper telling Foreman, “but you’re not a Communist now and anyhow I like you, I think you’re an honest man, and I think you should do what is right” (p.170-71).

            Kramer and Foreman were initially reluctant to consider Cooper for the lead role in High Noon.  At age fifty, he “looked at least ten years too old to play the marshal.  And Cooper was exactly the kind of big studio celebrity actor that both men tended to deprecate” (p.150).  Yet Cooper’s “carefully controlled performance,” combining strength and vulnerability, gave not only his character but the entire picture “plausibility, intimacy and human scale” (p.252), Frankel writes.  Will Kane is “no superhuman action hero, just an aging, tired man seeking to escape his predicament with his life and his new marriage intact, yet knowing he cannot . . . It is a brave performance, and it is hard to imagine any other actor pulling it off with the same skill and grace” (p.252).  None of the “gifted young bucks” whom Kramer and Foreman would likely have preferred for the lead role, such as Marlon Brando, William Holden, or Kirk Douglas, could have done it with “such convincing authenticity, despite all their talent.  In High Noon, Gary Cooper is indeed the truth” (p.252).

            High Noon also established Cooper’s co-star, Grace Kelly, playing Marshal Kane’s new wife Amy in her first major film.  Kelly was some 30 years younger than Cooper and many, including Kramer, considered the pairing a mismatch. But she came cheap and the pairing worked. Katy Jurado, a star in her native Mexico, played the other woman in the film, Helen Ramirez, who had been the girlfriend of both Marshal Kane and his adversary Miller.  During the film, she is involved romantically with Kane’s feckless deputy, Harvey Pell, played by Lloyd Bridges.  High Noon was only Jurado’s second American film, but she was perfect in the role of a sultry Mexican woman.  By design, Foreman created a dichotomy between the film’s male hero — a man of “standard masculine characteristics, inarticulate, stubborn, adept at and reliant on gun violence” (p.253) — and its two women characters who do not fit the conventional models that Western films usually impose on female characters.  The film’s “sensitive focus on Helen and Amy – remarkable for its era and genre – is one of the elements that make it an extraordinary movie” (p.255), Frankel contends.

           Frankel pays almost as much attention to the movie’s stirring theme song, “Do Not Forsake Me, Oh My Darling,” sung by Tex Ritter, as he does to the film’s characters.  The musical score was primarily the work of Dimitri Tiomkin, a Jewish immigrant from the Ukraine, with lyricist Ned Washington providing the words.  The pair produced a song that could be “sung, whistled, and played by the orchestra all the way through the film, an innovative approach that had rarely been used in movies before” (p.230). Ritter’s raspy voice proved ideally suited to the song’s role of building tension in the film (the better-known Frankie Laine recorded a “more synthetic and melodramatic” (p.234) version that surpassed Ritter’s in sales).  The song’s narrator is Kane himself, addressing his new bride and expressing his fears and longings in music.  The song, whose melody is played at least 12 times during the movie, encapsulates the plot while explaining the marshal’s “inner conflict in a way that he himself cannot articulate” (p.232). Its repetition throughout the film reminds us that Kane’s life and happiness are “on the line, yet he cannot walk away from his duty” (p.250).

           Frankel also dwells on the use of clocks in the film to heighten tension as 12 o’clock, high noon, approaches.  The clocks, which become bigger as the film progresses, “constantly remind us that time is running out for our hero.  They help build and underscore the tension and anxiety of his fruitless search for support.  There are no dissolves in High Noon – none of the usual fade-ins and fade-outs connoting the unseen passage of time – because time passes directly in front of us.  Every minute counts – and is counted” (p.250).

           High Noon was an instant success when it came out in the summer of 1952, an “austere and unusual piece of entertainment,” as Frankel describes the film, “modest, terse, almost dour . . . with no grand vistas, no cattle drives, and no Indian attacks, in fact no gunplay whatsoever until its final showdown.  Yet its taut, powerful storytelling, gritty visual beauty, suspenseful use of time, evocative music, and understated ensemble acting made it enormously compelling” (p.249).   But the film was less popular with critics, many of whom considered it overly dramatic and corny.

          The consensus among the cognoscenti was that the film was “just barely disguised social drama using a Western setting and costumes,” as one critic put it, the “favorite Western for people who hate Westerns” (p.256).  John Wayne argued that Marshal Kane’s pleas for help made him look weak.  Moreover, Wayne didn’t like the negative portrayal of the church people, which he saw as an attack on American values.  The American Legion also attacked the film on the ground that it was infected by the input of communists and communist sympathizers.

* * *

          After leaving the High Noon set, Foreman spent much of the 1950s in London, where he had limited success in the British film industry while his marriage unraveled.  For a while, he lost his American passport, pursuant to a State Department policy of denying passports to anyone it had reason to suspect was Communist or Communist-leaning, making him a man without a country until a court overturned the policy.  Kramer left Columbia Pictures after High Noon.  He went back to being an independent producer and in that capacity established a reputation as Hollywood’s most consistently liberal filmmaker.  To this day, the families of Foreman and Kramer, who died in 1984 and 2001, respectively, continue to spar over which of the two deserves more credit for High Noon’s success.  Cooper continued to make films after High Noon, most of them westerns of middling quality, “playing the same role over and over” (p.289) as he aged and his mobility grew more restricted.  He kept in touch with Foreman up until his own death from prostate cancer in 1961.

* * *

         Frankel returns at the end of his work to Foreman’s view of High Noon as an allegory for the Hollywood blacklisting process — a single man seeking to preserve honor and confront evil alone when everyone around him wants to cut and run. But, Frankel argues, seen on the screen at a distance of more than sixty years, the film’s politics are “almost illegible.” Some critics, he notes, have suggested that Kane, rather than being a brave opponent of the blacklist, could “just as readily be seen as Senator Joseph McCarthy bravely taking on the evil forces of Communism while exposing the cowardice and hypocrisy of the Washington establishment” (p.259).  Sometimes a good movie is just a good movie.

Thomas H. Peebles

La Châtaigneraie, France

October 3, 2018

Filed under Film, Politics, United States History

Minding Our Public Language

Mark Thompson, Enough Said:

What’s Gone Wrong With the Language of Politics 

          In Enough Said: What’s Gone Wrong with the Language of Politics, Mark Thompson examines the role which “public language” — the language we use “when we discuss politics and policy, or make our case in court, or try to persuade anyone of anything else in a public context” (p.2) — plays in today’s cacophonous political debates.  Thompson, currently Chief Executive Officer of The New York Times and before that Director-General of the BBC, contends that there is a crisis in contemporary democratic decision-making that is at heart a crisis of political language.  Public language appears to be losing its power to explain and engage, thereby threatening the bond between people and politicians. “Intolerance and illiberalism are on the rise almost everywhere,” Thompson writes, and the way our public language has changed is an “important contributing and exacerbating factor” (p.297-98).

          Thompson seeks to revive the formal study of rhetoric as a means to understand and even reverse the contemporary crisis of public language.  Rhetoric is simply the “study of the theory and practice of public language” (p.2).  Rhetoric “helps us to make sense of the world and to share that understanding. It also teaches us to ‘pay heed’ to the ‘opposite side,’ the other” (p.361). Democracies need public debate and therefore competition in the mastery of public persuasion. Rhetoric, the language of explanation and persuasion, enables collective decision-making to take place.

        Across the book’s disparate parts, Thompson’s central concern is today’s angry and polarized political climate, often referred to as “populist,” in which the word “compromise” has become pejorative, the adjective “uncompromising” is a compliment, and the “public presumption of good faith between opposing parties and factions” (p.97) seems to have largely evaporated.  Thompson recognizes that the current populist wave is founded upon a severe distrust of elites.  Given his highest-of-high-level positions at the BBC and The New York Times (along with a degree from Oxford University), Thompson is about as elite as one can become.  He thus observes from the top of a besieged citadel.  Unsurprisingly, Thompson brings a well-informed Anglo-American perspective to his observations, and he shines in pointing to commonalities as well as differences between Great Britain and the United States. There are occasional glances at continental Europe and elsewhere – Silvio Berlusconi’s rhetorical skills are examined, for example – but for the most part this is an analysis of public language at work in contemporary Britain and the United States.

          In the book’s first half, Thompson uses the terminology of classical rhetoric to frame an examination of what he considers the root causes of today’s crisis in public language. Among them are the impact of social media on political discourse and how the pervasive use of sales and marketing language has devalued public debate.  Social media platforms such as Facebook and Twitter have given rise to a “Darwinian natural selection of words and phrases,” he writes, in which, “by definition, the only kind of language that emerges from this process is language that works. You hear it, you get it, you pass it on. The art of persuasion, once the grandest of the humanities and accessible at its highest level only to those of genius – a Demosthenes or a Cicero, a Lincoln or a Churchill – is acquiring many of the attributes of a computational science. Rhetoric not as art, but as algorithm” (p.187).  The use of language associated with sales and marketing serves further to give political language “some of the brevity, intensity and urgency we associate with the best marketing,” while stripping away its “explanatory and argumentative power” (p.191).

          In the second half, Thompson shifts away from applying notions of classical rhetoric to public debate and focuses more directly upon the debate itself in three settings: when scientific consensus confronts spurious scientific claims; when claims for tolerance and respect for racial, religious or ethnic minorities seek to override untrammeled freedom of expression; and when, after the unprecedented and still unfathomable devastation of the 20th century’s world wars, leaders seek to take their country into war.  Thompson’s analyses of these situations are lucid and clearheaded, but for all the common sense and good judgment that he brings to them, I found this section more conventional and less original than the book’s first half, and consequently less intriguing.

* * *

       Thompson starts with a compelling example to which he returns throughout the book, involving the once ubiquitous Sarah Palin and her rhetorical attack on the Affordable Care Act (ACA), better known as Obamacare. Before the ACA was signed into law, one Elizabeth McCaughey, an analyst with the Manhattan Institute, a conservative think tank, looked at a single clause among the 1,000-plus pages of the proposed legislation and drew the conclusion that the act required patients over a certain age to be counseled by a panel of experts on the options available for ending their lives. McCaughey’s conclusion was dead wrong. The clause merely clarified that expenses would be covered for those who desired such counseling, as proponents of the legislation made clear from the outset.

         No matter. Palin grabbed the ball McCaughey had thrown out and ran with it. In one of her most Palinesque moments, the one-term Alaska governor wrote on her Facebook page:

The America I know and love is not one in which my parents or my baby with Down Syndrome will have to stand in front of Obama’s “death panel” so his bureaucrats can decide, based on a subjective judgment of their “level of productivity in society,” whether they are worthy of health care. Such a system is downright evil (p.4-5).

By placing the words “death panel” and “level of productivity in society” in quotation marks, Palin left the impression that she was quoting from the statute itself.  Thus presented, the words conjured up early 20th century eugenics and Nazi doctors at death camps.  To her supporters, Palin had uncovered “nothing less than a conspiracy to murder” (p.7).

        In the terminology of classical rhetoric, “death panel” was an enthymeme, words that might not mean much to a neutral observer but were all that Palin’s supporters needed to “fill in the missing parts of her argument to construct a complete critique of Obamacare” (p.30).   It had the power of compression, perfect for the world of Facebook and Twitter, and the effect of a synecdoche, in which the part stands for the whole.  Its words were prophetic, taking an imagined future scenario and presenting it as current reality.  Palin’s claim was symptomatic of today’s polarized political debate. It achieved its impact “by denying any complexity, conditionality or uncertainty,” building on a presumption of “irredeemable bad faith,” and rejecting “even the possibility of a rational debate” with the statute’s supporters (p.17).

        Thompson considers Palin’s rhetorical approach distinct in key ways from that of Donald Trump.  Writing during the 2016 presidential campaign, Thompson observes that Trump had “rewritten the playbook of American political language” (p.80). Trumpian rhetoric avoids cleverness or sophistication:

There are no cunning mousetraps like the “death panel.” The shocking statements are not couched in witty or allusive language. His campaign slogan – Make America Great Again! – could hardly be less original or artful. Everything is intended to emphasize the break with the despised language of the men and women of the Washington machine. There is a wall between them and you, Trump seems to say to his audience, but I am on this side of the wall alongside you. They treat you as stupid, but you understand things far better than they do. The guarantee that I see the world as you do is the fact that I speak in your language, not theirs (p.79-80).

        Yet Thompson roots both Palin’s populism and that of Trump in a rhetorical approach, dating from the 18th century Enlightenment, termed “authenticism”: a mode of expression that prioritizes emotion and simplicity of language, and purports to engage with the “lowliest members of the chosen community” (p.155).  To the authenticist, if something “feels true, then in some sense it must be true” (p.155).  Since the Enlightenment, authenticism has been in tension with “rhetorical rationalism,” which venerates fact-based arguments and empirical thinking.  Authenticism rises as trust in public leaders declines.   Authenticists take what their rationalist opponents regard as their most egregious failings, “fantasies dressed up as facts, petulance, tribalism, loss of control of one’s own emotions,” and “flip them into strengths.”  Rationalists may consider authenticism “pitifully cruel, impossible to sustain, downright crazy,” but it can be a compelling rhetorical approach for the “right audience in the right season” (p.356).

        Authenticism found the right audience in the right season in Brexit, Britain’s June 2016 referendum vote to leave the European Union, with people voting for Brexit because they were “sick and tired of spin, prevarication and policy jargon” (p.351).   A single-topic referendum such as Brexit, unlike a general election, requires a “minimum level of understanding of the issues and trade-offs involved,” Thompson writes. By this standard, the Brexit referendum should be considered a “disgrace” (p.347).  Those opposing Brexit had little to offer “in the way of positivity to counterbalance the threats; its Tory and Labour leaders seemed scarcely more enthusiastic about Britain’s membership [in] the EU than their opponents.  Their campaign was lackluster and low-energy.  They deserved to lose” (p.347).

        In understanding how classical rhetoric influences public debate, Thompson attaches particular significance to George Orwell’s famous essay “Politics and the English Language,” the “best-known and most influential reflection on public language written in English in the twentieth century” (p.136).  Although Orwell claimed that his main concern in the essay was clarity of language, what he cared most about, Thompson contends, was the “beauty of language . . . Orwell associated beauty of language with clarity, and clarity with the ability of language to express rather than prevent thought and, by so doing, to support truthful and effective political debate” (p.143).  Orwell’s essay thus embodied the “classical understanding of rhetoric,” specifically the “ancient belief that the civic value of a given piece of rhetoric is correlated with its excellence as a piece of expression” (p.143).

* * *

      In the book’s second half, Thompson looks at the public debate over a host of contentious issues that have riveted the United Kingdom and the United States in recent years, beginning with the deference that democratic debate should accord to questionable scientific claims.  So-called climate skeptics, who challenge the overwhelming scientific consensus on anthropogenic global warming, can make what superficially sounds like a compelling case that their views should be entitled to equal time in forums dedicated to the elaboration of public issues, such as those provided by the BBC or The New York Times.  Minority scientific views have themselves frequently evolved into accepted scientific understanding (one 19th century example was the underlying cause of the Irish potato famine, discussed here  in 2014 in my review of John Kelly’s The Graves Are Walking).  Refusal to accord a forum for such views can easily be cast as a “cover up.”

         Thompson shows how members of Britain’s most distinguished scientific body, the Royal Society, once responded to public skepticism over global warming by becoming advocates, presenting the scientific consensus on the need for action in terms unburdened by the caution and caveats that are usually part of scientific explanation, and emphasizing the bad faith of climate change skeptics. The effort largely backfired. The more scientists sound like politicians with an agenda, the “less convincing they are likely to be” (p.211).   The same issue arose when a British medical researcher claimed to have found a connection between autism and measles, mumps and rubella vaccinations. The research was subsequently found to be fraudulent, but not before a handful of celebrities and a few politicians jumped aboard an anti-vaccination movement (including, in the United States, Robert F. Kennedy, Jr., and Donald Trump, when he was more celebrity than politician), with an uncountable number of parents opting not to have their children vaccinated.

       Thompson’s discussion of the boundaries of tolerance and free speech raises a similar issue: to what degree should democratic forums include those whose views are antithetical to democratic norms?  While at the BBC, Thompson needed to decide whether the BBC would invite the British National Party (BNP), which flirted with Holocaust denial but had demonstrated a substantial following at the ballot box, to a broadcast that involved representatives of Britain’s major parties. In the face of strident opposition, Thompson elected to include the BNP representative and explains why here: the public “had the right to see him and listen to him responding to questions put to him by a studio audience itself made up of people like them. They did so and drew their own conclusions” (p.263).

       Thompson also delivers a full-throated rebuke to American universities that have disinvited speakers because students objected to their views.  The way to defeat extremists and their defenders, whether in faculty lounges or the halls of power, is simply to out-argue them, he contends.  Freedom of expression is best considered a right to be enjoyed “not just by those with something public to say but by everyone” (p.262-63), as a means by which an audience can seek to reach its own judgment. With a few exceptions like child pornography or incitement to violence, Thompson finds no support for the notion that suppressing ideas of which we disapprove is a better way to defeat them in a modern democracy than confronting and debating them in public.

       In a chapter entitled simply “War,” Thompson argues that war is today the greatest rhetorical test for a political leader:

To convince a country to go to war, or to rally a people’s courage and optimism during the course of that war, depends on your ability to persuade those who are listening to you to risk sacrificing themselves and their children for some wider public purpose. It is words against life and limb. [It includes the] need for length and detail as you explain the justification of the war; the simultaneous need for brevity and emotional impact; authenticity, rationality, authority; the search for a persuasiveness that does not – cannot — sound anything like marketing given the blood and treasure that are at stake (p.219).

        Today, it is almost impossible for any war to be well received in a democracy, except in the very short term.  This is undoubtedly an advance over the days when war was celebrated for its gallantry and chivalry. But, drawing upon the opposition to the Vietnam War in the United States in the 1960s, and to Britain’s decision to join the United States in the second Iraq war in 2003, Thompson faults anti-war rhetoric for its tendency to assume bad faith almost immediately, to “omit awkward arguments or to downplay unresolved issues, to pretend that difficult choices are easy, to talk straight past the other side in the debate, to oversimplify everything” (p.254-55).

* * *

      Thompson does not see today’s populist wave receding any time soon. “One can believe that populism always fails in the end – because of the crudity of its policies, its unwillingness to do unpopular but necessary things, its underlying divisiveness and intolerance – yet still accept that it will be a political fact of life in many western countries for years to come” (p.363).  He ends by abandoning the measured, “this-too-shall-pass” tone that prevails throughout most of his wide-ranging book to conclude on a near-apocalyptic note.   A storm is gathering, he writes, which threatens to be:

greater than any seen since the global infernos of the twentieth century. If the first premonitory gusts of a global populist storm were enough to blow Britain out of Europe and Donald Trump into the White House, what will the main blasts do? If the foretaste of the economic and social disruption to come was enough to show our public language to be almost wholly wanting in 2016, what will happen when the hurricane arrives? (p.364)

       Is there anything we can do to restore the power of public language to cement the bonds of trust between the public and its leaders?  Can rhetorical rationalists regain the upper hand in public debate? Thompson argues that we need to “put public language at the heart of the teaching of civics . . . We need to teach our children how to parse every kind of public language” (p.322).  Secondary school and university students need to know “how to listen, how to know when someone is trying to manipulate them, how to discriminate between good arguments and bad ones, how to fight their own corner clearly and honestly” (p.366).   This seems like a sensible starting place.  But it may not be sufficient to withstand the hurricane.

Thomas H. Peebles

Bordeaux, France

January 18, 2018

Filed under American Politics, British History, Intellectual History, Language, Politics

Complementary Lives

Thomas Ricks, Churchill & Orwell:

The Fight For Freedom 

       Winston Churchill and George Orwell seem like an unlikely pairing for a dual biography. They were of different generations — Churchill was born in 1874, Orwell was born as Eric Blair in 1903; they pursued different career paths, Churchill as a career politician par excellence, Orwell as a journalist and writer; and there is no record that they ever met.  In Churchill & Orwell: The Fight For Freedom, Thomas Ricks seeks to give a new twist to both men in a work that, in highly condensed form, emphasizes their complementary lives in the 1930s and 1940s.  Ricks, among the foremost contemporary writers on war, with a talent for explaining complex military operations without over-simplifying, contends that Churchill and Orwell “led the way, politically and intellectually, in responding to the twin totalitarian threats of fascism and communism” (p.3).

       Unlike most of their peers, Ricks argues, Churchill and Orwell recognized that the 20th century’s key question was “not who controlled the means of production, as Marx thought, or how the human psyche functioned, as Freud taught, but rather how to preserve the liberty of the individual during an age when the state was becoming powerfully intrusive into private life” (p.3). The legacies of the two men were also complementary: Churchill’s wartime leadership “gave us the liberty we enjoy now. Orwell’s writing about liberty affects how we think about it now” (p.5).

        Churchill and Orwell further shared an uncommon facility with language: each was able to articulate the challenges which 20th century democracy faced in robust, unflinching English prose.  Churchill was “intoxicated by language, reveling in the nuances and sounds of words” (p.11).  Orwell added several words and expressions to the English language, such as “doublethink” and “Big Brother,” and had a distinct style in examining politics and culture that has become the “accepted manner of modern discussion of such issues” (p.262).

             Ricks identifies additional commonalities in the two men’s backgrounds.  Each had a privileged upbringing.  Churchill was a descendant of the Dukes of Marlborough. His father, Lord Randolph Churchill, was a prominent Conservative Party Member of Parliament.  Orwell’s father was a high-level civil servant in India, where Orwell was born.  Neither felt close to his father.  Both attended “public schools,” upper-class boarding schools, with Churchill’s father telling young Winston that he was just another of the “public school failures” (p.9).  Although Orwell once described his background as “lower upper middle class,” he attended Eton, England’s uppermost public school.  Each had experience in Britain’s far-flung empire: Orwell spent a formative period in the 1920s in Burma as a policeman; Churchill had youthful adventures in India and the Sudan and served as a war correspondent in South Africa during the Boer War, 1899-1902.  Orwell too had a brief stint as a war correspondent during the Spanish Civil War, 1936-39.

             There is even a mirror-image similarity to the two men’s situations in the 1930s. Churchill was a man of the political right who was never fully trusted by his fellow conservatives, and had a nearly complete fallout with the Conservative Party over appeasement of Hitler in the late 1930s.  Orwell was a conventional left-wing socialist until his experiences in the Spanish Civil War opened his eyes to the brutality and dogmatism that could be found on the political left. But their career trajectories moved in opposite directions during World War II and its aftermath. Churchill came off the political sidelines in the 1930s to peak as an inspirational politician and war leader in 1940 and 1941.  Thereafter, Ricks argues, he went into a downward slide that never reversed itself.  Orwell remained an obscure, mid-level writer throughout World War II.  His career took off only after publication of his anti-Soviet parable Animal Farm in 1945, followed four years later by his dystopian classic, 1984.  Orwell’s reputation as a seminal writer, Ricks emphasizes, was established mostly posthumously, after his death from tuberculosis at age 47 in 1950.

          But while Churchill and Orwell recognized the threat that totalitarian systems posed, their political visions overlapped only partially.  The need to preserve the British Empire animated Churchill both during and after World War II, whereas Orwell found the notion of colonization abhorrent.  Orwell’s apprehension that powerfully intrusive states could also arise in the West most likely intrigued Churchill but did not consume him.  As long as Britain stayed out of Stalin’s clutches, it is unlikely that Churchill fretted much about it evolving into the bleak, all-controlling state Orwell described in 1984.  Ricks’ formulation of the common denominator of their political vision – the need to preserve individual liberty in the face of powerful state intrusions into private life – applies aptly to Orwell, but it seems less apt as applied to Churchill.

* * *

          Ricks’ dual biographical narrative begins to gather momentum with the 1930s, years that were “horrible in many ways.”  With communism and fascism on the rise in Europe, and an economic depression spreading across the globe, there was a “growing sense that a new Dark Age was at hand” (p.45).  But for Churchill, the 1930s constituted what he termed his “wilderness years,” which he spent mostly on the political sidelines.  By this time, he was considered something of a crank within Conservative Party circles, “flighty, with more energy than judgment, immovable in his views but loose in party loyalties” (p.54).  He had spent much of the 1920s railing against the threat that Indian independence and the Soviet Union posed to Britain.  In the 1930s he targeted an even more ominous menace: Adolf Hitler, whose Nazi party came to power in Germany in 1933.  One reason that Churchill’s foreboding speeches on Germany were greeted with skepticism, Ricks notes, was that he had been “equally intense about the dangers of Indian independence” (p.47).

      Churchill’s fulminations against the Nazi regime were not what fellow Conservative Party members wanted to hear.  Many British conservatives regarded Nazi Germany as a needed bulwark against the Bolshevik menace emanating from Moscow.  Churchill’s rupture with the Conservative Party hierarchy seemed complete after the 1938 Munich accords, engineered by Conservative Prime Minister Neville Chamberlain, which dismembered the democratic state of Czechoslovakia.  For Churchill, Munich was a “disaster of the first magnitude . . . the beginning of the reckoning” (p.60).  He issued what Ricks terms an “almost Biblical” warning about the consequences of Munich: “This is only the first sip, the first foretaste of a bitter cup which will be proffered to us year by year unless by a supreme recovery of moral health and martial vigor, we arise again and take our stand for freedom as in the olden time” (p.60).

            Orwell in the 1930s, still using his birth name Eric Blair for many purposes, was a “writer [and] minor author of mediocre novels that had not been selling well” (p.2-3).  Yet he had already discovered what Ricks terms his “core theme,” the abuse of power, a thread that “runs throughout all his writings, from his early works to the very end” (p.23).  When civil war broke out in Spain in 1936, Orwell volunteered to fight for the Republican side against Franco’s Nationalist uprising. What Orwell saw during his seven months in Spain “would inform all his subsequent work,” Ricks writes. “There is a direct line from the streets of Barcelona in 1937 to the torture chambers of 1984” (p.65).

         Orwell joined a unit known by the Spanish acronym POUM, Partido Obrero de Unificación Marxista, the Workers Party of Unified Marxism, which Ricks describes as a “far-left splinter group. . . vaguely Trotskyite,” politically most distinctive for being anti-Stalinist and thus “anathema to the Soviet-controlled Communist Party in Spain” (p.67).  The NKVD, the Soviet intelligence agency deeply involved in Spain during the Civil War, targeted POUM for liquidation.  “When the crackdown on POUM came in the spring of 1937,” Ricks writes, “Orwell and his fellows would become marked men” (p.68).

          Orwell almost died in May 1937 when he was shot in the neck while fighting Franco’s insurgents at the Aragon front.  He was evacuated to Britain to recuperate.  While he was in Britain, the Spanish Communist Party officially charged Orwell and his wife with spying and treason.  During his recuperation, Orwell wrote Homage to Catalonia, his most noteworthy book to date, in which he hammered two main points: “The first is that Soviet-dominated communism should not be trusted by other leftists. The second is that the left can be every bit as accepting of lies as the right” (p.76).  Orwell “went to Spain to fight fascism,” Ricks writes, “but instead wound up being hunted by communists. This is the central fact of his experience of the Spanish Civil War, and indeed it is the key fact of his entire life” (p.44).  In Spain, Orwell “developed his political vision and with it the determination to criticize right and left with equal vigor” (p.77).

          The Soviet Union’s non-aggression pact with Germany, executed in August 1939, in which the two powers agreed to divide much of Eastern Europe between them, was a “final moment of clarity” for Orwell. “From this point on, his target was the abuse of power in all its forms, but especially by the totalitarian state, whether left or right” (p.82).  The pact “had the effect on Orwell that the Munich Agreement had on Churchill eleven months earlier, confirming his fears and making him all the more determined to follow the dissident political course he was on, in defiance of his mainstream leftist comrades” (p.81).

          Churchill, in Ricks’ interpretation, peaked in the period beginning in May 1940, when he became Britain’s Prime Minister at a time when Britain stood alone in Europe as the only force fighting Nazi tyranny.  “These were the months in which Churchill became England’s symbolic rallying point” (p.110).  In June 1941, Hitler invaded the Soviet Union and, suddenly, Churchill’s nemesis from the 1920s was Britain’s ally.  “Any man or state who fights on against Nazism will have our aid,” Churchill told the British public in a radio broadcast.  “It follows, therefore, that we shall give whatever help we can to Russia and the Russian people” (p.142-43).  When Japan bombed Pearl Harbor and Hitler declared war on the United States in December 1941, just as suddenly Churchill had a second powerful ally.

           In a chapter on the fraught months between May 1940 and December 1941, entitled “Fighting the Germans, Reaching Out to the Americans,” Ricks analyzes Churchill’s speeches as Prime Minister, still “good reading seventy-five years after their delivery” (p.110).  He gives particular attention to Churchill’s speech to the United States Congress in late December 1941, in which the Prime Minister presented his vision of the Anglo-American wartime partnership to the representatives of his new ally.  The address was what Ricks describes as a rhetorical “work of political genius . . . more than a speech, it was the diplomatic equivalent of a marriage proposal” (p.149-51).  But with that speech, Ricks argues, Churchill’s best days were already behind him.

            The 1943 meeting in Tehran between Churchill, Roosevelt and Stalin was a turning point for Churchill, the “first time Roosevelt began to act as if he held the senior role in the partnership. It was in Iran that Churchill realized that his dream of dominating a long-term Anglo-American alliance would not come to fruition” (p.169).  Churchill flew out of Tehran “in a black mood, anguished by the passing of British supremacy in the world. After that conference, his personality seemed to change. The dynamo of 1940 became the sluggard of 1944 – increasingly forgetful, less eloquent, and often terribly tired, napping more often and sleeping in late many mornings” (p.171).  Churchill was “off his game at the end of the war and after. The plain facts of British decline were becoming harder to ignore. Churchill’s oratory of this period ‘seemed in danger of degenerating into mere windy bombast’” (p.220), Ricks writes, quoting historian Simon Schama.

          As World War II loomed, Orwell was “seen as a minor and somewhat cranky writer” (p.82), now out of favor with many of his former allies on the political left.  He was not able to enlist in the army because of ill health.  Yet World War II “energized” him as a writer.  The war “seemed to knock fiction writing out of Orwell for several years,” but “[i]n 1940 alone he produced more than one hundred pieces of journalism – articles, essays, and reviews” (p.127).  His writings showed consistently strong support for Churchill’s war leadership — Churchill was the “only Conservative Orwell seems to have admired” (p.129).

           Orwell joined the BBC’s Overseas Service in August 1941. “There, for more than two years, working on broadcasts to India, he engaged in the kind of propaganda that he spent much of his writing life denouncing,” putting himself “in an occupation that ran deeply against his grain” (p.143).  Orwell’s tenure at the BBC “intensified his distrust of state control of information” (p.145). During the war years, Orwell began work on Animal Farm, published in 1945 as the war ended.

           Animal Farm is a tale of “political violence and betrayal of ideals” (p.176), in which the pigs lead the other farm animals in a revolt against their human masters, only to become enslavers themselves.  In Animal Farm, the pigs “steadily revise the rules of the farm to their own advantage, and along with it their accounts of the history of the farm.”  A single sentence from the book — “All animals are equal, but some are more equal than others” — may be Orwell’s most lasting contribution to modern thought about totalitarianism.  Animal Farm foreshadows the concern that dominated 1984: that controlling the past, as well as the present and future, was an “essential aspect of total state control” (p.178-79).

        Orwell was dying of tuberculosis, with just seven months to live, when 1984 was published in June 1949 (Orwell apparently chose his title by reversing the digits “4” and “8” of 1948, the year he finished the work).  The 1943 Tehran conference influenced the world that Orwell described in 1984, consisting of three totalitarian super states, Oceania, Eastasia, and Eurasia, with England reduced to “Airstrip One.”  The novel’s hero is a “miserable middle-aged Englishman” (p.225) named Winston Smith.  It is unclear whether Orwell’s selection of the name had any relationship to Churchill; Ricks points out that Winston Smith’s life in England bore far more similarities to Orwell’s life than to Churchill’s.

           Smith’s world is one of universal surveillance, where the state’s watchword is “Big Brother is Watching You,” and the ruling party’s slogans are “War is Peace,” “Freedom is Slavery,” and “Ignorance is Strength.”  Objective reality “does not exist or at least is deemed to be illegal by the all-seeing state” (p.226).  Smith’s most significant act is “simply to observe accurately the world around him. Collecting facts is a revolutionary act. Insisting on the right to do so is perhaps the most subversive action possible” (p.226-27).  At a time when Churchill was warning the post-war world that the Soviet Union had erected an Iron Curtain across Europe, 1984 was driven by Orwell’s concern that powerful states on both sides of the curtain would not only forbid people to express certain thoughts but would also tell them what to think.

          The immediate reaction to both Animal Farm and 1984 was middling at best.  It was not until after Orwell’s death in 1950 that the two works attracted worldwide attention and made the former Eric Blair a household name.  How Orwell’s reputation took off after his death constitutes a major portion of Ricks’ treatment of Orwell.  Based upon the references, allusions, and tributes appearing daily in media around the world, Ricks concludes that Orwell is a “contemporary figure in our culture. In recent years, he may even have passed Churchill, not in terms of historical significance but of current influence. It has been one of the most extraordinary posthumous performances in British literary history” (p.245).

         While Orwell in 1984 “looked forward with horror,” Churchill spent the post-war years working on his war memoirs, “looking back in triumph” (p.221).  Ricks provides an extensive analysis of those memoirs.  Orwell’s last published article was a review of Their Finest Hour, the second volume of Churchill’s war memoirs.  Orwell concluded his review by describing Churchill’s writings as “more like those of a human being than of a public figure” (p.233), high praise from the dying man.  There is no indication that Churchill ever read Animal Farm, but he may have read 1984 twice.

* * *

          The Fight for Freedom is not a dual biography built on parallelism between the two men’s lives, unlike Alan Bullock’s masterful Hitler and Stalin: Parallel Lives.  Nor is there quite the parallelism in Churchill and Orwell’s political visions that Ricks assumes.  Other factors add a strained quality to The Fight for Freedom.  Numerous digressions fit awkwardly into the narrative: e.g., Margaret Thatcher as “Churchill’s rightful political heir” (p.142); Tony Blair trying to be Churchillian as he took the country into the Iraq war; Martin Luther King forcing Americans to confront the realities of racial discrimination; and Keith Richards defending his dissipated lifestyle by pointing to Churchill’s fondness for alcohol.  There is also a heavy reliance upon other writers’ assessments of the two men.  The text thus reads at points like a Ph.D. dissertation or college term paper, with a “cut and paste” feel.  Then there are the many Orwell quotations that, Ricks tells us, could have been written by Churchill, and Churchill quotations that could have come from Orwell’s pen.  All this suggests that the threads linking the two men may be too thin to be stretched into a coherent narrative, even by a writer as skilled as Thomas Ricks.

Thomas H. Peebles

La Châtaigneraie, France

November 11, 2017


