Is Democracy a Universal Value?


Larry Diamond, Ill Winds:

Saving Democracy from Russian Rage, Chinese Ambition, and American Complacency (Penguin Press) 

Stanford professor Larry Diamond is one of America’s foremost authorities on democracy – what it is, how it works in diverse countries throughout the world, how it can take hold in countries with little or no history of democratic governance – and how it can be lost.  Diamond brings a decidedly pragmatic perspective to his subject.  His extensive writings focus in particular on how to sustain fragile democratic governance.  He rarely dwells on classical theory or delves into the origins of democracy.  He is more likely to provide an assessment of the prospects for democracy in contemporary Nicaragua, Nigeria or Nepal, or most anywhere in between, than to assess the contribution to modern democracy of, say, Thomas Hobbes or Jean-Jacques Rousseau.  In the two decades following the fall of the Berlin Wall and the demise of the Soviet Union, Diamond’s bottom line seemed to be that democracy had the upper hand in most corners of the world – the Middle East being at best a giant question mark – and was steadily extending to numerous countries that had hitherto been considered unlikely places for it to take hold.

That was then.  Today, Diamond says that he is more concerned about the future of democracy than at any time in the forty-plus years of his career.  He begins Ill Winds: Saving Democracy from Russian Rage, Chinese Ambition, and American Complacency, a distinctly more guarded assessment of democratic prospects across the globe than his earlier writings, by noting that the march toward democracy began to slow around 2006.  The independent Freedom House, which tracks democratic progress worldwide, found that 2017 was the twelfth consecutive year in which the number of countries declining in liberty significantly outstripped the number gaining.

Rather than democracy, it is now authoritarian government — sometimes termed “illiberal democracy” and often associated with nativist, xenophobic “populism” — that seems to be on the rise across the globe.  Throughout much of the world, Diamond notes, authoritarian governments and their autocratic leaders are “seizing the initiative, democrats are on the defensive, and the space for competitive politics and free expression is shrinking” (p.11).  Today’s world has “plunged into a democratic recession” (p.54), with democracy finding itself “perched on a global precipice.”  If authoritarian ascendancy and democratic erosion continue, Diamond warns, we may reach a “tipping point where democracy goes bankrupt suddenly – plunging the world into depths of oppression and aggression that we have not seen since the end of World War II” (p.293).

Diamond’s subtitle reveals that the “ill winds” of his title are blowing chiefly from a Russia rife with “rage,” and a China abounding in “ambition,” while the United States stands by “complacently” rather than blowing in the opposite direction, as it once did.  If the United States does not reclaim its traditional place as the keystone of democracy, Vladimir Putin of Russia, Xi Jinping of China, and their admirers “may turn autocracy into the driving force of the new century” (p.11).  Emboldened by the “new silence from Donald Trump’s America,” the “new swagger” emanating from Xi’s China and Putin’s Russia has allowed autocrats across the globe to “tyrannize their opponents openly and without apology” (p.58).

Diamond starts his urgent and alarming assessment with general, introductory chapters that provide a working definition of democracy and summarize the present worldwide crisis, for example, “Why Democracies Succeed and Fail,” “The March and Retreat of Democracy,” and “The Authoritarian Temptation.”  He then devotes a chapter to each of his three main actors, the United States, Russia and China.  From there, he moves to a series of recommendations on how established democracies can counter the forces that seem to be leading many countries away from democracy and toward authoritarian styles of governance.  His recommendations include combatting public corruption (the “soft underbelly of authoritarian rule,” p.192) and making the Internet safe for democracy (the “global fight for freedom is inseparable from the fight for internet freedom,” p.259).

In a book about the future of global democracy, Diamond’s recommendations are oddly U.S.-centric.  They are mostly about how the United States can promote democracy more effectively abroad and render its internal institutions and practices more democratic.  There is little here about what other established democracies – for example, Great Britain, Germany or Australia — can do to be more effective abroad or more democratic at home.  Diamond moreover breaks little new ground in this work.

Few readers are likely to be surprised to learn that Russia and China constitute the world’s major anti-democratic actors; that Hungary and Poland, both part of the European Union, the quintessential democracy project, are among the most prominent countries moving away from democracy and toward authoritarianism; or that countries otherwise as diverse as Turkey, India, the Philippines and Brazil are moving in the same direction.  Nor does Diamond venture into unfamiliar territory when he argues that the United States under President Donald Trump appears to be more on the side of the authoritarians and populists than of those seeking to institutionalize democracy in their countries.

But Diamond is an accomplished salesman for democratic governance, the product he has relentlessly peddled for over four decades, and his salesmanship skills are on full display here.  Amidst all the reasons he provides for pessimism about democracy’s worldwide prospects, readers will be reassured to find more than a little of the optimism that characterized his earlier works.  Although authoritarians may seem to be on the rise everywhere, people across the globe are not losing their faith in democracy, he argues.  Democracy for Diamond remains nothing less than a “universal value” (p.159).  The world’s democracies quite simply “have the better ideas” (p.225), he writes.  But is modern democracy up to the task of halting and reversing the world’s authoritarian turn?  Is it capable of countering effectively Russian rage and Chinese ambition?  These are the questions Diamond wrestles with throughout this timely and passionately argued work.

* * *

For Diamond, democracy at its core is a system of government where people choose and can change their leaders in regular, free and fair elections.  Such a system should also include strong protections for basic liberties, such as freedom of speech, press and religion; protection for racial and cultural minorities; a robust rule of law and an independent judiciary; trustworthy law enforcement institutions; and a lively civil society.   Diamond says little here about the economic systems of countries seeking to establish and sustain democratic institutions.  But at least since the fall of the Soviet Union in 1991, most democracy experts agree that market economies allowing for free enterprise — along with ample room for state regulation in the public interest — are most compatible with modern democracy.

But sustaining democracy over the longer term depends more on culture than institutions, Diamond argues.  A country’s citizens need to believe in democracy and be “willing to defend it as a way of life” (p.25), in which case the level of economic development and the precise design of institutions matter less. When democracy lacks broad support, it will “always be a fragile reed” (p.25).   And the paramount component of democratic culture is legitimacy, the “resilient and broadly shared belief that democracy is better than any other imaginable form of government.  People must commit to democracy come hell or high water, and stick with it even when the economy tanks, incomes plunge, or politicians misbehave” (p.25).

Democracy is hardly restricted to those economically advanced countries we call “Western” (“Western” and “the West” include not just the countries of Western Europe and North America but also prosperous democratic countries that are not geographically part of the West, such as Japan and New Zealand).  A country does not have to be economically well off to institutionalize democracy, Diamond insists. Many African countries have made earnest starts.  But successful transitions to democracy nonetheless remain strongly linked to economic prosperity, he argues, citing the examples of Greece, Spain, Chile, South Korea, Taiwan and South Africa.

But Russia and China are undermining democracy in all corners of the globe, each blowing its own “ill winds” across the planet.  In Russia’s case, they are the winds of “anger, insecurity, and resentments of a former superpower;” with China, those of “ambitions, swagger, and overreach of a new one” (p.130-31).  Both are investing heavily in efforts to “promote disinformation and covertly subvert democratic norms and institutions” (p.12).   Among today’s foes of democracy, only two leaders, Vladimir Putin and Xi Jinping, have “enough power and ambition to undermine the entire global liberal order” (p.161).

Russia experienced some shallow and tentative moves toward democracy in the 1990s, in the aftermath of the collapse of the Soviet Union.  But since Putin assumed power in 2000, the movement has been almost exclusively in the opposite direction.  Deeply insecure about the legitimacy of his rule, Putin believes that the West is “seeking to encircle Russia and keep it weak” (p.111).  The 2013-14 “Euromaidan Revolution” in Ukraine, which brought down Viktor Yanukovych, a key autocratic partner, infuriated Putin.  The United States had “toppled his closest ally, in a country he regarded as an extension of Russia itself,” as an American journalist put it.  “All that money America had spent on prodemocracy NGOs in Ukraine had paid off” (p.112).

Russia has mastered the use of social media to “stimulate division, increase social and racial unrest, and undermine the self-assurance of the major Western democracies – and work to divide them from one another” (p.112).  Its most dramatic targets were Hillary Clinton and the 2016 U.S. presidential election.  Clinton “would almost certainly have won the Electoral College if there had been no Russian intervention” (p.118), Diamond asserts, although he offers no evidentiary support for this assertion.  In hacking the 2016 US election, Putin succeeded in both of his apparent aims: to “sow division and discord in American democracy . . . [and] to punish Clinton and elect Trump” (p.118).

But the 2016 election was just one instance of Russia’s use of social media disinformation campaigns to undermine liberal democracy.  These campaigns, assaults “on truth itself” and on the “very notion that there can be ‘an objective, verifiable set of facts’” (p.119), often aim to strengthen extremist political forces within established democracies.  They “do not need to – and do not really aim to – persuade democratic publics that Russia’s positions are right, only that a democracy’s government and political leaders cannot be believed or trusted” (p.119).  Russia under Putin has sought to wreak havoc within the European Union, aiming in particular to end the economic sanctions that Europe and the United States imposed on Russia in retaliation for its aggression in Ukraine.  Russia almost certainly provided significant illicit funding to the Brexit campaign, Diamond contends, helping to tip Britain into leaving the European Union, a “major achievement for a Kremlin that has the destruction of European unity as one of its major aims” (p.121).

But Diamond emphasizes that Russia is a declining power whose “malign intentions and nationalist bravado cannot disguise its outstripped economy and shrinking importance to the twenty-first century world” (p.124).  In the long run, the “ambitions of a rising China, not the resentments of a falling Russia” represent the greatest external challenge to global democracy.  Today’s China, still recovering from what many Chinese consider a century of humiliation at the hands of Japan and the West, is the world’s “most dynamic power” (p.144), with global reach and power that will “increasingly and inevitably dwarf Russia’s” (p.124).

China seeks hegemony over all of Asia and the Pacific, Diamond argues.  It also increasingly aspires to challenge the United States for global leadership, “economically, politically, and, some believe, eventually militarily” (p.131).  Its military spending is now second only to that of the United States, and it may catch up with America militarily “sooner than we care to imagine” (p.142-43).  China has already established a claim to global dominance in such transformative technologies as artificial intelligence, robotics, drones, and electric cars.

Manipulating social media massively and aggressively, China is also building a “sweeping surveillance state that aims to assess every digital footprint of every Chinese citizen and then compile each person’s ‘social credit score’” (p.236).  It readily shares its “Orwellian tools” with other autocratic regimes, “threatening an ‘Arab Spring in reverse’ in which digital technology enables ‘state domination and repression at a staggering scale’” (p.237).

China’s foreign aid goes disproportionately to the world’s autocrats, many of whom think that China has developed a secret formula.  While some authoritarian regimes dislike China’s heavy-handed attempts to win influence and gain control — sometimes considered a new form of colonialism — others are lured to China’s side by “money, power, ambition, and simple admiration for its sheer success” (p.144).  In addition to assisting the world’s autocracies and countries that could bend in that direction, China also focuses on influencing the world’s democracies.

Diamond sees China playing a longer and more patient game than Russia in its dealings with the West.  Through media deals, investments, partnership agreements, charitable and political donations, and positions on boards of directors, it is seeking wider and deeper infiltration into what Diamond calls the “vital tissues of democracies” (p.133): publishing houses, entertainment industries, technology companies, universities, think tanks, non-governmental organizations.  Favorable views of China, he notes, now exceed those of the United States in much of the world.

Prior to Donald Trump’s successful 2016 presidential candidacy, Diamond considered the United States uniquely qualified to lead the global resistance to Russian rage and Chinese ambition.  Since Trump became president, however, the United States appears to be more on the side of the authoritarians and populists than of those seeking to institutionalize democracy in their countries – or, at best, on the sidelines while Russia and China seek to extend their influence and undermine democracy.  If there is any upside to the Trump presidency, Diamond notes, it is that it provides a glimpse into the alarming consequences of a world without American leadership and steadfastness, a “far more frightening and dangerous place, with muscular, corrupt dictatorships dominating large swaths of the globe through blatant coercion and covert subversion” (p.287).

Trump’s unremitting insistence that the United States is being cheated by its friends and allies has propelled the country “down the self-defeating path of ‘America alone’” (p.301).  His decision to withdraw the United States from the Trans-Pacific Partnership (TPP), a 2016 twelve-nation Pacific Rim free-trade agreement, “so visionary and so necessary,” constitutes in Diamond’s view the “most grievous self-inflicted wound to America’s global leadership since the creation of the liberal world order after World War II” (p.144).  US withdrawal from the TPP amounted to a “massive gift to authoritarian China and a body blow to democratic aspirations in Southeast Asia” (p.144-45), serving  as a “stunning symbol – and accelerator – of both China’s rise and America’s descent.  As the great democracy that dominated world politics in the twentieth century retreated, the great dictatorship that aims to dominate world politics in the twenty-first could hardly believe its luck” (p.145).

Diamond provides an extensive set of recommendations on how the United States and other advanced democratic countries can deliver more sustainable assistance to aspiring and fragile democracies to counter Russia and China.  Priorities need to be combatting kleptocracy, public corruption, and international money laundering; making the internet safe for democracy; and improving  public diplomacy through  smarter uses of “soft power” to counter Russia and China’s “sharp power.”

Kleptocracy, a recent term now frequently used for high-level state corruption, involves the theft of state resources that could have advanced the public good but instead were diverted for private gain – hospitals and schools that were not built, for example – and by definition constitutes a crime against a country’s citizens.  Kleptocracy depends upon using the international financial system to “move, mask, and secure ill-gotten fortunes across borders,” posing the “single most urgent internal threat to democracy,” a threat that renders fragile democracies “all the more vulnerable to external subversion” (p.184).  Many of the world’s democracies, not least the United States, are complicit in providing refuge for the ill-gotten gains of the world’s kleptocrats.  Global transfers of untraceable funds have enabled a “stunning array of venal dictators and their family members, political allies, and business cronies to acquire property and influence in the West as well as to corrupt democracy and the rule of law within free nations” (p.184).

Diamond’s recommendations for combatting public corruption and international money laundering are for the most part US-oriented (e.g. modernize and strengthen the Foreign Agents Registration Act; empower the Treasury Department’s Financial Crimes Enforcement Network to conduct its own investigations).  But he also offers some general recommendations that all the world’s advanced democracies could and should follow (e.g. end anonymous shell companies and real estate purchases).

Today, moreover, the Internet and related technologies – email, text messaging, photo sharing – have the potential to uncover public corruption, as well as highlight human rights abuses, expose voter fraud, and organize demonstrations.  These technologies played a major role in the protests that brought down Egyptian dictator Hosni Mubarak in 2011, and in those that challenged Iran’s blatantly fraudulent 2009 elections.  But many modern authoritarian regimes – not just Russia and China — have developed sophisticated means to “manipulate, manage, vilify, and amplify public opinion online” (p.234).  Freedom House considers growing state-level manipulation of social media one of the leading causes of the steady eight-year decline in global Internet freedom.  Making the Internet a safe place for democracy requires a “concerted partnership among democratic governments, technology companies, civil-society groups, and individual ‘netizens’” (p.229).

Diamond also provides a set of recommendations for how the United States can fine-tune its own internal democratic mechanisms through, for example, adopting ranked-choice voting, reducing the gerrymandering of legislative districts, and curbing the influence of money in politics — worthy objectives, but markedly out of line with the priorities of the Trump administration and today’s Republican Party.  Looking beyond the Trump administration, however, Diamond argues that the tide of authoritarianism can be reversed.

Few people celebrate authoritarianism as a superior system, “morally or practically” (p.225).  There are no large-scale surveys of public opinion showing a popular groundswell for authoritarianism.  Rather, in surveys from every region of the world, “large to overwhelming majorities of the public, on average, said that democracy is the best form of government and that an unaccountable strongman is a bad idea” (p.159-60).  Within even the world’s most tenacious autocracies, “many people want to understand what democracy is and how it can be achieved.  Even many dictators and generalissimos know and fear democracy’s allure” (p.225).  In this networked age, “both idealism and the harder imperatives of global power and security argue for more democracy, not less” (p.200).

* * *

The best way to counter Russian rage and Chinese ambition, Diamond counsels, is to show that Moscow and Beijing are “on the wrong side of history; that people everywhere yearn to be free, and that they can make freedom work to achieve a more just, sustainable and prosperous society” (p.200).   Yet Diamond makes clear that checking the worldwide authoritarian tide depends to an unsettling degree upon the United States reversing its present course and prioritizing anew the global quest for democracy.


Thomas H. Peebles

La Châtaigneraie, France

June 26, 2020


Misjudgments and Misdeeds of an Unseen Power Broker

Jefferson Morley, The Ghost:

The Secret Life of  CIA Spymaster James Jesus Angleton

(St. Martin’s)

James Jesus Angleton served as the Central Intelligence Agency’s head of counterintelligence — its top spy and effectively the number three person in the agency — from 1954 until he was forced into retirement in 1975.  Although his name is less familiar than that of the FBI’s original director, J. Edgar Hoover, I couldn’t help thinking of Hoover as I read Jefferson Morley’s trenchant biography, The Ghost: The Secret Life of CIA Spymaster James Jesus Angleton.  Both were immensely powerful, paranoid men who repeatedly broke or skirted the law to advance their often-idiosyncratic versions of what United States national security required.  Throughout their careers, both were able to avoid almost all attempts to hold them accountable for their misdeeds.  With the passage of four decades since Hoover’s death in 1972 and Angleton’s departure from the CIA three years later, we can see that the two men embodied what has recently come to be known as the “Deep State,” a nearly independent branch of government in which officials secretly manipulate government policy, as Morley puts it, “largely beyond the view of the Madisonian government and the voting public” (p.xi).

Morley demonstrates that the notorious COINTELPRO operation, associated today with Hoover and arguably his most dubious legacy, actually began as a joint FBI-CIA undertaking that Angleton concocted.  COINTELPRO aimed to infiltrate and disrupt dissidents and included among its targets Dr. Martin Luther King, left-leaning organizations, and Vietnam anti-war protestors.  The original idea that Angleton sold to a skeptical Hoover, who considered the CIA a “nest of liberals, atheists, homosexuals, professors, and otherwise feminized men who specialized in wasting the taxpayer dollar” (p.71), was that the Bureau would target subjects within the United States while the Agency would take the lead in targeting subjects outside the United States.

From there, the CIA and FBI collaborated on LINGUAL, an elaborate and extensive program to read American citizens’ mail, which Morley terms perhaps Angleton’s “most flagrant violation of the law” (p.82); and on CHAOS, an operation designed to infiltrate the entire anti-Vietnam war movement, not just people or organizations that engaged in violence or contacted foreign governments. Post-Watergate hearings brought the existence and extent of COINTELPRO, LINGUAL and CHAOS  to light, along with numerous other chilling exercises of authority attributed to the FBI and CIA, leading to Angleton’s involuntary retirement from the agency.

Morley, a freelance journalist and former Washington Post editor, does not make the Hoover comparison explicitly.  He sees in Angleton a streak of Iago, Othello’s untrustworthy advisor: outwardly a “sympathetic counselor with his own agenda, which sometimes verged on the sinister” (p.158).  Angleton served four American presidents with “seeming loyalty and sometimes devious intent” (p.159), he writes (of course, the same could be said of Hoover, who served eight presidents over the course of a career that began in the 1920s).

Writing in icy prose that pieces together short, punchy vignettes with one-word titles, Morley undertakes to show how Angleton was able to elevate himself from a “staff functionary” at the CIA, a new agency created in 1947, to an “untouchable mandarin” who had an “all but transcendent influence on U.S. intelligence operations for two decades” (p.67).  At the height of the Cold War, Morley writes, Angleton became an “unseen broker of American power” (p.158).

But Morley’s biography might better be viewed as a compendium of the misjudgments and misdeeds that punctuated Angleton’s career from beginning to end.  Angleton’s judgment failed him repeatedly, most notoriously when his close friend and associate, British intelligence agent Kim Philby, was revealed to have been a Soviet spy from World War II onward (I reviewed Ben Macintyre’s biography of Philby here in 2016).  The Philby revelation convinced Angleton that the KGB had also planted an agent within the CIA, precipitating a disastrous and abysmally unsuccessful “mole hunt” that paralyzed the CIA for years and damaged the careers of many innocent fellow employees, yet discovered no one.

The book’s most explosive conjunction of questionable judgment and conduct involves Angleton’s relationship to Lee Harvey Oswald, President John F. Kennedy’s presumed assassin.  Angleton followed Oswald closely from 1959, when he defected to the Soviet Union, to that fateful day in Dallas in 1963.  Thereafter, Angleton tenaciously withheld his knowledge of Oswald from the Warren Commission, charged with investigating the circumstances of the Kennedy assassination, to the point where Morley suggests that Angleton should have been indicted for obstruction of justice.  The full extent of Angleton’s knowledge of Oswald has yet to come out, leaving his work laden with fodder for those of a conspiratorial bent who insist that Oswald was something other than the lone gunman the Warren Commission found him to be (in 2015, I reviewed Peter Savodnik’s biography of Oswald here, in which Savodnik argues forcefully for the lone gunman view of Oswald).

* * *

Born in 1917 in Boise, Idaho, Angleton was the son of a prosperous merchant father and a Mexican-American mother (hence the middle name “Jesus”).  At age 16, the young Angleton moved with his family to Milan, where his father ran the Italian-American Chamber of Commerce and was friendly with many leaders in the fascist regime of Benito Mussolini.  For the remainder of his life, James retained a fondness for Italy, Italian culture and, it could be argued, the Italian brand of fascism.

Angleton attended boarding school in England, then went on to Yale as an undergraduate.  At Yale, he demonstrated a keen interest in poetry and came under the influence of the poet Ezra Pound, who later became notorious for his Nazi sympathies (after an investigation led by J. Edgar Hoover, Pound was jailed during World War II).  Poetry constituted a powerful method for Angleton, Morley writes.  He would come to value “coded language, textual analysis, ambiguity, and close control as the means to illuminate the amoral arts of spying that became his job.  Literary criticism led him to the profession of secret intelligence.  Poetry gave birth to a spy” (p.8).

During World War II, Angleton found his way to the Office of Strategic Services, the CIA’s predecessor agency.  He spent the later portion of the war years in Rome, where he developed a friendship with Junio Valerio Borghese, “perhaps the most famous fascist military commander in Italy” (p.21).  Angleton helped Borghese avoid execution at the hands of the same partisan forces that captured and executed Mussolini in 1945.  Thanks to Angleton’s efforts, Borghese “survived to become titular and spiritual leader of postwar Italian fascism” (p.27), and one of the United States’ key partners in preventing a Communist takeover of postwar Italy.

Angleton prepared for his assignment in Rome at Bletchley Park in England, the center of Allied code-breaking operations during World War II.  There, Angleton learned the craft of counter-intelligence under the tutelage of Kim Philby, who taught the young American “how to run double agent operations, to intercept wireless and mail messages, and to feed false information to the enemy.  Angleton would prove to be his most trusting friend” (p.18).  After the war, Philby and Angleton both found themselves in Washington, where they became inseparable buddies, the “closest of friends, soul mates in espionage” (p.41).  Each saw in the other the qualities needed to succeed in espionage: ruthlessness, calculation, autonomy, and cleverness.

The news of Philby’s 1963 defection to Moscow was “almost incomprehensible” (p.123) to Angleton.  What he had considered a deep and warm relationship had been a sham.  Philby was “his friend, his mentor, his confidant, his boozy buddy,” Morley writes.  And “through every meeting, conference, debriefing, confidential aside, and cocktail party, his friend had played him for a fool” (p.124).  Philby’s defection does not appear to have damaged Angleton’s position within the CIA, but it set him off on a disastrous hunt for a KGB “mole” that would paralyze and divide the agency for years.

Angleton’s mole hunt hardened into a “fixed idea, which fueled an ideological crusade that more than a few of his colleagues denounced as a witch hunt” (p.86).  Angleton’s operation was multi-faceted, “consisting of dozens of different mole hunts – some targeting individuals, others focused on components within the CIA” (p.135).  Angleton’s suspicions “effectively stunted or ended the career of colleagues who were guilty of nothing” (p.198).  To this day, even after the opening of significant portions of the KGB archives in the aftermath of the fall of the Soviet Union, there is no indication that the KGB ever had a mole burrowed into the CIA.  Angleton’s mole hunt, Morley concludes, “soaked in alcohol” and permeated by “convoluted certitudes,” brought Angleton to the “brink of being a fool” (p.126).

Just as Angleton never gave up his (witch) hunt for the KGB spy within the CIA, he became convinced that Harold Wilson, British Labour politician and for a while Prime Minister, was a Soviet spy, and never relinquished this odd view either.  And he argued almost until the day he departed from the CIA that the diplomatic sparring and occasional direct confrontation between the Soviet Union and China were an elaborate exercise in disinformation to deceive the West.

While head of counterintelligence at the CIA, Angleton served simultaneously as the agency’s desk officer for Israel, the direct link between Israeli and American intelligence services.  Angleton was initially wary of the Israeli state that came into existence in 1948, in part the residue of the anti-Semitism he had entertained in his youth, in part the product of his view that too many Jews were communists.  By the mid-1950s, however, Angleton had overcome his initial wariness to become an admirer of Israel and especially Mossad, its primary intelligence service.

But Angleton’s judgment in his relationship with Israel frequently failed him just as it failed him in his relationship with Philby.  He did not foresee Israel’s role in the 1956 Anglo-French invasion of Suez (the subject of Ike’s Gamble, reviewed here in 2017), infuriating President Eisenhower.  After winning President Johnson’s favor for calling the Israeli first strike that ignited the June 1967 Six Day War (“accurate almost down to the day and time,” p.181), he incurred the wrath of President Nixon for missing Egypt’s strike at Israel in the October 1973 Yom Kippur War.  Nixon and his Secretary of State, Henry Kissinger, were of the view that Angleton had grown too close to Israel.

Angleton, moreover, was almost certainly involved behind the scenes in a 1968 Israeli heist of enriched uranium, lifted from a Pennsylvania nuclear fuel plant known as NUMEC, to supply Israel’s own nuclear program.  A CIA analyst later concluded that NUMEC had been a “front company deployed in an Israeli-American criminal conspiracy to evade U.S. nonproliferation laws and supply the Israeli nuclear arsenal” (p.261-62).  Angleton’s loyalty to Israel “betrayed U.S. policy on an epic scale” (p.261), Morley writes.

* * *

Morley’s treatment of Angleton’s relationship to Lee Harvey Oswald and Fidel Castro’s Cuba raises more questions than it answers.  The CIA learned of Oswald’s attempt to defect to the Soviet Union in November 1959, and began monitoring him at that point.  In this same timeframe, the CIA and FBI began jointly monitoring a pro-Castro group, the Fair Play for Cuba Committee, which would later attract Oswald.  Although Angleton was a contemporary and occasional friend of John Kennedy (the two were born the same year), when Kennedy assumed the presidency in 1961, Angleton’s view was that American policy toward Fidel Castro needed to be more aggressive.  He viewed Cuba as still another Soviet satellite state, but one just 90 miles from United States shores.

The Kennedy administration’s Cuba policy got off to a miserable start with the infamous failure of the April 1961 Bay of Pigs operation to dislodge Castro.  Kennedy was furious with the way the CIA and the military had presented the options to him and fired CIA Director Allen Dulles in the operation’s aftermath (Dulles’ demise is one of the subjects of Stephen Kinzer’s The Brothers, reviewed here in 2014).  But elements within the CIA and the military held Kennedy responsible for the failure because he refused to order air support for the operation (Kennedy had been assured prior to the invasion that no additional military assistance would be necessary).

CIA and military distrust of Kennedy heightened after the Cuban Missile Crisis of October 1962, when the United States and the Soviet Union faced off in what threatened to be a nuclear confrontation over the placement of offensive Soviet missiles on the renegade island.  Although Kennedy’s handling of that crisis was widely acclaimed as his finest moment as president, many within the military and the CIA, Angleton included, thought that Kennedy’s pledge to Soviet Premier Khrushchev of no invasion of Cuba in exchange for Soviet withdrawal of missiles had given Castro and his Soviet allies too much.  Taking the invasion option off the table amounted in Angleton’s view to a cave-in to Soviet aggression and a betrayal of the anti-Castro Cuban community in the United States.

In the 13 months that remained of the Kennedy presidency, the administration continued to obsess over Cuba, with a variety of operations under consideration to dislodge Castro.  The CIA was also monitoring Soviet defector Oswald, who by this time had returned to the United States.  Angleton placed Oswald’s name on the LINGUAL list to track his mail.  By the fall of 1963, Oswald had become active in the Fair Play for Cuba Committee, passing out FPCC leaflets in New Orleans.  He was briefly arrested for disturbing the peace after an altercation with anti-Castro activists.  In October of that year, a mere month before the Kennedy assassination, the FBI and CIA received notice that Oswald had been in touch with the Soviet and Cuban embassies and consular sections in Mexico City.  Angleton followed Oswald’s Mexico City visits intensely, yet withheld for the rest of his life precisely what he knew about them.

From the moment Kennedy was assassinated, Angleton “always sought to give the impression that he knew very little about Oswald before November 22, 1963” (p.140).  But Angleton and his staff, Morley observes, had “monitored Oswald’s movements for four years. As the former marine moved from Moscow to Minsk to Fort Worth to New Orleans to Mexico City to Dallas,” the special group Angleton created to track defectors “received reports on him everywhere he went” (p.140-41).  Angleton clearly knew that Oswald was in Dallas in November 1963.   He hid his knowledge of Oswald from the Warren Commission, established by President Lyndon Johnson to investigate the Kennedy assassination. What was Angleton’s motivation for obfuscation?

The most plausible – and most innocent – explanation is that Angleton was protecting his own rear end in an “epic counterintelligence failure” that had “culminated on Angleton’s watch. It was bigger than the Philby affair and bloodier” (p.140).  Given this disastrous counterintelligence failure, Morley argues, Angleton “could have – and should have – lost his job after November 22 [1963].  Had the public, the Congress, and the Warren Commission known of his pre-assassination interest in Oswald or his post-assassination cover-up, he surely would have” (p.157).

But the range of possibilities Morley considers extends to speculation that Angleton may have been hiding his own involvement in a Deep State operation to assassinate the president.   Was Angleton running Oswald as an agent in an assassination plot, Morley asks:

He certainly had the knowledge and ability to do so.  Angleton and his staff had a granular knowledge of Oswald long before Kennedy was killed.  Angleton had a penchant for running operations outside of reporting channels. He articulated a vigilant anti-communism that depicted the results of JFK’s liberal policies in apocalyptic terms. He participated in discussions of political assassination. And he worked in a penumbra of cunning that excluded few possibilities (p.265).

Whether Angleton manipulated Oswald as part of an assassination plot is a question Morley is not prepared to answer.  But in Morley’s view, Angleton plainly “obstructed justice to hide interest in Oswald.  He lied to veil his use of the ex-defector in later 1963 for intelligence purposes related to the Cuban consulate in Mexico City. . . Whoever killed JFK, Angleton protected them. He masterminded the JFK conspiracy and cover-up” (p.265).  To this day, no consensus exists as to why Angleton dodged all questions concerning his undisputed control over the CIA’s file on Oswald for four years, up to Oswald’s death in November 1963.  Angleton’s relationship to Oswald remains “shrouded in deception and perjury, theories and disinformation, lies and legends” (p.87), Morley concludes.  Even though a fuller story began to emerge when Congress ordered the declassification of long-secret JFK assassination records in the 1990s, the full story has “yet to be disclosed” (p.87).

* * *

The burglary at the Democratic National Committee headquarters in the Watergate complex in June 1972 proved to be Angleton’s professional undoing, just as it was for President Richard Nixon.  The burglary involved three ex-CIA employees, all likely well known to Angleton.  In 1973, in the middle of multiple Watergate investigations, Nixon appointed William Colby as agency director, a man determined to get to the bottom of what was flowing into the public record about the CIA and its possible involvement in Watergate-related activity.

Colby concluded that Angleton’s never-ending mole hunts were “seriously damaging the recruiting of Soviet officers and hurting CIA’s intelligence intake” (p.225).  Colby suspended LINGUAL, finding the mail opening operation “legally questionable and operationally trivial,” having produced little “beyond vague generalities” (p.225).  At the same time, New York Times investigative reporter Seymour Hersh published a story that described in great detail Operation CHAOS, the agency’s program aimed at anti-Vietnam activists, attributing ultimate responsibility to Angleton.  Immediately after Christmas 1974, Colby moved to replace Angleton.

For the first and only time in his career, Angleton’s covert empire within the CIA stood exposed and he left the agency in 1975.  When Jimmy Carter became president in 1977, his Department of Justice elected not to prosecute Angleton, although Morley argues that it had ample basis to do so.  In retirement, Angleton expounded his views to “any and all who cared to listen” (p.256).  He took to running reporters “like he had once run agents in the field, and for the same purpose: to advance his geopolitical vision” (p.266).

* * *

Angleton, a life-long smoker (as well as heavy drinker), was diagnosed with lung cancer in 1986 and died in May 1987.  He was, Morley concludes, “fortunate that so much of his legacy was unknown or classified at the time of his death.”  Angleton not only “often acted outside the law and the Constitution,” but also, for the most part, “got away with it” (p.271).

Thomas H. Peebles

La Châtaigneraie, France

June 10, 2020


Reading Darwin in Abolitionist New England


Randall Fuller, The Book That Changed America:

How Darwin’s Theory of Evolution Ignited a Nation (Viking)

In mid-December 1859, the first copy of Charles Darwin’s On the Origin of Species arrived in the United States from England at a wharf in Boston harbor.  Darwin’s book explained how plants and animals had developed and evolved over multiple millennia through a process Darwin termed “natural selection,” a process which distinguished On the Origin of Species from the work of other naturalists of Darwin’s generation.  Although Darwin said little in the book about how humans fit into the natural selection process, the work promised to ignite a battle between science and religion.

In The Book That Changed America: How Darwin’s Theory of Evolution Ignited a Nation, Randall Fuller, professor of American literature at the University of Kansas, contends that what made Darwin’s insight so radical was its “reliance upon a natural mechanism to explain the development of species.  An intelligent Creator was not required for natural selection to operate.  Darwin’s vision was of a dynamic, self-generating process of material change.  That process was entirely arbitrary, governed by physical law and chance – and not leading ineluctably . . . toward progress and perfection” (p.24).  Darwin’s work challenged the notion that human beings were a “separate and extraordinary species, differing from every other animal on the planet.  Taken to its logical conclusion, it demolished the idea that people had been created in God’s image” (p.24).

On the Origin of Species arrived in the United States at a particularly fraught moment.  In October 1859, abolitionist John Brown had conducted a raid on a federal arsenal in Harper’s Ferry (then part of Virginia, today West Virginia), with the intention of precipitating a rebellion that would eradicate slavery from American soil.  The raid failed spectacularly: Brown was captured, tried for treason and hanged on December 2, 1859.  The raid and its aftermath exacerbated tensions between North and South, further polarizing the already bitterly divided country over the issue of chattel slavery in its southern states.  Notwithstanding the little Darwin had written about how humans fit into the natural selection process, abolitionists seized on hints in the book that all humans were biologically related to buttress their arguments against slavery.  To the abolitionists, Darwin “seemed to refute once and for all the idea that African American slaves were a separate, inferior species” (p.x).

Asa Gray, a respected botanist at Harvard University and a friend of Darwin, received the first copy of On the Origin of Species in the United States.  He passed the copy, which he annotated heavily, to his cousin by marriage, Charles Loring Brace (who was also a distant cousin of Harriet Beecher Stowe, author of the anti-slavery runaway best-seller Uncle Tom’s Cabin).  Brace in turn introduced the book to three men: Franklin Benjamin Sanborn, a part-time school master and full-time abolitionist activist; Amos Bronson Alcott, an educator and loquacious philosopher, today best remembered as the father of author Louisa May Alcott; and Henry David Thoreau, one of America’s best known philosophers and truth-seekers.  Sanborn, Alcott and Thoreau were residents of Concord, Massachusetts, roughly twenty miles northwest of Boston, the site of a famous Revolutionary War battle but in the mid-19th century both a leading literary center and a hotbed of abolitionist sentiment.

As luck would have it, Brace, Alcott and Thoreau gathered at Sanborn’s Concord home on New Year’s Day 1860.  Only Gray did not attend.  The four men almost certainly shared their initial reactions to Darwin’s work.  This get-together constitutes the starting point for Fuller’s engrossing study, centered on how Gray and the four men in Sanborn’s parlor on that New Year’s Day absorbed Darwin’s book.  Darwin himself is at best a background figure in the study.  Several familiar figures make occasional appearances, among them: Frederick Douglass, renowned orator and “easily the most famous black man in America” (p.91); Bronson Alcott’s author-daughter Louisa May; and American philosopher Ralph Waldo Emerson, Thoreau’s mentor and friend.  Emerson, like Louisa May and her father, was a Concord resident, and Fuller’s study takes place mostly there, with occasional forays to nearby Boston and Cambridge.

Fuller’s study is therefore more tightly circumscribed geographically than its title suggests.  He spends little time detailing the reaction to Darwin’s work in other parts of the United States, most conspicuously in the American South, where any work that might seem to support abolitionism and undermine slavery was anathema.   The study is also circumscribed in time; it takes place mostly in 1860, with most of the rest confined to the first half of the 1860s, up to the end of the American Civil War in 1865.  Fuller barely mentions what is sometimes called “Social Darwinism,” a notion that gained traction in the decades after the Civil War that purported to apply Darwin’s theory of natural selection to the competition between individuals in politics and economics, producing an argument for unregulated capitalism.

Rather, Fuller charts out the paths each of his five main characters traversed in absorbing and assimilating into their own worldviews the scientific, religious and political ramifications of Darwin’s work, particularly during the tumultuous year 1860.  All five were fervent abolitionists.  Sanborn was a co-conspirator in John Brown’s raid.  Thoreau gave a series of eloquent, impassioned speeches in support of Brown.  All were convinced that Darwin’s notion of natural selection had provided still another argument against slavery, based on science rather than morality or economics.  But in varying degrees, all five could also be considered adherents of transcendentalism, a mid-19th century philosophical approach that posited a form of human knowledge that goes beyond, or transcends, what can be seen, heard, tasted, touched or felt.

Although transcendentalists were almost by definition highly individualistic, most believed that a special force or intelligence stood behind nature and that prudential design ruled the universe.  Many subscribed to the notion that humans were the products of some sort of “special creation.”  Most saw God everywhere, and considered the human mind “resplendent with powers and insights wholly distinct from the external world” (p.54).  Transcendentalism was both an effort to invoke the divinity within man and, as Fuller puts it, a “cultural attack on a nation that had become too materialistic, too conformist, too smug about its place in history” (p.66).

Transcendentalism thus hovered in the background in 1860 as all but Sanborn wrestled with the implications of Darwinism (Sanborn spent much of the year fleeing federal authorities seeking his arrest for his role in John Brown’s raid).  Alcott never left transcendentalism, rejecting much of Darwinism.  Gray and Brace initially seemed to embrace Darwinian theories wholeheartedly, but in different ways each pulled back once he grasped the full implications of those theories.  Thoreau was the only one of the five who wholly accepted Darwinism’s most radical implications, using Darwin’s theories to “redirect his life’s work” (p.ix).

Fuller’s study thus combines a deep dive into the New England abolitionist milieu at a time when the United States was fracturing over the issue of slavery with a medium-level dive into the intricacies of Darwin’s theory of natural selection.  But the story Fuller tells is anything but dry and abstract.  With an elegant writing style and an acute sense of detail, Fuller places his five men and their thinking about Darwin in their habitat, the frenetic world of 1860s New England.  In vivid passages, readers can almost feel the chilly January wind whistling through Franklin Sanborn’s parlor that New Year’s Day 1860, or envision the mud accumulating on Henry David Thoreau’s boots as he trudges through the melting snow in the woods on a March afternoon contemplating Darwin.  The result is a lively, easy-to-read narrative that nimbly mixes intellectual and everyday, ground-level history.

* * *

Bronson Alcott, described by Fuller as America’s most radical transcendentalist, never accepted the premises of On the Origin of Species.  Darwin had, in Alcott’s view, “reduced human life to chemistry, to mechanical processes, to vulgar materialism” (p.10).  To Alcott, Darwin seemed “morbidly attached to an amoral struggle of existence, which robbed humans of free will and ignored the promptings of the soul” (p.150).  Alcott could not imagine a universe “so perversely cruel as to produce life without meaning.  Nor could he bear to live in a world that was reduced to the most tangible and daily phenomena, to random change and process” (p.188).  Asa Gray, one of America’s most eminent scientists, came to the same realization, but only after thoroughly digesting Darwin and explaining his theories to a wide swath of the American public.

Gray’s initial reaction to Darwin’s work was one of unbounded enthusiasm.  Gray covered nearly every page of the book with his own annotations.  He admired the book because it “reinforced his conviction that inductive reasoning was the proper approach to science” (p.109).  He also admired the work’s “artfully modulated tone, [and] its modest voice, which softened the more audacious ideas rippling through the text” (p.17). Gray was most impressed with Darwin’s “careful judging and clear-eyed balancing of data” (p.110).  To grapple with Darwin’s ideas, Gray maintained, one had to “follow the evidence wherever it led, ignoring prior convictions and certainties or the narrative one wanted that evidence to confirm” (p.110).  Without saying so explicitly, Gray suggested that readers of Darwin’s book had to be “open to the possibility that everything they had taken for granted was in fact incorrect” (p.110).

Gray reviewed On the Origin of Species for the Atlantic Monthly in three parts, appearing in the summer and fall of 1860.  Gray’s articles served as the first encounter with Darwin for many American readers.  The articles elicited a steady stream of letters from respectful readers.  Some responded with “unalloyed enthusiasm” for a new idea which “seemed to unlock the mysteries of nature” (p.134).  Others, however, “reacted with anger toward a theory that proposed to unravel . . . their belief in a divine Being who had placed humans at the summit of creation” (p.134).  But as Gray finished the third Atlantic article, he began to realize that he himself was not entirely at ease with the diminution of humanity’s place in the universe that Darwin’s work implied.

The third Atlantic article, appearing in October 1860, revealed Gray’s increasing difficulty in “aligning Darwin’s theory with his own religious convictions” (p.213).  Gray proposed that natural selection might be “God’s chosen method of creation” (p.214).  This idea seemed to resolve the tension between scientific and religious accounts of origins, making Gray the first to develop a theological case for Darwinian theory.  But the idea that natural selection might be the process by which God had fashioned the world represented what Fuller describes as a “stunning shift for Gray.  Before now, he had always insisted that secondary causes were the only items science was qualified to address.  First, or final causes – the beginning of life, the creation of the universe – were the purview of religion: a matter of faith and metaphysics” (p.214).  Darwin responded to Gray’s conjectures by indicating that, as Fuller summarizes the written exchange, the natural world was “simply too murderous and too cruel to have been created by a just and merciful God” (p.211).

In the Atlantic articles, Fuller argues, Gray leapt “beyond his own rules of science, speculating about something that was untestable” (p.214-15).  Gray must have known that his argument “failed to adhere to his own definition of science” (p.216).  But, much like Bronson Alcott, Gray found it “impossible to live in the world Darwin had imagined: a world of chance, a world that did not require a God to operate” (p.216).  Charles Brace, a noted social reformer who founded several institutions for orphans and destitute children, greeted Darwin’s book with an initial enthusiasm that rivaled that of Gray.

Brace claimed to have read On the Origin of Species 13 times.  He was most attracted to the book for its implications for human societies, especially for American society, where nearly half the country accepted and defended human slavery.  Darwin’s book “confirmed Brace’s belief that environment played a crucial role in the moral life of humans” (p.11), and demonstrated that every person in the world, black, white, or yellow, was related to everyone else.  The theory of natural selection was thus for Brace the “latest argument against chattel slavery, a scientific claim that could be used in the most important controversy of his time, a clarion call for abolition” (p.39).

Brace produced a tract entitled The Races of the Old World, modeled after Darwin’s On the Origin of Species, which Fuller describes as a “sprawling, ramshackle work” (p.199).  Its central thesis was simple enough: “There is nothing . . . to prove the negro radically different from the other families of man or even mentally inferior to them” (p.199-200).  But much of The Races of the Old World seemed to undercut Brace’s central thesis.  Although the book never defined the term “race,” Brace “apparently believed that though all humans sprang from the same source, some races had degraded over time . . . Human races were not permanent” (p.199-200).  Brace thus struggled to make Darwin’s theory fit his own ideas about race and slavery. “He increasingly bent facts to fit his own speculations” (p.197), as Fuller puts it.

The Races of the Old World revealed Brace’s hesitation in imagining a multi-racial America.  He couched in Darwinian terms the difficulty of the races cohabiting, reverting to what Fuller describes as nonsense about blacks not being conditioned to survive in the colder Northern climate.  Brace “firmly believed in the emancipation of slaves, and he was equally convinced that blacks and whites did not differ in their mental capacities” (p.202).  But he nonetheless worried that “race mixing,” or what was then termed race “amalgamation,” might imperil Anglo-Saxon America, the “apex of development. . . God’s favored nation, a place where democracy and Christianity had fused to create the world’s best hope” (p.202).  Brace joined many other leading abolitionists in opposing race “amalgamation.”  His conclusion that “black and brown-skinned people inhabited a lower rung on the ladder of civilization” was shared, Fuller indicates, by “even the most enlightened New England abolitionists” (p.57).

No such misgivings visited Thoreau, who grappled with On the Origin of Species “as thoroughly and as insightfully as any American of the period” (p.11).  As Thoreau first read his copy of the book in late January 1860, a “new universe took form on the rectangular page before him” (p.75).  Prior to his encounter with Darwin, Thoreau’s thought had often “bordered on the nostalgic.  He longed for the transcendentalist’s confidence in a natural world infused with spirit” (p.157).  But Darwin led Thoreau beyond nostalgia.

Thoreau was struck in particular by Darwin’s portrayal of the struggle among species as an engine of creation.  The Origin of Species revealed nature as process, in constant transformation.  Darwin’s book directed Thoreau’s attention “away from fixed concepts and hierarchies toward movement instead” (p.144-45).  The idea of struggle among species “undermined transcendentalist assumptions about the essential goodness of nature, but it also corroborated many of Thoreau’s own observations” (p.137).  Thoreau had “long suspected that people were an intrinsic part of nature – neither separate nor entirely alienated from it” (p.155).  Darwin now enabled Thoreau to see how “people and the environment worked together to fashion the world,” providing a “scientific foundation for Thoreau’s belief that humans and nature were part of the same continuum” (p.155).

Darwin’s natural selection, Thoreau wrote, “implies a greater vital force in nature, because it is more flexible and accommodating, and equivalent to a sort of constant new creation” (p.246).  The phrase “constant new creation” in Fuller’s view represents an “epoch in American thought” because it “no longer relies upon divinity to explain the natural world” (p.246).  Darwin thus propelled Thoreau to a radical vision in which there was “no force or intelligence behind Nature, directing its course in a determined and purposeful manner.  Nature just was” (p.246-47).

How far Thoreau would have taken these ideas is impossible to know. He fell ill in December 1860, stricken with influenza exacerbated by tuberculosis, and died in June 1862, with Americans fighting other Americans on the battlefield over the issue of slavery.

* * *

Fuller compares Darwin’s On the Origin of Species to a Trojan horse.  It entered American culture “using the newly prestigious language of science, only to attack, once inside, the nation’s cherished beliefs. . . With special and desolating force, it combated the idea that God had placed humans at the peak of creation” (p.213).  That the book’s attack did not spare even New England’s best known abolitionists and transcendentalists demonstrates just how unsettling it was.

Thomas H. Peebles

La Châtaigneraie, France

May 18, 2020

 


The Power of Human Rights

 

Samantha Power, The Education of an Idealist:

A Memoir 

By almost any measure, Samantha Power should be considered an extraordinary American success story. An immigrant from Ireland who fled the Emerald Isle with her mother and brother at a young age to escape a turbulent family situation, Power earned degrees from Yale University and Harvard Law School, rose to prominence in her mid-20s as a journalist covering civil wars and ethnic cleansing in Bosnia and the Balkans, won a Pulitzer Prize for a book on 20th century genocides, and helped found the Carr Center for Human Rights Policy at Harvard’s Kennedy School of Government, where she served as its executive director — all before age 35.  Then she met an ambitious junior Senator from Illinois, Barack Obama, and her career really took off.

Between 2009 and 2017, Power served in the Obama administration almost continually, first on the National Security Council and subsequently as Ambassador to the United Nations.  In both capacities, she became the administration’s most outspoken and influential voice for prioritizing human rights, arguing regularly for targeted United States and multi-lateral interventions to protect individuals from human rights abuses and mass atrocities, perpetrated in most cases by their own governments.  In what amounts to an autobiography, The Education of an Idealist: A Memoir, Power guides her readers through the major foreign policy crises of the Obama administration.

Her life story, Power tells her readers at the outset, is one of idealism, “where it comes from, how it gets challenged, and why it must endure” (p.xii).  She is quick to emphasize that hers is not a story of how a person with “lofty dreams” about making a difference in the world came to be “educated” by the “brutish forces” (p.xii) she encountered throughout her professional career.  So what then is the nature of the idealist’s “education” that provides the title to her memoir?  The short answer probably lies in how Power learned to make her idealistic message on human rights both heard and effective within the complex bureaucratic structures of the United States government and the United Nations.

But Power almost invariably couples this idealistic message with the view that the promotion and protection of human rights across the globe is in the United States’ own national security interests; and that the United States can often advance those interests most effectively by working multi-laterally, through international organizations and with like-minded states.  The United States, by virtue of its multi-faceted strengths – economic, military and cultural – is in a unique position to influence the actions of other states, from its traditional allies all the way to those that inflict atrocities upon their citizens.

Power acknowledges that the United States has not always used its strength as a positive force for human rights and human betterment – one immediate example is the 2003 Iraq invasion, which she opposed. Nevertheless, the United States retains a reservoir of credibility sufficient to be effective on human rights matters when it chooses to do so.  Although Power is sometimes labeled a foreign policy “hawk,” she recoils from that label.  To Power, the military is among the last of the tools that should be considered to advance America’s interests around the world.

Into this policy-rich discussion, Power weaves much detail about her personal life, beginning with her early years in Ireland, the incompatibilities between her parents that prompted her mother to take her and her brother to the United States when she was nine, and her efforts as a schoolgirl to become American in the full sense of the term. After numerous failed romances, she finally met Mr. Right, her husband, Harvard Law School professor Cass Sunstein (who also served briefly in the Obama administration). The marriage produced a boy and a girl with lovely Irish names, Declan and Rían, both born while Power was in government.  With much emphasis upon her parents, husband, children and family life, the memoir is also a case study of how professional women balance the exacting demands of high-level jobs with the formidable responsibilities attached to being a parent and spouse.  It’s a tough balancing act for any parent, but especially for women, and Power admits that she did not always strike the right balance.

Memoirs by political and public figures are frequently attempts to write one’s biography before someone else does, and Power’s whopping 550-page work seems to fit this rule.  But Power writes with a candor – a willingness to admit to mistakes and share vulnerabilities – that is often missing in political memoirs. Refreshingly, she also abstains from serious score settling.  Most striking for me is the nostalgia that pervades the memoir.  Power takes her readers down memory lane, depicting a now by-gone time when the United States cared about human rights and believed in bi- and multi-lateral cooperation to accomplish its goals in its dealings with the rest of the world – a time that seems long ago indeed.

* * *

Samantha Jane Power was born in 1970 to Irish parents, Vera Delaney, a doctor, and Jim Power, a part-time dentist.  She spent her early years in Dublin, in a tense family environment where, she can see now, her parents’ marriage was coming unraveled.  Her father put in far more time at Hartigan’s, a local pub in the neighborhood where he was known for his musical skills and “holding court,” than he did at his dentist’s office.  Although young Samantha didn’t recognize it at the time, her father had a serious alcohol problem, serious enough to lead her mother to escape by immigrating to the United States with the couple’s two children, Samantha, then age nine, and her brother Stephen, two years younger. They settled in Pittsburgh, where Samantha at a young age set about becoming American: she dropped her Irish accent, tried to learn the intricacies of American sports, and became a fervent Pittsburgh Pirates fan.

But the two children were required under the terms of their parents’ custody agreement to spend time with their father back in Ireland. On her trip back at Christmas 1979, Samantha’s father informed the nine-year-old that he intended to keep her and her brother with him.  When her mother, who was staying nearby, showed up to object and collect her children to return to the United States, a parental confrontation ensued which would traumatize Samantha for decades.  The nine-year-old found herself caught between the conflicting commands of her two parents and, in a split-second decision, left with her mother and returned to Pittsburgh. She never again saw her father.

When her father died unexpectedly five years later, at age 47 of alcohol-related complications, Samantha, then in high school, blamed herself for her father’s death and carried a sense of guilt with her well into her adult years. It was not until she was thirty-five, after many therapy sessions, that she came to accept that she had not been responsible for her father’s death.  Then, a few years later, she made the mistake of returning to Hartigan’s, where she encountered the bar lady who had worked there in her father’s time.   Mostly out of curiosity, Power asked her why, given that so many people drank so much at Hartigan’s, her father had been the only one who died. The bar lady’s answer was matter-of-fact: “Because you left” (p.192) — not what Power needed to hear.

Power had by then already acquired a public persona as a human rights advocate through her work as a journalist in the 1990s in Bosnia, where she called attention to the ethnic cleansing that was sweeping the country in the aftermath of the collapse of the former Yugoslavia.  Power ended up writing for a number of major publications, including The Economist, the New Republic and the Washington Post.  She was among the first to report on the fall of Srebrenica in July 1995, the largest single massacre in Europe since World War II, in which around 10,000 Muslim men and boys were taken prisoner and “seemed to have simply vanished” (p.102). Although the United States and its NATO allies had imposed a no-fly zone over Bosnia, Power hoped the Clinton administration would commit to employing ground troops to prevent further atrocities. But she did not yet enjoy the clout to have a real chance at making her case directly with the administration.

Power wrote a chronology of the conflict, Breakdown in the Balkans, which was later put into book form and attracted attention from think tanks, and the diplomatic, policy and media communities.  Attracting even more attention was “A Problem from Hell”: America and the Age of Genocide, her book exploring American reluctance to take action in the face of 20th century mass atrocities and genocides.  The book appeared in 2002, and won the 2003 Pulitzer Prize for General Non-Fiction.  It also provided Power with her inroad to Senator Barack Obama.

At the recommendation of a politically well-connected friend, in late 2004 Power sent a copy of the book to the recently elected Illinois Senator who had inspired the Democratic National Convention that summer with an electrifying keynote address.  Obama’s office scheduled a dinner for her with the Senator which was supposed to last 45 minutes.  The dinner went on for four hours as the two exchanged ideas about America’s place in the world and how, why and when it should advance human rights as a component of its foreign policy.  Although Obama considered Power to be primarily an academic, he offered her a position on his Senate staff, where she started working late in 2005.

Obama and Power would then be linked professionally more or less continually until the end of the Obama presidency in January 2017.   Once Obama enters the memoir, at about the one-third point, it becomes as much his story as hers. The two did not always see the world and specific world problems in the same way, but it’s clear that Obama had great appreciation both for Power’s intelligence and her intensity. He was a man who enjoyed being challenged intellectually, and plainly valued the human rights perspective that Power brought to their policy discussions even if he wasn’t prepared to push as far as Power advocated.

After Obama threw his hat in the ring for the 2008 Democratic Party nomination, Power became one of his primary foreign policy advisors and, more generally, a political operative. It was not a role that fit Power comfortably and it threatened to be short-lived.  In the heat of the primary campaign, with Obama and Hillary Clinton facing off in a vigorously contested battle for their party’s nomination, Power was quoted in an obscure British publication, the Scotsman, as describing Clinton as a “monster.” The right-wing Drudge Report picked up the quotation, whose accuracy Power does not contest, and suddenly Power found herself on the front page of major newspapers, the subject of a story she did not want.  Obama’s closest advisors were of the view that she would have to resign from the campaign.  But the candidate himself, who loved sports metaphors, told Power only that she would have to spend some time in the “penalty box” (p.187).  Obama’s relatively soft reaction was an indication of the potential he saw in her and of her prospective value to him should he succeed in the primaries and the general election.

Power’s time in the penalty box had expired when Obama, having defeated Clinton for his party’s nomination, won a resounding victory in the general election in November 2008.  Obama badly wanted Power on his team in some capacity, and the transition team placed her on the President’s National Security Council as principal deputy for international organizations, especially the United Nations.  But she was also able to carve out a concurrent position for herself as the President’s Senior Director for Human Rights.  In this portion of the memoir, Power describes learning the jargon and often-arcane skills needed to be effective on the council and within the vast foreign policy bureaucracy of the United States government.  Being solely responsible for human rights, Power found that she had some leeway in deciding which issues to concentrate on and bring to the attention of the full Council.  Her mentor Richard Holbrooke advised her that she could be most effective on subjects in which the United States had limited interest – pick “small fights.”

Power had a hand in a string of “small victories” while on the National Security Council: coaxing the United States to rejoin a number of UN agencies from which the Bush Administration had walked away; convincing President Obama to raise his voice over atrocities perpetrated by governments in Sri Lanka and Sudan against their own citizens; being appointed White House coordinator for Iraqi refugees; helping create an inter-agency board to coordinate the United States government’s response to war crimes and atrocities; and encouraging increased emphasis upon lesbian, gay, bisexual and transgender (LGBT) issues overseas.  In pursuit of the latter, Obama delivered an address at the UN General Assembly on LGBT rights, and thereafter issued a Presidential Memorandum directing all US agencies to consider LGBT issues explicitly in crafting overseas assistance (disclosure: while with the Department of Justice, I served on the department’s portion of the inter-agency Atrocity Prevention Board, and represented the department in inter-agency coordination on the President’s LGBT memorandum; I never met Power in either capacity).

The Arab Spring that erupted in late 2010 and early 2011, by contrast, presented anything but small issues and resulted in few victories for the Obama administration.  A “cascade of revolts that would reorder huge swaths of the Arab world,” the Arab Spring ended up “impacting the course of Obama’s presidency more than any other geopolitical development during his eight years in office” (p.288), Power writes, and the same could be said for Power’s time in government.  Power was among those at the National Security Council who pushed successfully for United States military intervention in Libya to protect Libyan citizens from the predations of their leader, Muammar Qaddafi.

The intervention, backed by a United Nations Security Council resolution and led jointly by the United States, France and the United Kingdom, saved civilian lives and contributed to Qaddafi’s ouster and death.  But President Obama was determined to avoid a longer-term and more open-ended United States commitment, and the mission stopped short of the follow-up needed to bring stability to the country.  With civil war in various guises continuing to this day, Power suggests that the outcome might have been different had the United States continued its engagement in the aftermath of Qaddafi’s death.

Shortly after Power became US Ambassador to the United Nations, the volatile issue of an American military commitment arose again, this time in Syria in August 2013, when irrefutable proof came to light that Syrian leader Bashar al-Assad was using chemical weapons in his effort to suppress uprisings within the country.  The revelations came 13 months after Obama had asserted that use of such weapons would constitute a “red line” that would move him to intervene militarily in Syria.  Power favored targeted US air strikes within Syria.

Obama came excruciatingly close to approving such strikes.  He not only concluded that the “costs of not responding forcefully were greater than the risks of taking military action” (p.369), but was prepared to act without UN Security Council authorization, given the certainty of a Russian veto of any Security Council resolution for concerted action.  With elevated stakes for “upholding the international norm against the use of chemical weapons,” Power writes, Obama was “prepared to operate with what White House lawyers called a ‘traditionally recognized legal basis under international law’” (p.369).

But almost overnight, Obama decided that he needed prior Congressional authorization for a military strike in Syria, a decision taken seemingly with little effort to ascertain whether there was sufficient support in Congress for such a strike.  With neither the Congress nor the American public supporting military action within Syria to save civilian lives, Obama backed down.  On no other issue did Power see Obama as torn as he was on Syria, “convinced that even limited military action would mire the United States in another open-ended conflict, yet wracked by the human toll of the slaughter.  I don’t believe he ever stopped interrogating his choices” (p.508).

Looking back more than five years later, Power remains palpably disappointed.  The consequences of inaction in Syria, she maintains, went:

beyond unfathomable levels of death, destruction, and displacement. The spillover of the conflict into neighboring countries through massive refugee flows and the spread of ISIS’s ideology has created dangers for people in many parts of the world. . . [T]hose of us involved in helping devise Syria policy will forever carry regret over our inability to do more to stem the crisis.  And we know the consequences of the policies we did choose. For generations to come, the Syrian people and the wide world will be living with the horrific aftermath of the most diabolical atrocities carried out since the Rwanda genocide (p.513-14).

But if incomplete action in Libya and inaction in Syria constitute major disappointments for Power, she considers exemplary the response of both the United States and the United Nations to the July 2014 outbreak of the Ebola virus that occurred in three West African countries, Guinea, Liberia and Sierra Leone.  United States experts initially foresaw more than one million infections of the deadly and contagious disease by the end of 2015.  The United States devised its own plan to send supplies, doctors and nurses to the region to facilitate the training of local health workers to care for Ebola patients, along with 3,000 military personnel to assist with on-the-ground logistics.  Power was able to talk President Obama out of a travel ban to the United States from the three impacted countries, a measure favored not only by Donald Trump, then contemplating an improbable run for the presidency, but also by many members of the President’s own party.

At the United Nations, Power was charged with marshaling global assistance.  She convinced 134 fellow Ambassadors to co-sponsor a Security Council resolution declaring the Ebola outbreak a public health threat to international peace and security, the largest number of co-sponsors for any Security Council resolution in UN history and the first ever directed to a public health crisis.  Thereafter, UN Member States committed $4 billion in supplies, facilities and medical treatments.  The surge of international resources that followed meant that the three West African countries “got what they needed to conquer Ebola” (p.455).  At different times in 2015, each of the countries was declared Ebola-free.

The most deadly and dangerous Ebola outbreak in history was contained, Power observes, above all because of the “heroic efforts of the people and governments of Guinea, Liberia and Sierra Leone” (p.456). But America’s involvement was also crucial.  President Obama provided what she describes as an “awesome demonstration of US leadership and capability – and a vivid example of how a country advances its values and interests at once” (p.438).  But the multi-national, collective success further illustrated “why the world needed the United Nations, because no one country – even one as powerful as the United States – could have slayed the epidemic on its own” (p.457).

Although Russia supported the UN Ebola intervention, Power more often found herself in an adversarial posture with Russia on both geo-political and UN administrative issues.  Yet, she used creative diplomatic skills to develop a more nuanced relationship with her Russian counterpart, Vitaly Churkin.  Churkin, a talented negotiator and master of the art of strategically storming out of meetings, valued US-Russia cooperation and often “pushed for compromises that Moscow was disinclined to make” (p.405).  Over time, Power writes, she and Churkin “developed something resembling genuine friendship” (p.406). But “I also spent much of my time at the UN in pitched, public battle with him” (p.408).

The most heated of these battles ensued after Russia invaded Ukraine in February 2014, a flagrant violation of international law. Later that year, troops associated with Russia shot down a Malaysian passenger jet, killing everyone aboard.  In the UN debates on Ukraine, Power found her Russian counterpart “defending the indefensible, repeating lines sent by Moscow that he was too intelligent to believe and speaking in binary terms that belied his nuanced grasp of what was actually happening” (p.426). Yet, Power and Churkin continued to meet privately to seek solutions to the Ukraine crisis, none of which bore fruit.

While at the UN, Power went out of her way to visit the offices of the ambassadors of the smaller countries represented in the General Assembly, many of whom had never received a United States Ambassador.  During her UN tenure, she managed to meet personally with the ambassadors from every country except North Korea.  Power also started a group that gathered the UN’s 37 female Ambassadors together one day a week for coffee and discussion of common issues.  Some discussions involved substantive matters before the UN, but just as often the group focused on workplace issues that affected the women ambassadors as women – issues their male colleagues did not have to deal with.

* * *

Donald Trump’s surprise victory in November 2016 left Power stunned.  His nativist campaign to “Make America Great Again” seemed to her like a “repudiation of many of the central tenets of my life” (p.534).  As an immigrant, a category Trump seemed to relish denigrating, she “felt fortunate to have experienced many countries and cultures. I saw the fate of the American people as intertwined with that of individuals elsewhere on the planet.  And I knew that if the United States retreated from the world, global crises would fester, harming US interests” (p.534-35).  As Obama passed the baton to Trump in January 2017, Power left government.

Not long after, her husband suffered a near-fatal automobile accident, from which he recovered. Today, the pair team-teach courses at Harvard, while Power seems to have found the time for her family that proved so elusive when she was in government.  She is coaching her son’s baseball team and helping her daughter survey rocks and leaves in their backyard.  No one would begrudge Power her quality time with her family. But her memoir will likely leave many readers wistful, daring to hope that there may someday be room again for her and her energetic idealism in the formulation of United States foreign policy.

Thomas H. Peebles

La Châtaigneraie, France

April 26, 2020


School Girls on the Front Lines of Desegregation

 

Rachel Devlin, A Girl Stands at the Door:

The Generation of Young Women Who Desegregated America’s Schools

(Basic Books)

When World War II ended, public schools in the United States were still segregated by race throughout much of the country.  Segregated schools were mandated by state legislatures in all the states of the former Confederacy (“the Deep South”), along with Washington, D.C., Delaware and Arizona, while a handful of American states barred racial segregation in their public schools.  In the remainder, the decision whether to segregate was left to local jurisdictions.  Racial segregation of public schools found its constitutional sanction in Plessy v. Ferguson, the United States Supreme Court’s 1896 decision which held that equal protection of the law under the federal constitution did not prohibit states from maintaining public facilities that were “separate but equal.”

But “separate but equal” was a cruel joke, particularly as applied to public schools: in almost every jurisdiction which maintained segregated schools, those set aside for African-Americans were by every objective standard unequal and inferior to counterpart white schools.  In 1954, the Supreme Court, in one of its most momentous decisions, Brown v. Board of Education of Topeka, Kansas, invalidated the Plessy “separate but equal” standard as applied to public schools, holding that in the school context separate was inherently unequal.  The decision preceded by a year and a half the Montgomery, Alabama, bus boycott that made both Rosa Parks and Martin Luther King, Jr., household names.  The litigation campaign that led to Brown was arguably the opening salvo in what we now term the modern Civil Rights Movement.

That pathway has been the subject of numerous popular and scholarly works, the best known of which is Richard Kluger’s magisterial 1975 work Simple Justice.  In Kluger’s account and most others, the National Association for the Advancement of Colored People (NAACP) and its Legal Defense Fund (LDF), which instituted Brown and several of its predecessor cases, are front and center, with future Supreme Court justice Thurgood Marshall, the LDF’s lead litigator, as the undisputed lead character.  Yet, Rachel Devlin, an associate professor of history at Rutgers University, maintains that earlier studies of the school desegregation movement, including that of Kluger, overlook a critical point: the students who desegregated educational institutions – the “firsts,” to use Devlin’s phrase — were mostly girls and young women.

Devlin’s research revealed that only one of the early, post-World War II primary and secondary school desegregation cases that paved the way to the Brown decision was filed on behalf of a boy.  Looking at those who “attempted to register at white schools, testified in court, met with local white administrators and school boards, and talked with reporters from both the black and white press,” Devlin saw almost exclusively schoolgirls.  This disparity “held true in the Deep South, upper South, and Midwest” (p.x). After the Brown decision, the same pattern prevailed: “girls and young women vastly outnumbered boys as the first to attend formerly all-white schools” (p.x).

Unlike Kluger, Devlin does not focus on lawyers and lawsuits but rather on the “largely young, feminine work that brought school desegregation into the courts” (p.xi).  She begins with court challenges to state enforced segregation at the university level, some of which began before World War II.  She then proceeds to a host of post-World War II communities that challenged racial segregation in primary and secondary schools in the late 1940s and early 1950s.  The Brown decision itself, a ruling on segregated schools in Topeka, Kansas, merits only a few pages, after which she portrays the first African-American students to enter previously all-white schools during the second half of the 1950s and into the 1960s.  The pre-Brown challenges to segregated public education that Devlin highlights took place in Washington, D.C., Kansas, Delaware, Texas and Virginia. In her post-Brown analysis, she turns to the Deep South, to communities in Louisiana, Georgia and South Carolina.

Devlin’s intensely factual and personality-driven narrative at times falls victim to a forest-and-trees problem: she focuses on a multitude of individuals — the trees — to the point that the reader can easily lose sight of the forest — how the featured individuals fit into the overall school desegregation movement.  Yet, there are a multitude of lovely trees to behold in Devlin’s forest – heroic and endearing schoolgirls and the adults who supported them, both men and women, all willing to confront entrenched racial segregation in America’s public schools.

* * *

School desegregation, Devlin writes, differed from other civil rights battles, such as desegregation of lunch counters, public transportation, and parks, in that interacting with white people was not “fleeting or ‘fortuitous,’ but central to the project itself.  School desegregation required sustained interactions with white school officials and students. This fact called for a different approach than other forms of civil rights activism” (p.xxiv).   But Devlin also emphasizes that this different approach gave rise to controversy among affected African-Americans.

In almost every community she studied, there was a dissident African-American faction that opposed desegregation of all-white schools, favoring direct pressure and court cases designed to force school authorities to make good on the “equal” portion of “separate but equal.”  Parents who favored this less frontal approach, while “willing to protest unequal schools, simply wanted a better education for their children while they were still young enough to receive it, not a long, hard campaign against a long-standing Supreme Court precedent” (p.167).  Devlin demonstrates that this quest for equalization, however understandable, was at best quixotic. Time and time again, she shows, the white power structure in the communities she studies had no serious intention of equalizing black and white schools.

Why girls and young women predominated in school desegregation efforts is as much a part of Devlin’s story as the particulars of those efforts at the institutions and in the communities she studies.  After WWII, she notes, there was a “strong, though unstated, cultural assumption that the war to end school segregation was a girls’ war, a battle for which young women and girls were specially suited” (p.xvi).  With the example of boys and young men who had gone off to fight in World War II fresh in everyone’s minds, Devlin speculates, girls and young women may have felt an “ethical compulsion to act at a young age” (p.xvi).

Devlin was able to interview several of the female firsts for her book, as they looked back on their experiences desegregating schools decades earlier.  These women, she indicates, had been inspired as school girls “not only by a sense of obligation and individual calling but also by the opportunity to do something important and highly visible in a world and at a time when young women did not often earn much public acclaim” (p.225). The boys and young men she studied, by contrast, manifested a “desire to distance themselves from an overt, individual commitment to desegregating schools” (p.223).  Leaving was more of an option for high-school-age boys who felt alienated in newly desegregated schools.  They had “more mobility – and autonomy – than young women, and it allowed them to walk away from the school desegregation process when they felt it was not working for them” (p.196).  Leaving for girls “did not feel like a choice, both because they understood their parents’ expectations of them and because they had fewer alternatives” (p.196).

* * *

The pathway to Brown in Devlin’s account starts at the university level with Lucille Bluford and Ada Lois Sipuel, two lesser-known women who were denied admission because of their race to, respectively, the University of Missouri School of Journalism and the University of Oklahoma Law School.  Both saw their court cases overshadowed by those of men, Lloyd Gaines and Herman Sweatt, pursuing university level desegregation in court at the same time.  But while the two men’s cases established major Supreme Court precedents, both proved to be disappointing plaintiffs and spokesmen for the desegregation cause, in sharp contrast to Bluford and Sipuel.

Gaines was the beneficiary of one of the Supreme Court’s first major decisions involving higher education, Gaines v. Canada, where the Court ruled in 1938 that the State of Missouri was required either to admit Gaines to the University of Missouri Law School or create a separate facility for him.  Missouri chose the latter option, which Gaines refused.  But he thereafter went missing.  He was last seen taking a train to Chicago and was never heard from again.  Bluford, then a seasoned journalist working for the African-American newspaper the Kansas City Call, not only covered the Gaines decision but also set out to gain admission herself to the University of Missouri’s prestigious School of Journalism.

Both “hardheaded and gregarious” (p.32), Bluford doggedly pursued admission to the university’s journalism school between 1939 and 1942.  In her court case, her lawyer, the NAACP’s Charles Houston, provided the book’s title in his closing argument when he told the court: “A girl stands at the door and a generation waits outside” (p.27).  When Bluford won a victory in court in 1942, Missouri chose to close its journalism school, citing low wartime enrollment, rather than admit her.  But with her uncanny ability to find “significance in small acts of decency and mutual acknowledgement in everyday encounters” (p.11), Bluford turned her energies to reporting on school desegregation cases throughout the country, including both Sipuel’s quest to enter the University of Oklahoma Law School and the Kansas desegregation cases that led to Brown.

Sipuel agreed to challenge the University of Oklahoma Law School’s refusal to admit African-Americans only after her brother Lemuel turned down the NAACP’s request to serve as plaintiff in the case.  In 1946, she refused Oklahoma’s offer to create a separate “Negro law school,” and two years later won a major Supreme Court case when the Court ruled that Oklahoma was obligated to provide her with legal education equal to that of whites.  Sipuel became the near-perfect first at the law school, Devlin writes, personifying the uncommon array of skills required in that sensitive position:  “personal ambition combined with an ability to withstand public humiliation, charisma in front of the camera and self-sacrificing patience, the appearance of openness with the black and white press corps alongside an implacable determination” (p.67).

The “girl who started the fight,” as one black newspaper described Sipuel, became “something of a regional folk hero” (p.52) as a role model for future desegregation plaintiffs.  The “revelation that school desegregation was in their grasp came not from the persuasive power of NAACP officials and lawyers,” Devlin writes, but from the “‘young girl’ who would not be turned down” (p.37).  Sipuel went on to become the law school’s first African American graduate and thereafter the first African-American to pass the Oklahoma bar.

Sipuel’s engaging and exuberant public persona contrasted with that of Herman Sweatt, who sought to enter the University of Texas’s flagship law school in Austin.  In a 1950 case bearing his name, Sweatt v. Painter, the Supreme Court rejected Texas’ contention that it could satisfy the requirements of the constitution’s equal protection clause by consigning Sweatt to a “Negro law school” it had established in Houston.  The Court’s sweeping decision outlawed segregation in its entirety in graduate school education.  But although Sweatt did not go missing in action like Lloyd Gaines, he never completed his course of study at the University of Texas Law School and proved to be ill suited to the high-visibility, high-pressure role of a desegregation plaintiff.  He exuded neither Sipuel’s enthusiastic commitment to desegregated higher education, nor her grace under fire.

As the Supreme Court was rewriting the rules of university level education, dozens of cases challenging primary and secondary school segregation were percolating in jurisdictions across America, with Washington, D.C., and Merriam, Kansas, near Kansas City, providing the book’s most memorable characters.  Rigidly segregated Washington, the nation’s capital, had several lawsuits going simultaneously, each of which featured a strong father standing behind a courageous daughter.

First out of the gate was 14-year-old Marguerite Carr.  Amidst much fanfare, in 1947 Marguerite’s father took her to enroll at a newly built white middle school two blocks from her home, where she faced off with the school principal.  When the principal told her, “you don’t want to come here,” Carr smiled, a “sign of social reciprocity, trustworthiness, a willingness to engage,” yet at the same time told the principal respectfully but firmly, “I do want to come to this school” (p.ix).  Carr’s combative response was pitch perfect, Devlin argues, meeting the “contradictory requirements inherent in such confrontations” (p.ix).

Marguerite’s court case coincided with that of Karla Galaza, a Mexican-American who had been attending a black vocational school with a strong program in dress design until school authorities discovered that she was not black and barred her from the school.  Her stepfather, a Mexican-American activist, filed suit on his stepdaughter’s behalf.  Simultaneously, Gardner Bishop surged into a leadership position during an African-American student strike challenging segregated education in Washington.  Bishop, by day a barber, was an activist who thrust his somewhat reluctant daughter Judine into the strike and subsequent litigation.  Bishop described himself as an outsider in Washington’s desegregation battle, representing the city’s African-American working class rather than its black middle class.  None of these cases culminated in a major court decision.

The NAACP later chose Spotswood Bolling as the lead plaintiff over a handful of girls in the lawsuit that accompanied Brown to the Supreme Court.  The young Bolling was another elusive male plaintiff, dodging all reporters and photographers.  His discomfort with the press “sets in high relief the performances of girl plaintiffs with reporters in the late 1940s” (p.173), Devlin argues.  Girls and young women “felt it was their special responsibility to find ways to address such inquiries. Bolling evidently did not” (p.174).  But the case bearing his name, Bolling v. Sharpe, decided at the same time as Brown, held that segregation in Washington’s public schools was unconstitutional even though, as a federal district rather than a state, Washington was not technically bound by the constitution’s equal protection clause.

In South Park, Kansas, an unincorporated section of Merriam, located outside Kansas City, Esther Brown, arguably the book’s most unforgettable character, led a student strike over segregated schools.  Brown, a 23-year-old Jewish woman and a committed radical and communist sympathizer, cast herself as merely a “housewife with a conscience” — a “deliberately humble, naïve, and conservative image” (p.108) that she invoked constantly in her dealings with the public.  Lucille Bluford covered the strike for the Kansas City Call.  Bluford and the “White Mrs. Brown,” as she was called, subsequently became friends (Esther Brown was not related to Oliver Brown, the named plaintiff in the Brown case).

During the South Park student strike, Esther Brown went out on a limb to promise that she would find a way to pay the teachers herself.  She organized a Billie Holiday concert, but most of her fundraising targeted people of modest means – farmers, laborers, and domestics.  She eventually persuaded Thurgood Marshall that the NAACP should initiate a court case, despite Marshall’s initial reservations — he was suspicious of what he described as a “one woman show” (p.125).  Although the lawsuit was filed on behalf of equal numbers of boys and girls, Patricia Black, then eight years old, was chosen to testify in court — “setting another pattern of female participation for the cases to come” (p.111).  Black, who wore a white bow in her hair when she testified, reflected years later that she had been “taught how to act,” which meant “having manners . . . sitting up straight . . . making eye contact, being erect, and [being] nice” (p.139).

The South Park lawsuit led to the NAACP’s first major desegregation victory below the university level.  Black grade school students successfully entered the white school in the fall of 1949. The South Park case also inspired the challenge to segregated schooling in Topeka that culminated in the Brown decision.  At the trial in Brown, a 9-year-old girl, Kathy Cape, accepted the personal risk and outsized responsibility of testifying, rather than the case’s named plaintiff, Oliver Brown.

With the Supreme Court’s ruling in Brown meriting barely more than a page, Devlin turns in the last third of the book to the schoolgirls who entered previously all white schools in the aftermath of the ruling.  Here, more than in earlier portions of the book, she describes in stark terms the white opposition to desegregation which, although widespread, was especially ferocious in the Deep South, where the “vast majority of school boards angrily fought school desegregation with every resource available to them” (p.192).  Devlin notes that between 1955 and 1958, southern legislatures passed nearly five hundred laws to impede implementation of Brown.

In New Orleans, three girls, Tessie Prevost, Leona Tate and Gail Etienne, were chosen to be firsts as eight-year-olds at Semmes Elementary School.  Years later, Tessie described to Devlin what she, Leona and Gail had endured at Semmes.  Administrators, teachers, and fellow pupils “did everything in their power to break us” (p.213-14), Prevost recounted.  Even teachers incited violence against the girls:

The teachers were no better than the kids. They encouraged them to fight us, to do whatever it took.  Spit on us. We couldn’t even eat in the cafeteria; they’d spit on our food – we could hardly use the restrooms  . . . They’d punch you, trip you, kick you . . . They’d push you down the steps . . . I got hit by a bat . . . in the face . . . It was every day. And the teachers encouraged it . . . Every day.  Every day (p.214).

The New Orleans girls’ experience was typical of the young firsts from the other Southern communities Devlin studied, including Baton Rouge, Louisiana, Albany, Georgia and Charleston, South Carolina.  Nearly all experienced relentless abuse, “not simply violence and aggression but a systemic, all-encompassing, organized form of endless oppression” (p.214). Throughout the South, black schoolgirls demonstrated an extraordinary ability to “withstand warfare within the school when others could not,” which Devlin characterizes as a “barometer of their determination, courage, ability, and strength” (p.218).

* * *

Devlin acknowledges a growing contemporary disillusionment with the Brown decision and school integration generally among legal scholars, historians and ordinary African-Americans.  But the school desegregation firsts who met with Devlin for this book uniformly believe that their actions more than a half-century earlier had “transformed the arc of American history for the better” (p.268).   Even if Brown no longer occupies quite the exalted place it once enjoyed in the iconography of the modern Civil Rights Movement, the schoolgirls and supporting adults whom Devlin portrays in this deeply researched account deserve our full admiration and gratitude.

 

Thomas H. Peebles

La Châtaigneraie, France

April 8, 2020

 


A Time for New Thinking

 

Arthur Haberman, 1930: Europe in the Shadow of the Beast

(Wilfrid Laurier University Press) 

 

            Anxiety reigned in Europe in 1930.  The Wall Street stock market crash of the previous October and the ensuing economic crisis that was spreading across the globe threatened to undo much of the progress that had been made in Europe after recovering from the self-inflicted catastrophe of World War I.  A new form of government termed fascism was firmly in place in Italy, based on xenophobic nationalism, irrationality, and an all-powerful state.  Fascism seemed antithetical in just about every way to the universal, secular and cosmopolitan values of the 18th century Enlightenment.  In what was by then known as the Soviet Union, moreover, the Bolsheviks who had seized control during World War I were firmly in power in 1930 and were still threatening, as they had in the immediate post-war years, to spread anti-capitalist revolution westward across Europe.  And in Germany, Adolf Hitler and his unruly Nazi party realized previously unimaginable success in legislative elections in 1930, as they challenged the fragile Weimar democracy.  But if anti-democratic political movements and economic upheavals made average citizens across Europe anxious in 1930, few foresaw the extent of the carnage and destruction that the next 15 years would bring. Things were about to get worse — much worse.

In 1930: Europe in the Shadow of the Beast, Arthur Haberman, professor of history and humanities at York University, seeks to capture the intellectual and cultural zeitgeist of 1930. “What makes 1930 such a watershed is that rarely have so many important minds worked independently on issues so closely related,” Haberman writes. “All argued that something was seriously amiss and asked that people become aware of the dilemma” (p.1).  Haberman focuses on how a handful of familiar thinkers and artists expressed the anxiety that their fellow citizens felt; and how, in different ways, these figures foreshadowed the calamities that lay ahead for Europe.  There are separate chapters on Thomas Mann, Virginia Woolf, Aldous Huxley, Ortega y Gasset, Bertolt Brecht, and Sigmund Freud, each the subject of a short biographical sketch.  But each either published a major work or had one in progress in the 1929-31 time frame, and Haberman’s sketches revolve around these works.  He also includes the lesser-known sisters Paulette and Jane Nardal, Frenchwomen of African descent who promoted writing that expressed identity and solidarity between blacks in Europe, the Americas and Africa.  Another chapter treats the visual arts in 1930, with a dissection of the various schools and tendencies of the time, among them surrealism, cubism, and fauvism.

But before getting to these figures and their works, Haberman starts with a description of an unnamed, composite European middle-class couple living in a major but unidentified city in one of the World War I belligerent countries.  With all the maimed young men walking the streets using canes and crutches, the “metaphor of sickness and a need to be healed was part of everyday life” (p.7) for the couple.  The couple’s unease was “mirrored by the intellectuals they admired, as they all grappled with what Europe had become and where it was heading” (p.15).

In an extensive final chapter, “Yesterday and Today,” and an Epilogue, “Europeans Today” — together about one quarter of the book — Haberman assigns himself the herculean task of demonstrating the continued relevance of his figures in contemporary Europe.  Here, he seeks to summarize European anxiety today and the much-discussed European crisis of confidence, especially in the aftermath of the 2008 economic downturn.  It’s an overly ambitious undertaking and the least successful portion of the book.

The key figures Haberman portrays in the book’s first portions were a diverse lot, and it would be an uphill task to tie them together into a neat conceptual package. But if there is a common denominator linking them, it is the specter of World War I, the “Great War,” and the reassessment of Western civilization that it prompted.  The Great War ended the illusion that Europe was at the forefront of civilization and introduced “deep cultural malaise” (p.6).  The “so-called most civilized people on earth committed an unprecedented carnage on themselves” (p.36).  It was thus necessary to think in new ways.

Haberman identifies a cluster of related subjects that both represented this new thinking and heightened the anxiety that average Europeans were sensing about themselves and their future in 1930. They include: the viability of secular Enlightenment values; coming to terms with a darker view of human nature; the rise of the politics of irrationality; mass culture and its dangers; fascism as a norm or aberration; identity and the Other in the midst of Western Civilization; finding ways to represent the post-war world visually; and dystopian trends of thought.  The new thinking thus focused squarely on what it meant to be European and human in 1930.

* * *

            None of the figures in Haberman’s study addressed more of these subjects in a single work than the Spanish thinker Ortega y Gasset, whose Revolt of the Masses appeared in 1930.  Here, Ortega confronted the question of the viability of liberal democracy and the durability of the Enlightenment’s core values.  Ortega emphasized the potential in liberal democracies for irrationality and emotion to override reason in determining public choices.  He described a new “mass man” who behaved through “instinct and desire,” could be “violent and brutal” (p.55), and “will accept, even forward, both political and social tyranny” (p.53).  Ortega referred to Bolshevism and Fascism as “retrograde and a new kind of primitivism” (p.54).  The two ideologies, he concluded, gave legitimacy to the brutality he saw cropping up across Europe.

Although Ortega posited a dark view of human nature, it was not far from what had been apparent in the works of Sigmund Freud for decades prior to 1930.  Freud, whom Haberman ranks on par with Einstein as the most famous and influential intellect of his time, was 74 years old in 1930.  Although ill with throat cancer that year, Freud used an extended essay, Civilization and its Discontents, to reflect upon the conscious and unconscious, on sanity, insanity, and madness, and on the contradictions we live with.  His reflections became “central to how humans understood themselves as individuals and social beings” (p.143).

Culture and civilization are more fragile than we had thought, Freud contended. We must constantly reinforce those things that keep civilization going: “the limitations on our sexual life, the rule of law, the restrictions on our aggressive nature, and the hopeless commandment to love our neighbors, even if we don’t like them” (p.150).  The insights from Civilization and its Discontents and Freud’s other works were taken up in literature, art and the study of religion, along with philosophy, politics and history.  These Freudian insights opened for discussion “matters that had been sealed” (p.162), changing the way we think about ourselves and our nature.  Freud “tried to be a healer in a difficult time,” Haberman writes, one who “changed the discourse about humans and society forever” (p.162).

Virginia Woolf claimed she had not read Freud when she worked on The Waves, an experimental novel, throughout 1930.  The Waves nonetheless seemed to echo Freud, especially in its idea that the unconscious is a “layer of our personality, perhaps the main layer.  All of her characters attempt to deal with their inner lives, their perceptions” (p.44). In The Waves, Woolf adopted the idea that human nature is “very complex, that we are sometimes defined by our consciousness of things, events, people and ourselves, and that there are layers of personality” (p.43).  There are six different narrative voices in The Waves.  The characters sometimes seem to meld into one another.

Woolf had already distinguished herself as a writer heavily invested in the women’s suffrage movement and had addressed in earlier writings how women can achieve freedom independently of men.  Haberman sees Woolf as part of a group of thinkers who “set the stage for the more formal introduction of existentialism after the Second World War . . . She belongs not only to literature but to modern philosophy” (p.46).

With Mario and the Magician, completed in 1930, novelist Thomas Mann made his first explicit foray into political matters.  Mann, as famous in Germany as Woolf was in Britain, suggested in his novel that culture and politics were intertwined in 1930 as never before.  By that year, Mann had become an outspoken opponent of the Nazi party, which he described as a “wave of anomalous barbarism, of primitive popular vulgarity” (p.29).  Mario and the Magician, involving a German family visiting Italy, addressed the implications of fascism for Italy and Europe generally.

Like Ortega, Mann in his novel examined the “abandonment of personality and individual responsibility on the part of the person who joins the crowd” (p.24).  Like Freud, Mann saw humanity as far more irrational and complicated than liberal democracy assumed.  The deified fascist leader in Mann’s view goes beyond simply offering policy solutions to “appeal to feelings deep in our unconscious and [tries] to give them an outlet” (p.24).  Mann was in Switzerland when the Nazis assumed power in 1933.  His children advised him not to return to Germany, and he did not do so until 1949.  He was stripped of his German citizenship in 1936 as a traitor to the Reich.

Still another consequential novel, Aldous Huxley’s Brave New World (written in this period, though not published until 1932), was one of the 20th century’s first overtly dystopian works of fiction, along with Yevgeny Zamiatin’s We (both influenced George Orwell’s 1984, as detailed in Dorian Lynskey’s study of Orwell’s novel, reviewed here last month).  Brave New World used “both science and psychology to create a future world where all are happy, there is stability, and conflict is ended” (p.132).  The dystopian novel opened the question of the ethics of genetic engineering.  In 1930, eugenics was considered a legitimate branch of science, a way governments sought to deal with the undesirables in their population, especially those they regarded as unfit.  Although bioethics was not yet a field in 1930, Huxley’s Brave New World made a contribution to its founding.  Huxley’s dystopian work is a “cautionary tale that asks what might happen next.  It is science fiction, political philosophy, ethics, and a reflection on human nature all at once” (p.132).

Haberman’s least familiar figures, and for that reason perhaps the most intriguing, are the Nardal sisters, Paulette and Jane, French citizens of African descent, born in Martinique and living in 1930 in Paris.  The sisters published no major works equivalent to Civilization and Its Discontents or Revolt of the Masses.  But they founded the highly consequential La Revue du Monde Noir, a bi-lingual, French and English publication that featured contributions from African-American writers associated with the Harlem Renaissance, along with French-language intellectuals.   Writings in La Revue challenged head-on the notions underlying French colonialism.

Although France in 1930 was far more welcoming to blacks than the United States, the French vision of what it meant to be black was, as Haberman puts it, a “colonialist construction seen through the eyes of mainly white, wealthy elites” (p.89) that failed to acknowledge the richness and variety of black cultures around the world.  Educated blacks in France were perceived as being “in the process of becoming cosmopolitan, cultured people in the French tradition, a process they [the French] called their mission civilisatrice” (p.89).  Like many blacks in France, Paulette and Jane Nardal “refused to accept this formulation and decided that their identity was more varied and complex than anything the French understood” (p.89).

The Nardal sisters advanced the notion of multiple identities, arguing that the black spirit could be “informed and aided by the association with the West, without losing its own core” (p.92).   Blacks have an “alternative history from that of anyone who was white and born in France. Hence, they needed to attempt to get to a far more complex concept of self, one deeper and richer than those in the majority and the mainstream” (p.100).   The Nardals also came to understand the connection between black culture in Europe and gender.  Black women, “like many females, are a double Other, and this makes them different not only from whites but from Black men as well” (p.101; but conspicuously missing in this work is any sustained discussion of the Jew as the Other, even though anti-Semitism was rising alarmingly in Germany and elsewhere in Europe in 1930).

Between 1927 and 1933, Bertolt Brecht collaborated with Kurt Weill to rethink theatre and opera.  Brecht, alone among the thinkers Haberman portrays, brought an explicit Marxist perspective to his work.  Brecht supplied both the lyrics and dialogue to the pair’s plays, while Weill composed the music.  The Threepenny Opera, their joint work first performed in Berlin in 1928, was a decidedly non-traditional opera that proved to be a spectacular success in Weimar Germany.

In 1930, Brecht and Weill produced The Rise and Fall of the City of Mahagonny, an even less traditional production.  Brecht termed Mahagonny “epic theatre,” whose purpose was “not to entertain or provide the audience with an imitation of their lives” (p.70), but rather to engage the audience in issues of social justice.  Epic theatre was designed to “force the spectator to be active, to query his own assumptions” (p.78).

Haberman describes Mahagonny as an angry anti-capitalist production, a strange sort of “utopia of desire,” where money rules.  Its lesson: in a capitalist society, all is “commoditized, no relationship is authentic . . . [M]oney cannot satisfy human needs” (p.81-82).  The Nazis, who enjoyed increased popular support throughout 1930, regularly demonstrated against Mahagonny performances.  Both Brecht and Weill fled Germany when the Nazis came to power in early 1933.  Neither The Threepenny Opera nor Mahagonny was performed again in Germany until after World War II.

Haberman sees Brecht and Weill as stage and musical companions to surrealist painters such as René Magritte and Salvador Dali, who were also juxtaposing traditional elements to force audiences to ask what was really going on.  Magritte’s The Key to Dreams, a name that is a direct reference to Freud, was a painting about painting and how we construct reality.  Words are not the objects themselves, Magritte seemed to be saying.  Paintings can refer to an object but are not the object itself.   Salvador Dali was the rising star of surrealism in 1930.  His paintings were at once “provocative, mythic, and phallic, while also using juxtaposition to great effect” (p.115).  As with Magritte, the code of understanding in Dali paintings is “closer to Freudian psychology than it is to ‘reason’” (p.115).

The most transformative shift in the visual arts by 1930 was the abandonment of mimesis, the idea that a work of art should represent external reality.  Artists from the many varying schools regarded external reality as “just appearance, not reality at all.  Now it was necessary to go through or beyond appearance to investigate what was real” (p.107).  Artists like Pablo Picasso, Georges Braque and Henri Matisse “wanted a painting to be seen holistically before being analyzed in its parts” (p.118).  Like Woolf in literature, these artists by 1930 were depicting “multiple realities,” with the “whole, deep world of the unconscious guiding us” (p.108).

In the end, Haberman concludes, the perspective of the major artists of 1930 was in line with that of the writers he portrays. All in their own way:

feared where humanity was headed, in some cases they feared what they discovered about human nature. They wrote and created art. They did so in order to both help us know about ourselves and offer some redemption for a hard time. They did so because, in spite of their fears, and in spite of their pessimism, they had hope that our better nature would triumph.  Their works are [as] relevant today as they were in 1930 (p.212).

* * *

                        Articulating their contemporary relevance is the purpose of Haberman’s extensive final chapter and epilogue, where he also seeks to summarize contemporary Europe’s zeitgeist.  The Enlightenment faith in the idea and inevitability of progress has now “more or less ended,” he argues, and the world “no longer seems as automatically better as time moves on” (p.171) – the core insight which World War I provided to the generation of 1930.  The politics of irrationality of the type that so worried Ortega seems again resurgent in today’s Europe.  Nationalism – in Haberman’s view, the most influential of the modern ideologies born in the 19th century – “persists and appears to be growing in Europe in a more frightening manner, in the rise of racist neo-fascist and quasi-fascist parties in many countries. What was once thought impossible after the defeat of Hitlerian Germany is now coming into being” (p.168).

Despite the rise of European social democracy in the aftermath of World War II, there is a trend toward concentration of wealth in fewer and fewer hands, with the gap between the rich and poor widening.  Traditional religion has less hold on Europeans today than it did in 1930 — although it had no apparent hold on any of the writers and artists Haberman features.  The question of the place for the Other – marginalized groups like the blacks of the Nardal sisters’ project – has come to the fore in today’s Europe.  Haberman frames the question as whether today’s Europe, theoretically open, liberal, tolerant and egalitarian, is so “only for those who conform to the norm – who are white, indigenous to whatever place they live, nominally or deeply Christian, and identifying strongly with the nation.”  Or is there something “built into European culture as it is taught and practiced that automatically marginalizes women, Blacks, Jews, Roma, and Muslims?” (p.185).

After posing this unanswerable question, Haberman finishes by returning to his composite couple, explaining how their lives were changed by events between 1930 and 1945.  They lost a son in battle in World War II and some civilian relatives were also killed.  Haberman then fast-forwards to the couple’s granddaughter, born in 1982, who married at age 30 and is now pregnant.   She and her husband are ambivalent about their future.  Peace is taken for granted in the way it was not in 1930.  But there is pessimism in the economic sphere.  The couple sees the tacit social contract between generations fraying. The issues that move the couple most deeply are the environment and concerns about climate change.

* * *

                Through his individual portraits, Haberman provides a creative elaboration upon the ideas with which leading thinkers and artists wrestled in the anxious year of 1930.  Describing the contemporary applications of those ideas, as he attempts to do in the latter portion of his work, would be a notable accomplishment for an entire book, and his attempt to do so here falls flat.

Thomas H. Peebles

La Châtaigneraie, France

March 15, 2020


A Defense of Truth

Dorian Lynskey, The Ministry of Truth:

The Biography of George Orwell’s 1984 

                           George Orwell’s name, like that of William Shakespeare, Charles Dickens and Franz Kafka, has given rise to an adjective.  “Orwellian” connotes official deception, secret surveillance, misleading terminology, and the manipulation of history.  Several terms used in Orwell’s best-known novel, Nineteen Eighty-Four, have entered into common usage, including “doublethink,” “thought crime,” “newspeak,” “memory hole,” and “Big Brother.”  First published in June 1949, a little over a half year prior to Orwell’s death in January 1950, Nineteen Eighty-Four is consistently described as a “dystopian” novel – a genre of fiction which, according to Merriam-Webster, pictures “an imagined world or society in which people lead wretched, dehumanized, fearful lives.”

This definition fits neatly the world that Orwell depicted in Nineteen Eighty-Four, a world divided between three intercontinental superstates perpetually at war, Oceania, Eurasia and Eastasia, with Britain reduced to a province of Oceania bearing the sardonic name “Airstrip One.”  Airstrip One is ruled by The Party under the ideology Ingsoc, a shortening of “English socialism.”  The Party’s leader, Big Brother, is the object of an intense cult of personality — even though there is no hard proof he actually exists.  Surveillance through two-way telescreens and propaganda are omnipresent.  The protagonist, Winston Smith, is a diligent lower-level Party member who works at the Ministry of Truth, where he rewrites historical records to conform to the state’s ever-changing version of history.  Smith enters into a forbidden relationship with his co-worker, Julia, a relationship that terminates in mutual betrayal.

In his intriguing study, The Ministry of Truth: The Biography of George Orwell’s 1984, British journalist and music critic Dorian Lynskey seeks to explain what Nineteen Eighty-Four “actually is, how it came to be written, and how it has shaped the world, in its author’s absence, over the past seventy years” (p.xiv).  Although there are biographies of Orwell and academic studies of Nineteen Eighty-Four’s intellectual context, Lynskey contends that his is the first to “merge the two streams into one narrative, while also exploring the book’s afterlife” (p.xv; I reviewed Thomas Ricks’ book on Orwell and Winston Churchill here in November 2017).  Lynskey’s work is organized in a “Before/After” format.  Part I, about two-thirds of the book, looks at the works and thinkers who influenced Orwell and his novel, juxtaposed with basic Orwell biographical background.  Part II, roughly the last third, examines the novel’s afterlife.

But Lynskey begins in a surprising place, Washington, D.C., in January 2017, where a spokesman for President Donald Trump told the White House press corps that the recently elected president had taken his oath of office before the “largest audience to ever witness an inauguration – period – both in person and around the globe.”  A presidential adviser subsequently justified this “preposterous lie” by characterizing the statement as “alternative facts” (p.xiii).  Sales of Orwell’s book shot up immediately thereafter.  The incident constitutes a reminder, Lynskey contends, of the “painful lessons that the world appears to have unlearned since Orwell’s lifetime, especially those concerning the fragility of truth in the face of power” (p.xix).

How Orwell came to see the consequences of mutilating truth and gave them expression in Nineteen Eighty-Four is the focus of Part I.  Orwell’s brief participation in the Spanish Civil War, from December 1936 through mid-1937, was paramount among his personal experiences in shaping the novel’s worldview.  Spain was the “great rupture in his life; his zero hour” (p.4), the experience that led Orwell to the conclusion that Soviet communism was as antithetical as fascism and Nazism to the values he held dear (Lynskey’s list of Orwell’s values: “honesty, decency, fairness, memory, history, clarity, privacy, common sense, sanity, England, and love” (p.xv)).  While no single work provided an intellectual foundation for Nineteen Eighty-Four in the way that the Spanish Civil War provided the personal and practical foundation, Lynskey discusses numerous writers whose works contributed to the worldview on display in Orwell’s novel.

Lynskey dives deeply into the novels and writings of Edward Bellamy, H.G. Wells and the Russian writer Yevgeny Zamyatin.  Orwell’s friend Arthur Koestler set out what Lynskey terms the “mental landscape” for Nineteen Eighty-Four in his 1940 classic Darkness at Noon, while the American conservative James Burnham provided the novel’s “geo-political superstructure” (p.126).  Lynskey discusses a host of other writers whose works in one way or another contributed to Nineteen Eighty-Four’s world view, among them Jack London, Aldous Huxley, Friedrich Hayek, and the late 17th and early 18th century satirist Jonathan Swift.

In Part II, Lynskey treats some of the dystopian novels and novelists that have appeared since Nineteen Eighty-Four.  He provides surprising detail on David Bowie, who alluded to Orwell in his songs and wrote material that reflected the outlook of Nineteen Eighty-Four.  He notes that Margaret Atwood termed her celebrated The Handmaid’s Tale a “speculative fiction of the George Orwell variety” (p.241).  But the crux of Part II lies in Lynskey’s discussion of the evolving interpretations of the novel since its publication, and why it still matters today.  He argues that Nineteen Eighty-Four has become both a “vessel into which anyone could pour their own version of the future” (p.228), and an “all-purpose shorthand” for an “uncertain present” (p.213).

In the immediate aftermath of its publication, when the Cold War was at its height, the novel was seen by many as a lesson on totalitarianism and the dangers that the Soviet Union and Communist China posed to the West (Eurasia, Eastasia and Oceania in the novel correspond roughly to the Soviet Union, China and the West, respectively).  When the Cold War ended with the fall of the Soviet Union in 1991, the novel morphed into a warning about the invasive technologies spawned by the Internet and their potential for surveillance of individual lives.  In the Age of Trump and Brexit, the novel has become “most of all a defense of truth . . . Orwell’s fear that ‘the very concept of objective truth is fading out of the world’ is the dark heart of Nineteen Eighty-Four. It gripped him long before he came up with Big Brother, Oceania, Newspeak or the telescreen, and it’s more important than any of them” (p.265-66).

* * *

                            Orwell was born as Eric Blair in 1903 in India, where his father was a mid-level civil servant.  His mother was half-French and a committed suffragette.  In 1933, prior to publication of his first major book, Down and Out in Paris and London, which recounts his life in voluntary poverty in the two cities, the fledgling author took the pen name Orwell from a river in Suffolk.  He changed names purportedly to save his parents from the embarrassment which he assumed his forthcoming work would cause.  He was at best a mid-level journalist and writer when he went to Spain in late 1936, with a handful of novels and lengthy essays to his credit – “barely George Orwell” (p.4), as Lynskey puts it.

The Spanish Civil War erupted after Spain’s Republican government, known as the Popular Front, a coalition of liberal democrats, socialists and communists, narrowly won a parliamentary majority in 1936, only to face a rebellion from the Nationalist forces of General Francisco Franco, representing Spain’s military, business elites, large landowners and the Catholic Church.  Nazi Germany and Fascist Italy furnished arms and other assistance for the Nationalists’ assault on Spain’s democratic institutions, while the Soviet Union assisted the Republicans (the leading democracies of the period, Great Britain, France and the United States, remained officially neutral; I reviewed Adam Hochschild’s work on the Spanish Civil War here in August 2017).  Spain provided Orwell with his first and only personal exposure to the “nightmare atmosphere” (p.17) that would envelop the novel he wrote a decade later.

Fighting with the Workers’ Party of Marxist Unification (Spanish acronym: POUM), a renegade working class party that opposed Stalin, Orwell quickly found himself in the middle of what amounted to a mini-civil war among the disparate left-wing factions on the Republican side, all within the larger civil war with the Nationalists.  Orwell saw first-hand the dogmatism and authoritarianism of the Stalinist left at work in Spain, nurtured by a level of deliberate deceit that appalled him.  He read newspaper accounts that did not even purport to bear any relationship to what had actually happened. For Orwell previously, Lynskey writes:

people were guilty of deliberate deceit or unconscious bias, but at least they believed in the existence of facts and the distinction between true and false. Totalitarian regimes, however, lied on such a grand scale that they made Orwell feel that ‘the very concept of objective truth is fading out of the world’ (p.99).

Orwell saw totalitarianism in all its manifestations as dangerous not primarily because of secret police or constant surveillance but because “there is no solid ground from which to mount a rebellion –no corner of the mind that has not been infected and warped by the state.  It is power that removes the possibility of challenging power” (p.99).

Orwell narrowly escaped death when he was hit by a bullet in the spring of 1937.  He was hospitalized in Barcelona for three weeks, after which he and his wife Eileen escaped across the border to France.  Driven to Spain by his hatred of fascism, Orwell left with a “second enemy. The fascists had behaved just as appallingly as he had expected they would, but the ruthlessness and dishonesty of the communists had shocked him” (p.18).  From that point onward, Orwell criticized communism more energetically than fascism because he had seen communism “up close, and because its appeal was more treacherous. Both ideologies reached the same totalitarian destination but communism began with nobler aims and therefore required more lies to sustain it” (p.22).   After his time in Spain, Orwell knew that he stood against totalitarianism of all stripes, and for democratic socialism as its counterpoint.

The term “dystopia” was not used frequently in Orwell’s time, and Orwell distinguished between “favorable” and “pessimistic” utopias.   Orwell developed what he termed a “pitying fondness” (p.38) for nineteenth-century visions of a better world, particularly the American Edward Bellamy’s 1888 novel Looking Backward.  This highly popular novel contained a “seductive political argument” (p.33) for the nationalization of all industry, and the use of an “industrial army” to organize production and distribution.  Bellamy had what Lynskey terms a “thoroughly pre-totalitarian mind,” with an “unwavering faith in human nature and common sense” that failed to see the “dystopian implications of unanimous obedience to a one-party state that will last forever” (p.38).

Bellamy was a direct inspiration for the works of H.G. Wells, one of the most prolific writers of his age. Wells exerted enormous influence on the young Eric Blair, looming over the boy’s childhood “like a planet – awe inspiring, oppressive, impossible to ignore – and Orwell never got over it” (p.60).  Often called the English Jules Verne, Wells foresaw space travel, tanks, electric trains, wind and water power, identity cards, poison gas, the Channel tunnel and atom bombs.  His fiction imagined time travel, Martian invasions, invisibility and genetic engineering.  The word Wellsian came to mean “belief in an orderly scientific utopia,” but his early works are “cautionary tales of progress thwarted, science abused and complacency punished” (p.63).

Wells was himself a direct influence upon Yevgeny Zamyatin’s We which, in Lynskey’s interpretation, constitutes the most direct antecedent to Nineteen Eighty-Four.  Finished in 1920 at the height of the civil war that followed the 1917 Bolshevik Revolution (but not published in the Soviet Union until 1988), We is set in the undefined future, a time when people are referred to only by numbers.  The protagonist, D-503, a spacecraft engineer, lives in the One State, where mass surveillance is omnipresent and all aspects of life are scientifically managed.  It is an open question whether We was intended to satirize the Bolshevik regime, in 1920 already a one-party state with extensive secret police.

Zamyatin died in exile in Paris in 1937, at age 53.   Orwell did not read We until sometime after its author’s death.  Whether Orwell “took ideas straight from Zamyatin or was simply thinking along similar lines” is “difficult to say” (p.108), Lynskey writes.  Nonetheless, it is “impossible to read Zamyatin’s bizarre and visionary novel without being strongly reminded of stories that were written afterwards, Orwell’s included” (p.102).

Koestler’s Darkness at Noon offered a solution to the central riddle of the Moscow show trials of the 1930s: “why did so many Communist party members sign confessions of crimes against the state, and thus their death warrants?”  Koestler argued that their “years of unbending loyalty had dissolved their belief in objective truth: if the Party required them to be guilty, then guilty they must be” (p.127).  To Orwell this meant that one is punished in totalitarian states not for “what one does but for what one is, or more exactly, for what one is suspected of being” (p.128).

The ideas contained in James Burnham’s 1941 book, The Managerial Revolution, “seized Orwell’s imagination even as his intellect rejected them” (p.122).  A Trotskyite in his youth who in the 1950s helped William F. Buckley found the conservative magazine National Review, Burnham saw the future belonging to a huge, centralized bureaucratic state run by a class of managers and technocrats.  Orwell made a “crucial connection between Burnham’s super-state hypothesis and his own long-standing obsession with organized lying” (p.121-22).

Orwell’s chronic lung problems precluded him from serving in the military during World War II.  From August 1941 to November 1943, he worked for the Indian Section of the BBC’s Eastern Service, where he found himself “reluctantly writing for the state . . . Day to day, the job introduced him to the mechanics of propaganda, bureaucracy, censorship and mass media, informing Winston Smith’s job at the Ministry of Truth” (p.83; Orwell’s boss at the BBC was notorious Cambridge spy Guy Burgess, whose biography I reviewed here in December 2017).   Orwell left the BBC in 1943 to become literary editor of the Tribune, an anti-Stalinist weekly.

While at the Tribune, Orwell found time to produce Animal Farm, a “scrupulous allegory of Russian history from the revolution to the Tehran conference” (p.138), with each animal representing an individual, Stalin, Trotsky, Hitler, and so on.  Animal Farm shared with Nineteen Eighty-Four an “obsession with the erosion and corruption of memory” (p.139).  Memories in the two works are gradually erased, first, by the falsification of evidence; second, by the infallibility of the leader; third, by language; and fourth, by time.  Published in August 1945, Animal Farm quickly became a best seller.  The fable’s unmistakable anti-Soviet message forced Orwell to remind readers that he remained a socialist.  “I belong to the Left and must work inside it,” he wrote, “much as I hate Russian totalitarianism and its poisonous influence in this country” (p.141).

Earlier in 1945, Orwell’s wife Eileen died suddenly after being hospitalized for a hysterectomy, less than a year after the couple had adopted a son, whom they named Richard Horatio Blair.  Orwell grieved the loss of his wife by burying himself in the work that culminated in Nineteen Eighty-Four.  But Orwell became ever sicker with tuberculosis as he worked over the next four years on the novel, which was titled The Last Man in Europe until almost immediately prior to publication (Lynskey gives no credence to the theory that Orwell selected 1984 as an inversion of the last two digits of 1948).

Yet Lynskey rejects the notion that Nineteen Eighty-Four was the “anguished last testament of a dying man” (p.160).  Orwell “never really believed he was dying, or at least no more than usual. He had suffered from lung problems since childhood and had been ill, off and on, for so long that he had no reason to think that this time would be the last” (p.160).  His novel was published in June 1949.  Orwell died 227 days later, in January 1950, when a blood vessel in his lung ruptured.

* * *

                                    Nineteen Eighty-Four had an immediate positive reception.  The book was variously compared to an earthquake, a bundle of dynamite, and the label on a bottle of poison.  It was made into a movie, a play, and a BBC television series.  Yet, Lynskey writes, “people seemed determined to misunderstand it” (p.170).  During the Cold War of the early 1950s, conservatives and hard-line leftists both saw the book as a condemnation of socialism in all its forms.  The more astute critics, Lynskey argues, were those who “understood Orwell’s message that the germs of totalitarianism existed in Us as well as Them” (p.182).  The Soviet invasion of Hungary in 1956 constituted a turning point in interpretations of Nineteen Eighty-Four.  After the invasion, many of Orwell’s critics on the left “had to accept that they had been wrong about the nature of Soviet communism and that he [Orwell] had been infuriatingly right” (p.210).

The hoopla that accompanied the actual year 1984, Lynskey notes wryly, came about only because “one man decided, late in the day, to change the title of his novel” (p.234).   By that time, the book was being read less as an anti-communist tract and more as a reminder of the abuses exposed in the Watergate affair of the previous decade, the excesses of the FBI and CIA, and the potential for mischief that personal computers, then in their infancy, posed.  With the fall of the Berlin wall and the end of communism between 1989 and 1991, focus on the power of technology intensified.

But today the focus is on Orwell’s depiction of the demise of objective truth in Nineteen Eighty-Four, and appropriately so, Lynskey argues, noting how President Trump masterfully “creates his own reality and measures his power by the number of people who subscribe to it: the cruder the lie, the more power its success demonstrates” (p.264).  It is truly Orwellian, Lynskey contends, that the phrase “fake news” has been “turned on its head by Trump and his fellow authoritarians to describe real news that is not to their liking, while flagrant lies become ‘alternative facts’” (p.264).

* * *

                                 While resisting the temptation to term Nineteen Eighty-Four more relevant now than ever, Lynskey asserts that the novel today is nonetheless “a damn sight more relevant than it should be” (p.xix).  An era “plagued by far-right populism, authoritarian nationalism, rampant disinformation and waning faith in liberal democracy,” he concludes, is “not one in which the message of Nineteen Eighty-Four can be easily dismissed” (p.265).

Thomas H. Peebles

La Châtaigneraie, France

February 25, 2020


Pursuit of the Heroic

Elisha Waldman, This Narrow Space:

A Pediatric Oncologist, His Jewish, Muslim and Christian Patients,

and a Hospital in Jerusalem

(Schocken, $25.95)

 Pietro Bartolo and Lidia Tilotta, Tears of Salt:

A Doctor’s Story (Norton, $25.95)

                           The practice of medicine at its best – preventing and curing diseases, relieving pain and easing human suffering, helping families manage grief – is almost by definition noble, with a built-in potential for heroism that few other professions enjoy (comparisons to the practice of law come uneasily to mind).  Of course, not all medical practitioners realize their potential for heroism.  But the pursuit of the heroic is at the heart of two doctors’ recent memoirs: This Narrow Space: A Pediatric Oncologist, His Jewish, Muslim and Christian Patients, and a Hospital in Jerusalem, by Dr. Elisha Waldman; and Tears of Salt: A Doctor’s Story, by Dr. Pietro Bartolo and Lidia Tilotta.

Dr. Waldman, an American-born physician, worked for seven years as a pediatric oncologist in Israel at Jerusalem’s Hadassah Hospital, where he treated young cancer patients amidst the complexities of Israeli society, rigidly divided between Jews and Arabs.  During the migrant crises of the 21st century’s second decade, Dr. Bartolo served as first medical responder for the waves of refugees from Africa and the Middle East arriving on the remote Mediterranean island of Lampedusa, a part of Italy but closer to Tunisia than to Sicily and the Italian mainland.  Bartolo was featured in the award-winning 2016 documentary film Fire at Sea, Fuocoammare in Italian.

Of the two memoirs, Dr. Waldman’s is the more layered.  Treating young cancer patients in Jerusalem is part of a larger story of Waldman’s move from the United States to Israel, termed aliyah in Hebrew, a kind of religious and spiritual homecoming.  His memoir involves both his spiritual quest to know and better understand his Jewish religious faith and his more pragmatic efforts to fit into Israeli society, which he found bewildering in its complexity.  At its core, Dr. Waldman’s memoir entails his search for himself and the place where he thinks he belongs.

Dr. Bartolo, by contrast, harbors few doubts or second thoughts about who he is or where he belongs.  He grew up on Lampedusa and, after studying medicine in Sicily, returned as a young doctor to practice medicine on his home island, at a time before it became a focal point in the migrant crises of the 2010s.  His story, as told by himself and co-author Lidia Tilotta (and ably translated from the Italian by Chenxin Jiang), consists of one incident after another of unflinching work in the face of constant emergency conditions, as he tries to save the lives and ease the pain of migrants to Europe seeking to escape turmoil elsewhere in the world.

* * *

                         Elisha Waldman, the son of a Conservative rabbi, grew up in Connecticut.  As an undergraduate at Yale, he majored in religious studies.   Zionism was a crucial part of his upbringing.  He traveled often to Israel as a youth, and spent four years in medical school in Tel Aviv.  He explains his decision in 2007 to return to Israel to practice medicine: “Where better to explore my faith and identity than in the country that I had been raised to think of as my other home?” (W, p.24).

Upon arriving in Israel, Waldman went to work immediately at Hadassah, one of Jerusalem’s leading hospitals.  His accounts of his efforts to provide care and comfort to children afflicted with cancer are touching.  We meet many of his patients.  They come from both Jewish and Arab families.  Only a few are cured.   Most die.  Waldman could speak to his Jewish patients and their parents in his rapidly improving Hebrew, if not in English, but had to work through a translator with most Arab patients and their families.

Whatever the language, finding the right formula for communication with his young patients and their typically distraught parents proved consistently elusive.  The art of effective communication, Waldman notes, is not part of a doctor’s formal training, yet success as a pediatric oncologist depends on it.  “And it is there that we so often fail,” he writes.  In Israel, the additional factors of “multiple cultural and religious traditions, the incendiary politics of the area, and language differences can make effective communication seem almost impossible” (W, p.70).   Waldman wondered whether his Palestinian patients saw him as “another occupier, a foreign transplant who has come to take their land” (W, p.65).   With his attempts to show empathy to his Palestinian patients and their families, he hoped to represent to them a “more compassionate, more liberal side of the Zionist enterprise” (W, p.65).

One graphic example of the rigid divide between Arabs and Jews in Israel occurred when Waldman had his initial meeting with an American-born Orthodox Jewish father of a child cancer patient.  Waldman’s trusted female assistant, Fatma, an Arab for whom he had great respect and an excellent professional rapport, was also part of the discussion.  Afterward, the father took Waldman aside and told him that he was “a little uncomfortable with an Arab being in charge of my son’s care” (W, p.87).  The astounded Waldman wanted to tell the father that this insult to a cherished colleague and friend was the “opposite of everything I came to Israel for, everything I believe the state should be” (W, p.87).  But because the man’s son was to be his patient, Waldman had to tread lightly.  “This is what the Zionist dream has become,” he writes despondently.  Two American Jews “sitting in Jerusalem, taking positions on the merits and trustworthiness of an Arab woman with roots in this city that go back centuries” (W, p.88).

The heart-breaking cases Waldman must deal with also raise the omnipresent question that he returns to throughout his memoir: how can the God Waldman wants to believe in allow these young people to be afflicted by this frequently fatal disease? Why must these innocent children suffer this cruel and unwarranted fate?  His initial reaction at being exposed to so much suffering was to give up on God.  “Why not simply abandon the idea of God, when so much evidence would seem to point to the impossibility of His existence?” (W, p.129).   But he fell back on the notion of “process theology,” a notion he had studied as an undergraduate which posits that God is good but limited: “Although He has a will in this world and wishes only for good, there are certain things He cannot accomplish because they are beyond His control” (W, p.130).

Process theology conforms in an odd way with how Waldman looks at his role as a doctor: “I want only the best for my patient, I do my best for them, but sometimes it’s just not possible to achieve the miracle they are hoping for” (W, p.130).  Waldman’s experiences with his young cancer-afflicted patients led him to sense that there is “some force, whether it’s an omnipotent divine being or the irrepressible human spirit, that gives life meaning . . .  And despite the sometimes harrowing things I see, my faith may wobble but it doesn’t ever fail completely” (W, p.126-27).

Largely unsuccessful in answering the overriding theological question of why “terrible things happen to good and innocent people” (W, p.126), Waldman realized a more immediate and tangible success in establishing Israel’s first palliative care unit at Hadassah.  The concept of palliative care, he explains, has generally been understood too narrowly, consisting strictly of measures to provide end-of-life dignity and comfort.  But there is a more holistic approach to palliative care that focuses upon providing a “broad spectrum of support for children and their families, regardless of the patient’s prognosis” (W, p.186).  A palliative care team might be called to “help think about particularly challenging symptoms, such as complex pain management or sleep issues.  Sometimes we are called to help patients and their families with difficult decision-making regarding treatments or interventions” (W, p.168).

In 2008, the medical profession in the United States recognized the holistic approach to palliative care as a separate medical subspecialty, with its own training programs and board certification.  But the approach was largely unknown in Israel.  When Waldman pitched the idea to Israeli colleagues, the initial reaction of many was skeptical.  “So you basically just talk? . . . That’s so American” (W, p.186), one told him.  Others, especially within the oncology department at Hadassah, were intrigued.  Waldman took a leave of absence from Hadassah to train in holistic palliative care in Boston.  He returned to Israel eager to advance what he had learned, and went on to become the country’s leading proponent of the approach.  He also engaged in fundraising in the United States to create facilities at Hadassah and elsewhere in Israel for the holistic approach to palliative care.

But Waldman’s efforts to institutionalize holistic palliative care in Israel ran up against several realities of Israeli life, among them a series of draconian cuts to Hadassah’s operating budget and an ensuing staff strike at the hospital.  Waldman became concerned that the funds he had raised for palliative care were being used for general hospital expenses, and was never able to get a clarifying answer from hospital administrators.  His efforts also coincided with a major civil conflict in Gaza and Jerusalem, after three Jewish teenagers had disappeared near the West Bank city of Hebron.  Three Israeli teenaged boys thereafter captured, tortured and killed a Palestinian boy in East Jerusalem, seemingly an act of revenge, and right wing Jewish thugs took to the street, beating up anyone who looked Palestinian or Arab.

By this time, Waldman had also met the woman he would marry, a fellow American living in Israel.  When he received an offer to set up and run a palliative care unit at a highly respected New York teaching hospital, with the job security he didn’t have in Israel, it turned out to be an offer he couldn’t refuse.  In 2014, Waldman and his future wife returned to the United States, leaving this reader feeling let down after being immersed in the heart-rending particulars of his interactions with his young cancer-afflicted patients in Jerusalem.  Today Dr. Waldman is chief of the division of pediatric palliative care at the Ann and Robert H. Lurie Children’s Hospital of Chicago.

* * *

                         Pietro Bartolo is the son of a fisherman, one of seven children.  As a boy, he learned from his father the skills needed to survive and succeed in this challenging line of work.  He begins his memoir with an incident at sea in which he nearly drowned, when the future doctor was 16 years old.  The trauma never left him.  He didn’t know it at the time, but his subsequent life would be “scarred by a capricious sea that spits out living or dead bodies at will” (B, p.14-15).  His memoir is a series of short anecdotes, not arranged in chronological or any other discernible order, in which his interactions with migrants arriving on Lampedusa are interspersed with accounts of his youth and personal background.  “Tears of salt,” the book’s title, refers to how saltwater winds and breezes mix with one’s own tears.  Such tears flow frequently when Bartolo proves unable to save the lives, or ease the injuries and pain, of refugees typically fleeing terrorism, civil war and political unrest.

Libya is the common departure point for most of the migrants arriving on Lampedusa, with a hellish desert crossing preceding a hellish journey across the sea.  The migrants’ ordeals are difficult to understand unless one has made the trip, Bartolo explains:

The heat is stifling. You are crammed onto a pickup truck, and if you so much as sit in the wrong place, you will be thrown out and left to die. When the water runs out, you are reduced to drinking your own urine. Finally you arrive in Libya and think the nightmare is over, but it has only just begun: ill treatment, prison, torture. Only if you manage to survive all of this do you finally make it onto a boat. Only then, if you do not die on the open sea, will you arrive at your destination and begin to hope that your life can start all over again (B, p.23).

The first wave of migrants began to arrive on Lampedusa in the first quarter of 2011, at the height of the Arab Spring.  Arrivals continued throughout the decade and surged a second time in 2015, in part as fallout from the civil war in Syria.  Two horrific shipwrecks took place near Lampedusa within days of one another in October 2013.  The first, with refugees from Eritrea, Somalia, and Ghana, caused nearly 400 deaths. The second, involving Syrians and Palestinians, resulted in 34 deaths.

One of the survivors of the latter shipwreck explained to Bartolo that when their boat capsized, he was carrying his nine-month-old daughter in his arms and trying to keep his three-year-old son and wife afloat.  With no help arriving, the man was faced with an irrevocable choice: if he kept treading water, all four of them would drown.  In the end, he opened his right hand, and let go of his son. He watched his son “disappear forever under the waves” (B, p.48-49).  Although a doctor is “not supposed to let his patients see that he is overwhelmed,” Bartolo could not help weeping with the man. “I did not have it in me to hold myself together” (B, p.49),  he writes.

Equally horrific is the story of a boat that arrived on Lampedusa in 2011.   Although the passengers seemed distraught, they also did not appear to have abnormal physical symptoms.  But in a freezer normally used for storing fish, Bartolo inadvertently stepped upon several corpses, mostly young people.  The young bodies were “naked, piled on top of each other, some with limbs intertwined. It was Dantesque” (B, p.144).  The victims had clearly been beaten and the traffickers had threatened and intimidated the survivors into silence.  For days afterward, Bartolo writes, “I could think of nothing else . . .When I thought about the brutes that did this, I saw red” (B, p.146).

As a witness to this level of inhumanity, Bartolo, although religious, is impatient with the type of ruminations on faith that run through Waldman’s memoir.  His response to the question how and why a loving God can allow large-scale human suffering is blunt:

God? God has nothing to do with this. It is human beings who are to blame, not God. Greedy, ruthless human beings who put their trust in money and power . . . those who are willing to let half the world live in poverty, who sanction conflict and even finance it. The problem is human beings, not God (B, p.120).

The 2016 documentary film Fire at Sea, the work of distinguished film director Gianfranco Rosi, turned out to be a proverbial godsend for Bartolo, who had been looking for a way to tell the world more about the migrant crisis on Lampedusa than what was contained in quickly forgotten news clips.  The film sets the dangerous sea crossing against everyday life on Lampedusa.  Bartolo saw the film for the first time in Berlin, where it was a finalist in the Golden and Silver Bears competition.  It was precisely what he had been seeking: a “raw, unequivocally clear message that would shatter all the lies and prejudice surrounding this issue, awaken the public conscience, and open people’s eyes” (B, p.143).  The film was “not just a documentary: it was a complicated narrative told at a measured pace and in hushed tones, but with captivating power and subtlety” (B, p.142).

In the memoir’s personal background anecdotes, Bartolo recounts how he grew up in the post-World War II era, a time in Italy when the sons – and, to a lesser extent, daughters — of fishermen, farmers and factory workers could for the first time realistically aspire to be doctors, lawyers, engineers or teachers.  Bartolo’s father, adamant that his son should not follow in his footsteps as a fisherman, sent him to boarding school in Sicily because Lampedusa lacked a quality secondary school.  While in boarding school, he met his future wife Rita.  Returning to Lampedusa after he completed his medical studies in Sicily was a difficult move for Rita.  The couple had three children, two girls, Grazia and Rosanna, followed by a boy, Giacomo, and Bartolo provides his readers with glimpses of each.

Looming inescapably in the background of Dr. Bartolo’s personal and professional stories is the island of Lampedusa, “[b]reathtakingly beautiful and breathtakingly remote” (B, p.18).  Lampedusa is not an easy place to live, Bartolo acknowledges, a “small piece of the earth’s crust that broke off from Africa and drifted toward Europe.  As such, it is something of a symbolic gateway between the two continents” (B, p.186).  The island was arguably the most welcoming spot on earth for incoming refugees during the last decade’s recurrent refugee crises.  Although there is no hard proof that it was so welcoming in large measure because of Dr. Bartolo’s efforts, readers instinctively feel that this is the case.  In 2019, Dr. Bartolo was elected to the European Parliament as a member of Italy’s center-left Democratic Party and now may be found as frequently in Brussels and Strasbourg as on Lampedusa.

* * *

                             The migrant crises of the last decade may have fueled ugly xenophobia, hyper-nationalism and racism among large swaths of Europeans.  But they brought out the heroism in Dr. Bartolo, which seems to jump off every page of his brutally forthright memoir.   Dr. Waldman’s efforts to bridge the seemingly intractable divides of modern Israel in caring for cancer-afflicted children are also heroic, but commingled in his memoir with his quest to find his inner self.   In different ways, each memoir constitutes a reminder of the nobility of which the medical profession is capable.

Thomas H. Peebles

Prospect, Kentucky USA

February 10, 2020


Lenny as Paterfamilias

Jamie Bernstein, Famous Father Girl:

A Memoir of Growing Up Bernstein (Harper)

In Famous Father Girl: A Memoir of Growing Up Bernstein, Jamie Bernstein, daughter of legendary conductor, composer and overall musical genius Leonard Bernstein (1918-1990), sheds light upon how she grew up in the shadow of the legend.  In Jamie’s early years, her family looked outwardly conventional, or at least conventional for the upper crust Manhattan milieu in which she and her two siblings were raised.  Jamie, the oldest child, was born in 1952; her brother Alexander followed two years later, and their younger sister Nina was born in 1962.

Their mother Felicia Montealegre – “Mummy” throughout the memoir — was a native of Chile and a Roman Catholic from a semi-aristocratic background, a contrast to her American-born Jewish husband from a first-generation immigrant family.  Felicia was an accomplished pianist and aspiring actress, an elegant and insightful woman who was highly engaged in the lives of her children and served as the family “policeman” and “stabilizer” (p.100).   But Felicia died of cancer in 1978 at age 56.

In 1951, Felicia married Jamie’s father, most frequently referred to here as “Daddy,” but also as “Lenny,” “LB,” and “the Maestro.”  Felicia’s husband was already a world-class conductor and composer when they married, and became ever more the celebrity as the couple’s three children grew up.  Jamie’s portrait of Bernstein the father and husband conforms to what most readers passingly familiar with Bernstein would anticipate: a larger than life figure who quickly filled up any room he entered; ebullient, exuberant, and eccentric; a chain smoker, a prodigious talker as well as music maker; and a man who loved jokes,  spent much time under a sunlamp, and had a proclivity for kissing on the lips just about everyone he met, male or female.  The insights into Bernstein’s personality and how he filled the role of father and husband are one of two factors that make this memoir . . . well, memorable.

The other factor is Bernstein’s sexuality.  Despite the appearances of conventional marriage and family life, the bisexual Maestro leaned heavily toward the gay side of the equation.  Jamie’s elaboration upon how she became aware of her father’s preference for other men, and the effect of her father’s homosexuality on her mother and the family, constitute the memoir’s backbone.  Although she provides her perspective on her father’s musical achievements, she spends more time on Bernstein as paterfamilias than Bernstein as music maker.  Jamie also reveals how she struggled to find her own pathway through life as an adolescent and young adult, feeling stalked by her family’s name and her father’s fame.

* * *

                        Jamie became aware of her father’s sexual preferences as a teenager.  She had landed a summer job at the Tanglewood Summer Music Festival in Western Massachusetts, where her father conducted.  People at Tanglewood talked freely about her father and the men he was involved with:

They talked about it quite casually in front of me, so I pretended I knew all about it – but I didn’t. I mentally reviewed past experiences; had I sensed, or observed, anything to indicate that my father was homosexual?  He was extravagantly affectionate with everyone: young and old, male and female. How could I possibly tell what any behavior meant? And anyway, weren’t homosexuals supposed to be girly? . . . Yet there was nothing I could detect that was particularly effeminate about my father. How exactly did he fit into this category?  I was bewildered and upset.  I couldn’t understand any of it – but in any case, my own existence seemed living proof that the story was not a simple one (p.123).

Thereafter, Jamie wrote her father a letter about what she had learned at Tanglewood.  When she joined her parents at their weekend house in Connecticut, her father took her outside.   He denied what he described as “rumors” that were propagated, he said, by persons who envied his professional success and hoped to jeopardize his career.  Later, Jamie wondered whether her mother had forced her father to deny everything.  After her confrontation with her father, she began to discuss her father’s sexual complexities with her siblings but never again raised the subject with either parent.

Jamie learned subsequently that prior to her parents’ marriage, Felicia had written to her future husband: “You are a homosexual and may never change . . . I am willing to accept you as you are, without being a martyr and sacrificing myself on the L.B. altar” (p.124).  Her clear-eyed mother had entered into her marriage knowing full well, Jamie concluded, that she was “marrying a tsunami – and a gay one at that” (p.172).  Her parents may have reached an agreement, perhaps tacit, that her father would confine his philandering to the time he was on the road.  At home, he was to be very conventional.

But that agreement came to an end in 1976, when Leonard took a separate apartment in New York to spend time with a young man, Tommy Cothran, with whom he had fallen “madly in love” (p.188).  Her father, Jamie writes, was “starting a new life – so he was cheerful, acting exuberantly gay and calling everyone ‘darling’” (p.188).  In the rift between her parents, her brother Alexander seemed to be taking Felicia’s side while Jamie worried that she was not being sufficiently supportive of her mother.  She was “trying so hard to be equitable.  I wanted my father to find his true self and be happy with who he was . . . but I couldn’t help being ambivalent over how gracelessly he was going about it, and how much pain he was inflicting on our mother . . . Sometimes I wondered if I should have been taking sides” (p.187).

These wrenching family issues became moot two years later, when Felicia died of breast cancer. Jamie notes that her father was quite attentive to her mother as her condition worsened.  The loss of Felicia “ripped through our family’s world with a seismic shudder.   She was so adored, so deeply beautiful . . . and gone so unbearably too soon, at fifty-six” (p.218).  In the absence of Mummy, Jamie writes, her father became “as untamed as a sail flapping in a squall. The family’s preexisting behavioral boundaries were gone; now anything could happen” (p.233).  Her father’s “intense physicality and flamboyance had always been there, but now, in the absence of Felicia’s calming influence, it became a beast unleashed” (p.235).  After Felicia’s death, Leonard spent an increasing amount of time in Key West, in the Florida Keys, where the sunshine and gay intellectual culture attracted him.

Bernstein himself died in 1990, at the relatively young age of 72, from a form of lung cancer associated with asbestos exposure rather than his life-long cigarette habit (a habit which his wife shared and one which Jamie detested from an early age).  The Maestro’s final years were ones where sexual liberation combined with physical and mental decline.  He suffered from depression and “hated getting older, hated his diminishing physicality.  But the other part of the problem – and the two were inextricably intertwined – was that he was continuing to put prodigious quantities of uppers, downers, and alcohol into a body that was growing ever less efficient at metabolizing all those substances” (p.258).  His “decades of living at maximum volume appeared to be catching up with him at last” (p.316), Jamie writes.

At a concert at Tanglewood just months prior to his death, Bernstein had trouble conducting Arias and Barcarolles, a piece he had written.  “[H]is brain was so oxygen-deprived by that point that he couldn’t track the complexities of his own music” (p.319).  When he came out afterwards for his bow, he was “tiny, ashen, and nearly lost inside the white suit that now hung so loosely on him, it looked as if it had been tailored for some other species” (p.319).

One shining exception to Bernstein’s downward spiral in his final years occurred at concerts in Berlin during the 1989 Christmas holiday season, the month following the fall of the Berlin wall.  Bernstein conducted a “mighty ensemble comprising players volunteering from various orchestras around the world who, along with four soloists and a local girls’ chorus, gave a pair of performances of Beethoven’s Ninth Symphony: one in East Berlin and one in West Berlin.”  And to make the performances “extra-historic,” Bernstein changed Schiller’s text in the final “Ode to Joy” movement: “now it was ‘Ode to Freedom.’ ‘Freiheit!’ The word rang out again and again, wreathed in Beethoven’s harmonies, and the world watched it on television on Christmas Day” (p.313).  The Berlin concerts were in Jamie’s view her father’s “peak performance,” the “pinnacle” of his lifelong advocacy for world peace and brotherhood, “never more eloquently expressed, and never to so many, than through Beethoven’s notes in that historical Christmas performance” (p.313).

But Bernstein’s progressive political orientation did not always play so well at home.  In 1970, Felicia hosted a fundraiser at their Park Avenue apartment, which Leonard attended, designed to assist the families of 21 members of the Black Panther party who were in jail with inflated bail amounts, “awaiting trial for what turned out to be trumped-up accusations involving absurd bomb plots” (p.109).  The Black Panthers advocated black empowerment “by any means necessary” and were anti-Zionist, making them scary even in liberal New York.  No journalists were invited to the fundraiser, but somehow two snuck in: the New York Times society writer and an up-and-coming journalist, Tom Wolfe (since deceased).

An article in the Times the next day heaped scorn on the event.  “Everything about this article was loathsome,” Jamie writes, “and my parents were both aghast. But that was just the beginning” (p.112).  The Times followed a few days later with an editorial chastising the couple for mocking the memory of Martin Luther King.  The militant Jewish Defense League organized pickets in front of the Bernsteins’ building and the couple became the “butt of ridicule” (p.113) in New York and nationally.  Then, months later, Wolfe came out with an article in New York magazine entitled “That Party at Lenny’s,” followed by Radical Chic, a book centered on the event.  “My mother’s very serious fundraiser had become her celebrity husband’s ‘party’” (p.116), Jamie writes.

Wolfe’s works had the effect of setting in stone the misinterpretation and mockery of the Panther event.  Jamie contends bitterly that Wolfe never comprehended the depth of the damage he wreaked on her family.  Unlike her father, Felicia had no work to back her up in the aftermath of the Panther debacle and grew increasingly despondent.  Four years later, she was diagnosed with cancer and underwent a mastectomy.  Four years after that, she was dead of the disease.  Even as she wrote her memoir, at a time when Wolfe himself was near death, Jamie found that “my rage and disgust can rise up in me like an old fever – and in those nearly deranged moments, it doesn’t seem like such a stretch to lay Mummy’s precipitous decline, and even demise, at the feet of Mr. Wolfe” (p.117).

Nor did Wolfe comprehend, Jamie further argues, the degree to which his “snide little piece of neo-journalism rendered him a veritable stooge for the FBI.”  Bureau Director J. Edgar Hoover “may well have shed a tear of gratitude that this callow journalist had done so much of the bureau’s work by discrediting left-wing New York Jewish liberals while simultaneously pitting them against the black activist movement – thereby disempowering both groups in a single deft stroke” (p.116).  With the Panther incident, the FBI became “obsessed with Leonard Bernstein all over again. Hoover was deeply paranoid about the Black Panthers” (p.305).  But thanks to a Freedom of Information Act request for files on her father, Jamie reveals, the family learned that Hoover had been “obsessing on Leonard Bernstein since the 1940s, when informers started supplying insinuations that Bernstein was a Communist” (p.315).  The 800-page Bernstein file “substantially increased in girth during the Red Scare years in the 1950s, when my father had even been briefly denied a passport” (p.305).

Well before Felicia’s death, it was clear to Jamie that her father had become a “Controversial Person – a long, complex evolution from his wunderkind public persona of the 1950s” (p.296).  But in addition to her father’s story, Jamie’s memoir provides her perspective on her own challenges “growing up Bernstein,” the memoir’s sub-title.

* * *

                      Jamie grew up with so many of the trappings of Manhattan wealth that this portion of the story seems stereotypical, bordering on caricature.  Her family lived in fancy Manhattan apartments, eventually the famous Dakota, where John Lennon was a neighbor until he was killed in front of the building (he was killed shortly after Jamie had walked past the shooter, who seemed to be just one of many groupies waiting to catch a glimpse of the singer).  The Bernstein family had a long-time South American nanny, Julia Vega, who was a major part of the family and is a presence throughout the memoir.  The three children relied primarily upon chauffeurs and limousines for local transportation. They enjoyed a secondary residence for weekend and summer getaways, first in Connecticut, then in East Hampton.  The children traveled all over the globe with their father as they grew up.  They attended elite Manhattan private schools, and all three attended Harvard, the school from which Leonard had graduated prior to World War II.  Jamie indicates that admission to Harvard brought little elation for her or her two siblings; they always had “crippling doubts” (p.148) whether they gained admission on their merits or because they were Leonard Bernstein’s children (at Harvard, Jamie’s first-year roommate was Benazir Bhutto, daughter of Pakistan’s prime minister, who herself became Pakistan’s prime minister and was later assassinated).

As a young adult, Jamie followed her father into the music world, although her particular niche was more popular than classical music (a niche her father deeply appreciated; he too loved the Beatles). She was hardly surprised that she enjoyed considerably less success than her father. “Sure, I was musical, but I really was a very poor musician” (p.277).   She stopped fretting about comparisons to her father when she stopped trying to be a musician herself. “It turned out that if I just refrained from making music with my own body, I was much calmer . . . [M]aking music with my own body had mostly made me a mess” (p.362-63).

Jamie had her share of boyfriends as a teenager and young adult, and she manages to tell her readers quite a bit about many of them.  Her first date was with Marlon Brando’s nephew.  She smoked a lot of marijuana, experimented with a host of other mind-expanding substances, and spent a good portion of her early adulthood stoned – with her brother Alexander seemingly even more of a pothead as a young man.   She also partook of Erhard Seminars Training, aka “EST,” a “repackaging of Zen Buddhist principles for Western consumption” (p.175) and a quintessential 1970s way of “getting in touch with one’s inner feelings,” as we said back then.

Late in the memoir, a few years before her father’s death in 1990, Jamie married David Thomas, a man she had met several years earlier at Harvard.  By the end of the memoir, she has given birth to two children, a boy and a girl, and is a devoted mother – but one either separated or divorced from her husband.  She writes that her marriage had centered on David’s ability to relate to her father and fit into the family.  The thrill was gone after Leonard died.  Although the marriage “hung on for another decade,” the “deep harmony we experienced while Daddy was alive never returned” (p.337).  After the detailed run-through of so many boyfriends, readers will be disappointed that Jamie provides no further insight into why her marriage foundered.

Jamie found her professional niche in preserving her father’s legacy by chance, after volunteering to help her daughter’s preschool start a music program.  “It was the one and only regular music gig I ever had” (p.336), she writes.  Finding that she had a knack for bringing music to young people, a forte of her father’s, she devised The Bernstein Beat, a project modeled after her father’s Young People’s Concerts but focused on her father’s music.  Jamie presented The Bernstein Beat across the globe, in places as diverse as China and Cuba (in Cuba, she surprised herself by narrating in Spanish, her mother’s native tongue).  She also co-produced a documentary film, Crescendo: The Power of Music, about a program she had observed in Venezuela designed to use music as a way to reach at-risk young people and keep them away from street violence.  The film, first presented at the Philadelphia Film Festival, won several prizes and was bought by Netflix.

Around 2008, Jamie’s long-time friend, conductor Michael Tilson Thomas, asked her to design and present educational concerts for adults with his Miami-based orchestral academy, the New World Symphony. It turned out to be “the best job ever” for her, to the point that she felt she had become the “poster child for life beginning at fifty” (p.361).  She also began to edit a Leonard Bernstein newsletter, apprising readers of Bernstein-related performances and events.  Preserving her father’s legacy has been a “good trade-off,” she writes: “leading a musician’s life minus the music-making part” (p.362-63).

* * *

                        Jamie writes in a breezy, easy-to-read style, mixing candor – her memoir is nothing if not candid – with ample doses of humor, much of it self-deprecatory.  But without the connection to her father, Jamie’s story is mostly one of a Manhattan rich kid’s angst.  The memoir’s real interest lies in Jamie’s insights into the character and complexity of her father.

Thomas H. Peebles

Washington, D.C.

January 25, 2020

Filed under American Society, Biography, Music

Uncovering Hair and Corruption in Iran

 

Masih Alinejad, The Wind in My Hair:

My Fight for Freedom in Modern Iran (Little, Brown & Co.)

              Masih Alinejad, an Iranian national now living in Brooklyn, is recognized internationally as an outspoken advocate for women’s rights and human rights.  She is best known for supporting Iranian women’s right to decide for themselves whether they wish to wear the hijab, the veil covering a woman’s hair that is mandatory attire for women and girls as young as seven in contemporary Iran.  She has amassed an impressive string of awards, including the United Nations’ International Women’s Rights Award, the Association for International Broadcasting’s Media Excellence Award, and the Swiss Freethinker Association’s Freethinker Prize.

The title of Masih’s autobiography/memoir, The Wind in My Hair: My Fight for Freedom in Modern Iran, captures her objective for herself and for women who wear the hijab not by choice: all women should have a right to feel the wind in their hair, if that’s what they desire.  From an early age, Masih explains, she looked at her hair as “part of my identity, but you couldn’t see it.  When I was growing up, my hair was no longer part of my body. It had been hijacked and replaced with a head scarf” (p.30).  Before challenging the compulsory hijab, Masih was an investigative journalist in Iran, exposing corruption within the most powerful spheres of the country’s political elite.

Masih was born in 1976, three years prior to the Islamic Revolution of 1979 that overthrew the regime of Shah Mohammad Reza Pahlavi, ending nearly two millennia of rule by Persian kings.  She describes herself as a child of that revolution, one who has “lived nearly all my life under its shadow.  My story is the story of modern Iran, the tension between the secular tendencies of its population and the forced Islamification of the society, and the struggle of women, especially young women, for their rights against the introduction of Sharia law, against violations of human rights and civil liberties” (p.23).  The Shah had reformed Sharia law to allow women many basic rights, with the hijab being largely a matter of personal choice.  But the Shah’s reforms were reversed after the revolution and the state extended increasing control over women’s lives, including the compulsory hijab.  The changes “didn’t happen overnight,” Masih writes, and Iranian women “resisted and put up a fight, especially over the issue of compulsory hijab, which set the tone for how women’s rights would shape up” (p.29).

Through a Facebook page that she established while in exile, entitled “My Stealthy Freedom,” Masih provided a platform for widespread resistance to the compulsory hijab.  On a whim, she posted a picture of herself with no hair covering and cherry blossoms in the background. Exulting in how free she felt, she says she was no longer a “hostage,” a loaded word in Iran. “That simple photograph and message changed my life” (p.308).  Critics complained that she was exploiting a freedom available to her only because she was not in Iran.  Even her reform-minded friends back in Iran thought this was the wrong fight to pick.  To many, the hijab was at best a minor irritant in a country where so many things were wrong.

True enough, Masih responded, but she was sure that, given “half a chance, millions of Iranians would remove their hijab, especially in the privacy of their own cars” and that “every Iranian woman had [a] picture like this, taken in private moments, alone or with friends” (p.311-12).  Even though they could be arrested for showing themselves without covered hair, Iranian women proved eager to show they were “free, powerful, and not ashamed of their bodies” (p.315).  In numbers that astounded her, Iranian women posted photos capturing the “guilty pleasure of breaking unjust rules that allow us a modicum of dignity” (p.313).  Her campaign against the compulsory hijab attracted the attention of Facebook executive Sheryl Sandberg, who encouraged her to write this memoir.

Masih traveled an improbable path to international fame.  She was born and spent her early years in a dirt-poor rural village in northern Iran, Ghomikola, population 650 — “as far away from the country’s elites as possible,” she notes (p.9).  Parents raising children in this traditional Shiite Muslim village hoped above all that their children would conduct themselves with honor and avoid bringing shame to their families.  Young Masih, mischievous and rebellious, fell well short of these overriding parental expectations.  She was expelled from her high school after stealing books from a local bookstore and incurred a jail sentence for the seditious activity of organizing a book club of high-school-age students.  She found herself pregnant without being married, and gave birth to a son after entering into a hurried if not quite arranged marriage. When the marriage foundered shortly thereafter, she divorced and lost custody of her son.

But divorced and without her son, Masih almost miraculously landed a job as a journalist with a reform-oriented newspaper in Iran, where her professional career took off.  Tensions surrounding the contested 2009 re-election of Iranian President Mahmoud Ahmadinejad forced Masih into exile, first in the United Kingdom, then in the United States.  In exile, she regained custody of her son, completed a university degree, met the man to whom she is presently married, and undertook her campaign against the compulsory hijab.

Masih’s effervescent personality shines through all phases of her memoir.  She has an audacious streak that often borders on recklessness. She is frequently absent-minded and disorganized. She has difficulty wearing matching socks, and is always losing apartment and car keys.  Yet she has an uncanny ability to focus when the moment requires intense concentration.  In her frequent face-offs with authority figures – among them the omnipresent religious and security police in Iran, along with ayatollahs and political leaders, almost always male – she is breathtakingly quick on her feet.  Her sharp responses to authorities are often leavened with irony that borders on wisecracking. She is someone most of us would like to know.

Masih’s memoir can be broken into three portions: 1) her youth and early adulthood, including her imprisonment, pregnancy and divorce; 2) her years as an investigative reporter in Iran; and 3) her exile years, when she achieved international stardom.  Surprisingly, the last portion, detailing her most highly visible accomplishments, is the least engaging; it seems disjointed and scattershot, as if written hurriedly to meet a publication deadline.  But the first two sections, charting her unlikely pathway to stardom, make for engrossing and often compelling reading.

* * *

                Masih was the youngest of six children; all slept in the same room. Their house lacked indoor plumbing, a kitchen and a place to bathe or shower (but did include a television).  The family grew most of its own food.  Despite grinding poverty, Masih seemed to have had a happy childhood.  She loved to climb trees and pick pears and walnuts. Her family spoke a local dialect and Masih didn’t learn to speak Persian until she went to school.

Masih’s parents were religiously observant Shiite Muslims, a trait that they somehow failed to pass on to their youngest daughter.  Neither was formally educated, but both believed in education for their children.  They wanted their daughters to complete high school before they married, whereas many Ghomikola families saw no advantage to educating girls.  Masih’s father, AghaJan, was a peddler who sold fruits and vegetables.  He was a fervent believer in the 1979 Islamic Revolution.  His highly traditional views of appropriate roles for girls and young women placed him increasingly at odds with his youngest daughter.  Masih found less and less to talk about with her father as a teenager and, in her adult life, the two stopped communicating altogether.

Masih experienced no such break with her mother, Zarrin.  Functionally illiterate and barely five feet tall, Zarrin married AghaJan when she was 14.  But she had skills as a tailor and worked on clothes for people in the village, sometimes offering sewing classes.  It was unusual in Ghomikola for a married woman to earn her own money, rather than being entirely dependent upon her husband.  Her mother was also the source of a decidedly non-traditional expression that guided Masih throughout her adult life: “If they lock the front door, go in through the back door. If the doors are barred, go through the windows. If they shutter the windows, climb through the chimney. Never let them lock you out. Always try to get in” (p.156). Yet, as much as Masih loved and respected her mother, she seemed to know from an early age that she wanted a different life for herself.  “For Mother, family and reputations had special meanings that were lost on me.  Her days were predictable, while I wanted mine to be full of surprises” (p.78).

Thanks to her mother’s intervention with local school authorities, Masih was reassigned to another high school after she was apprehended stealing books from a local bookstore, which she rationalized as necessary to feed a voracious reading habit in a family that could not afford to buy books.  At her new school, Masih and a classmate started a book club that featured leftist literature about human rights, freedom and the meaning of democracy.  The group also drafted and distributed an underground pamphlet advocating freedom for political prisoners, activities considered seditious in Iran.  The group included a young man, Reza, who seemed interested in Masih, telling her that he was writing poetry for her.

Although Reza turned out to be Masih’s first romantic interest, the romance was placed on hold when first Reza, then Masih, were arrested and sent to prison for anti-revolutionary activities.  While in prison, Masih learned that she was pregnant with Reza’s child.  After appearing in a “Revolutionary Court,” with secret proceedings and no right to a lawyer, Masih received a five-year sentence, suspended on condition of what amounted to “good behavior.” But she did not graduate from high school and still had to deal with her pregnancy.

“I had dreams of traveling and exploring the world,” Masih writes, “and now before I had even left Ghomikola I was trapped. My destiny was already set. . . I had to explain away another mark of shame to my parents – I was pregnant before being properly married” (p.100). Abortion proved not to be an option and she bore the child she carried, her son Pouyan, who in different ways remains part of his mother’s story for the rest of the memoir.  Although she was not ready to be either a wife or a mother, Masih married Reza.  It was not the usual sequence in Ghomikola, where “very few women get pregnant before their wedding night . . . I was bringing dishonor to my family” (p.107).

With Reza unable to find a job, the couple set out for Tehran.  Masih worked briefly as a photographer while Reza wrote poetry. Then, seemingly out of the blue, Reza returned to their apartment one day to announce that he was in love with another woman, whom he wanted to marry.  He found their marriage too confining for his poetic ambitions and needed a divorce – he couldn’t write and she was holding him back. “Once again, I had notched a family first,” Masih writes despondently: “The first woman in our family to be arrested, the first to be jailed, and the first to be pregnant before her wedding. I would now be the first in all of Ghomikola to be divorced. It didn’t matter that Reza was leaving me; everyone would think that it was somehow my fault” (p.134).

Masih had no chance of retaining custody of the couple’s son under Iranian law.  AghaJan urged her to return to Ghomikola where he would help her find a new husband, leading her to the realization that divorced women in Iran have “no identity of their own. My father was not unique; he was a reflection of Iranian culture.  In many villages and small cities, there is an expectation that a divorced woman should sit at home and wait for her next husband” (p.144). As she turned 24, Masih’s short marriage was over, she had lost custody of her son and she had a prison record but no high school diploma.  Yet, she says she “blossomed” after her divorce and loss of custody. “It was painful,” she writes, “but I was suddenly free to grow and be myself. I wasn’t looking for new directions in my life, but I had little choice. The hardships I went through forged me” (p.143).

* * *

                     By sheer audacity, Masih landed an interview with Hambastegi, a daily paper associated with reform politics. She volunteered to work without pay at the outset, to see how it worked out.  She was assigned to cover Iran’s parliament, the Majlis.  She memorized the phone numbers of relevant parliamentarians and called them at all times of day or night, playing up her status as a neophyte woman reporter.  She knew the parliamentarians’ personal histories, had a loud voice, and understood how male politicians “can be relied upon to be patronizing to women,” thereby providing her with “great quotes” (p.157).  Iran’s conservative newspapers referred to her as “the Ugly Duckling,” which she considered a badge of honor.

In her most sensational scoop, using carefully cultivated sources – shades here of Watergate and “Deep Throat” – Masih exposed how lawmakers routinely lined their pockets with secret bonus payments above and beyond their salaries.  Through tough talk and more than a little bluffing – Masih says she became a master of the art of bluffing – she extracted a pay stub from a deputy that showed the equivalent of about $1,100 US for “consideration of Deputies’ Expenses.” Suddenly, she had hard evidence of a slush fund to make undeclared payments to the deputies. “There’d be no going back,” she writes. “I would be marked, but the story was worth it. It was for moments like that that I had rebelled against my family and endured all sorts of hardships. I wasn’t naïve. I knew there’d be a price to pay later” (p.195).  Conservative newspapers claimed she had stolen the pay stub, and some indicated that she had obtained it through “flirting.”

Masih “loved being a Majlis reporter . . . [H]olding politicians accountable and exposing their lies were all part of a day’s work,” she writes.   As disorganized as she was in her private life, when it came to covering politics, it was “as if a switch had been turned on” (p.190). Not surprisingly, Masih became persona non grata at Parliament and in 2005 achieved another first: the first journalist to be expelled from the Majlis.  But her expulsion sparked a latent interest in issues particular to women.  “Feminism was taboo in Iran,” she explains. “As a parliamentary journalist, I couldn’t risk being seen to be involved in feminism and women’s rights activism. To be honest, I didn’t have the time; nor did I want to risk another black mark against my name” (p.211).

* * *

                      Masih had vigorously opposed Mahmoud Ahmadinejad since his first election to the Iranian presidency in 2005, an election probably tainted by voter fraud.  As the 2009 elections approached, Masih, like many younger Iranians, thought the country was poised to elect a genuine reform candidate.  But the election resulted in Ahmadinejad being declared the winner, again amidst credible allegations of voter fraud, precipitating massive post-election demonstrations in June 2009 and a savage crackdown.  Masih was advised to leave Iran for her own safety, and to this day has not returned.

She landed in Britain, where she pursued a degree in communications at Oxford Brookes University, and regained custody of her son, who was then a teenager.  She began producing documentaries focusing on the families of victims killed in the post-election crackdown.  She also pursued a quixotic idea to interview newly elected American president Barack Obama, and surprised herself by how close she came to being granted an interview. She received a visa to enter the United States, but in the aftermath of the contested 2009 Iranian presidential election, the White House decided that the timing for her interview was not strategically right.  While in the United States, Masih made the acquaintance of an Iranian-American journalist for Bloomberg News, Kambiz Foroohar, who became an increasing presence in her life. From the time of their initial meeting, the unflappable Kambiz served as an invaluable check on Masih’s enthusiasm and her tendency to get ahead of herself.  The couple married in 2014.

After Sandberg mentioned Masih’s “My Stealthy Freedom” page at the “Most Powerful Women Summit,” an event sponsored by Fortune magazine, she and Masih exchanged emails.  Sandberg then invited Masih to Facebook headquarters in Menlo Park, California.  During the visit, Sandberg suggested that Masih write a book about her life’s experiences for English-language readers (she had already published a handful of works in Persian).  As a follow-up to “My Stealthy Freedom,” Masih also established #WhiteWednesday, which encourages Iranian men and women to wear white on Wednesdays to protest the compulsory hijab.  She has tried, without much success, to convince high-level women visitors to Iran not to cover their hair.  Almost all, to Masih’s dismay, contend that they need to show sensitivity to local customs.

In the summer of 2016, Masih came out firmly against the ban in some French towns of the burkini, the full-body swimwear used by some Muslim women. “The police in France were behaving just like the morality police in Iran” (p.367), she writes.  Both had “problems with choices made by women, and both acted as if women’s bodies were the territory of lawmakers and law enforcement, who alone knew what was best” (p.367). But she nonetheless found it more than ironic that Iran, which denies its own women the freedom to choose, called on France to “respect the human rights of Muslims who chose to dress in Islamic fashion” (p.367).

Masih today works for the Voice of America’s Persian Service.  Recently, her brother and two siblings of her first husband were arrested, and even her mother was called in for questioning by security officials, all part of what Masih considers an effort to intimidate her into silence from abroad.  Like Masih’s memoir itself, this recent heavy-handedness is a reminder of how little has changed since the 1979 Islamic Revolution.  Iran remains a repressive religious dictatorship, with few secular spaces and no tolerance for notions like due process and the rule of law.  The place of women is still determined by, as Masih puts it, laws “devised by misogynists who find guidance and precedent in the seventh century” (p.141).

* * *

                         Assiduous readers of this blog will see many resemblances between Masih Alinejad and Manal al-Sherif, the Saudi Arabian woman of about the same age who wrote a memoir about championing the cause of women driving in her native land, reviewed here in October 2017 (that review also included a work by Shirin Ebadi, a human rights lawyer who was the first Iranian and first Muslim woman to win a Nobel Prize; Ebadi makes brief appearances in Masih’s memoir).  Notwithstanding the geopolitical and religious rivalries that divide their two countries, it is striking how similar the two women’s stories are.  Each mobilized Facebook and other social media to launch a campaign designed to eliminate a state-imposed obstacle to women’s rights.  Each endured a jail sentence.  The personal stories of the two women also align.  Each was raised in poverty by uneducated parents who nonetheless valued education for their children.  After unsuccessful early marriages in countries where the husband-wife relationship is far from equal, both became divorced mothers of young sons.  Each pursued a career and advanced study after divorce, and both now appear to be happily married.  While both continue to be active in issues involving women’s rights and human rights in their native countries, each must do so from afar, with al-Sherif now living in Australia.  How I’d love to put these two women in the same room, then assume a fly-on-the-wall posture as they exchange war stories.

Saudi Arabia recently lifted its ban on women driving, while the hijab remains obligatory attire in today’s turbulent Iran.  But anyone reading this memoir will come away convinced that, at a minimum, no one should ever underestimate what Masih Alinejad is capable of achieving, for herself and for her country.

Thomas H. Peebles

La Châtaigneraie, France

December 29, 2019

Filed under Biography, Gender Issues, Middle Eastern History