
Criticizing Government Was What They Knew How To Do

 

Paul Sabin, Public Citizens:

The Attack on Big Government and the Remaking of American Liberalism

(W.W. Norton & Co., 2021)

1965 marked the high point of Democratic President Lyndon Johnson’s Great Society program, an ambitious set of policy and legislative initiatives that envisioned using the machinery of the federal government to alleviate poverty, combat racial injustice, and address other pressing national needs.  Johnson was coming off a landslide victory in the November 1964 presidential election, having carried 44 states and the District of Columbia with the highest percentage of the popular vote of any presidential candidate in over a century.  Yet a decade and a half later, in January 1981, Republican Ronald Reagan, after soundly defeating Democratic incumbent Jimmy Carter, took the presidential oath of office declaring that “government is not the solution to our problem; government is the problem.”

How did government in the United States go in a fifteen-year period from being the solution to society’s ills to the cause of its problems?  How, for that matter, did the Democratic Party go from dominating the national political debate up through the mid-1960s to surrendering the White House to a former actor who had been considered too extreme to be a viable presidential candidate?  These are questions Yale University professor Paul Sabin poses at the outset of his absorbing Public Citizens: The Attack on Big Government and the Remaking of American Liberalism.  Focusing on the fifteen-year period 1965-1980, Sabin proffers answers centered on Ralph Nader and the “public interest” movement which Nader spawned.

1965 was also the year Nader rocketed to national prominence with Unsafe at Any Speed, his exposé of the automobile industry’s indifference to safety.  General Motors notoriously assisted Nader in his rise by conducting a concerted campaign to harass the previously obscure author.  From there, Nader and the lawyers and activists in his movement — often called “Nader’s Raiders” — turned to such matters as environmentalism, consumer safety, and consumer rights, arguing that the government agencies charged with regulating these matters invariably came to be captured by the very industries they were designed to regulate, without the voice of the consumer or end user being heard.  “Why has business been able to boss around the umpire” (p.86) was one of Nader’s favorite rhetorical questions.

Because of both industry influence and bureaucratic ineffectiveness, government regulatory authority operated in the public interest only when pushed and prodded from the outside, Nader reasoned.  In Nader’s world, moreover, the Democratic and Republican parties were two sides of the same corrupt coin, indistinguishable in the degree to which they were both beholden to special corporate interests — “Tweedle Dee and Tweedle Dum,” as he liked to put it.

Reagan viewed government regulation from an altogether different angle.  Whereas Nader believed that government, through effective regulation of the private sector, could help make consumer goods safer, and air and water cleaner, Reagan sought to liberate the private sector from regulation.  He championed a market-oriented capitalism designed to “undermine, rather than invigorate, federal oversight” (p.167).  Yet Sabin’s broadest argument is that Nader’s insistence over the course of a decade and a half that federal agencies used their powers for “nefarious and destructive purposes” (p.167) — the “attack on big government” portion of his subtitle — rendered plausible Reagan’s superficially similar attack.

The “remaking of American liberalism” portion of Sabin’s subtitle might better have been termed “unmaking,” specifically the unmaking of the political liberalism rooted in Franklin Roosevelt’s New Deal — the liberalism, based on a strong and active federal government, that Johnson sought to emulate and build upon in his Great Society.  Following in the New Deal tradition, Roosevelt’s Democratic Party controlled the White House for all but eight years between 1933 and 1969.  Yet when Reagan assumed the presidency in 1981, New Deal liberalism had clearly surrendered its claim to national dominance.

Most interpretations of how and why New Deal liberalism lost its clout are rooted in the 1960s, with the decade’s anti-Vietnam war and Civil Rights movements as the principal actors.  The Vietnam war separated older blue-collar Democrats, who often saw the war in the same patriotic terms as World War II, from a younger generation of anti-war activists who perceived no genuine US interests in the conflict and no meaningful difference in defense and foreign policy between Democrats and Republicans.  The Civil Rights movement witnessed the defection of millions of white Democrats, unenthusiastic about the party’s endorsement of full equality for African Americans, to the Republican Party.

Nader and the young activists following him were also “radicalized by the historical events of the 1960s, particularly the civil rights movement and the Vietnam War” (p.48), Sabin writes.  These were their “defining issues,” shaping “their view of the government and their ambitions for their own lives” (p.51).  We cannot imagine Nader’s movement “emerging in the form that it did separate from civil rights and the war” (p.48).  But by elaborating upon the role of the public interest movement in the breakdown of New Deal liberalism and giving more attention to the 1970s, Sabin adds nuance to conventional interpretations of that breakdown.

The enigmatic Nader is the central figure in Sabin’s narrative.  Much of the book analyzes how Nader and his public interest movement interacted with the administrations of Lyndon Johnson, Richard Nixon, Gerald Ford, and Jimmy Carter, along with brief treatment of the Reagan presidency and that of Bill Clinton.  The Carter years, 1977-1981, revealed the public interest movement’s most glaring weakness: its “inability to come to terms with the compromises inherent in running the executive branch” (p.142), as Sabin artfully puts it.

Carter was elected in 1976, when the stain of the Watergate affair and the 1974 resignation of Richard Nixon hovered over American politics, with trust in government at a low point.  Carter believed in making government regulation more efficient and effective, which he saw as a means of rebuilding public trust.   Yet, he failed to craft what Sabin terms a “new liberalism” that could “champion federal action while also recognizing government’s flaws and limitations” (p.156).

That failure was due in no small measure to frequent and harsh criticism emanating from public interest advocates, whose critique of the Carter administration, Sabin writes, “held those in power up against a model of what they might be, rather than what the push and pull of political compromise and struggle allowed” (p.160).  Criticizing government power was “what they knew how to do, and it was the role that they had defined for themselves”  (p.156). Metaphorically, it was “as if liberals took a bicycle apart to fix it but never quite figured out how to get it running again” (p.xvii).

 * * *

Sabin starts by laying out the general parameters of New Deal liberalism: a technocratic faith that newly created administrative agencies and the bureaucrats leading them would act in the public interest by serving as a counterpoint to the power of private, especially corporate, interests.  By the mid-1950s, the liberal New Deal conception of “managed capitalism” had evolved into a model based on what prominent economist John Kenneth Galbraith termed “countervailing power,” in which large corporations, held in balance by the federal regulatory state, “would check each other’s excesses through competition, and powerful unions would represent the interests of workers.  Government would play a crucial role, ensuring that the system did not tilt too far in one direction or the other” (p.7-8).

Nader’s public interest movement was built around a rejection of Galbraith’s countervailing power model.  The model failed to account for the interests of consumers and end users, as the economist himself admitted later in his career.  If there was to be a countervailing power, Nader theorized, it would have to come through the creation of “independent, nonbureaucratic, citizen-led organizations that existed somewhat outside the traditional American power structure” (p.59).  Only such organizations provided the means to keep power “insecure” (p.59), as Nader liked to say.

Nader’s vision could be described broadly as “ensuring safety in every setting where Americans might find themselves: workplace, home, doctor’s office, highway, or just outside, breathing the air”  (p.36).  In a 1969 essay in the Nation, Nader termed car crashes, workplace accidents, and diseases the “primary forms of violence that threatened Americans” (p.75), far exceeding street crime and urban unrest.  For Nader, environmental and consumer threats revealed the “pervasive failures and corruption of American industry and government” (p.76).

Nader was no collectivist, neither a socialist nor a New Dealer.  He emphasized open and competitive markets, small private businesses, and especially an activated citizenry — the “public citizens” of his title.  More than any peer, Nader sought to “create institutions that would mobilize and nurture other citizen activists” (p.35).  To that end, Nader founded dozens of public interest organizations, which were able to attract idealistic young people — lawyers, engineers, scientists, and others, overwhelmingly white, largely male — to dedicate their early careers to opposing the “powerful alliance between business and government” (p.24).

Nader envisioned citizen-led public interest organizations serving as a counterbalance not only to business and government but also to labor.  Although Nader believed in the power of unions to represent workers, he was “deeply skeptical that union leaders would be reliable agents for progressive reform” (p.59).  Union bosses in Nader’s view “too often positioned themselves as partners with industry and government, striking bargains that yielded economic growth, higher wages, and union jobs at the expense of the health and well-being of workers, communities, and the environment” (p.59).  Nader therefore “forcefully attacked the unions for not doing enough to protect worker safety and health or to allow worker participation in governance” (p.64).

Nader’s Unsafe at Any Speed was modeled after Rachel Carson’s groundbreaking environmental tract Silent Spring, to the point that it was termed the “Silent Spring of traffic safety” (p.23).  Nader’s auto safety advocacy, Sabin writes, emerged from “some of the same wellsprings as the environmental movement, part of an increasingly shared postwar concern about the harmful and insidious impacts of new technologies and processes” (p.23).  In 1966, a year after publication of Unsafe at Any Speed, Congress passed two landmark pieces of legislation, the Traffic Safety Act and the Highway Safety Act, which forced manufacturers to design safer cars and pressed states to carry out highway safety programs.  Nader then branched out beyond auto safety to tackle issues like meat inspection, natural-gas pipelines, and radiation safety.

Paradoxically, the Nixon years were among the most fruitful for Nader and the public interest movement.  Ostensibly pro-business and friendly with blue-collar Democrats, Nixon presided over a breathtaking expansion of federal regulatory authority until his presidency was cut short by the Watergate affair.  The Environmental Protection Agency was created in 1970, consolidating several smaller federal units.  New legislation that Nixon signed regulated air and water pollution, energy production, endangered species, toxic substances, and land use — “virtually every sector of the US economy” (p.114), Sabin writes.

The key characteristics of Nader-influenced legislation were deadlines and detailed mandates, along with authority for citizen suits and judicial review, a clear break from earlier regulatory strategies.  The tough legislation signaled a “profound and pervasive distrust of government even as it expanded federal regulatory powers” (p.82).   Nader and the public interest movement went after Democrats in Congress with a fervor at least equal to that with which they attacked Republican-led regulatory agencies.  Nader believed that “you didn’t attack your enemy if you wanted to accomplish something, you attacked your friend”  (p.82).

In the early 1970s, the public interest movement targeted Democratic Maine Senator Edmund Muskie, the party’s nominee for Vice-President in 1968, whose support for the environmental movement had earned him the moniker “Mr. Pollution Control.” Declaring his environmental halo unwarranted, the movement sought to take down a man who clearly wanted to ride the environmental issue to the White House.  Nader’s group also went after long-time liberal Democrat Jennings Randolph of West Virginia over coal-mining health and safety regulations.  The adversarial posture toward everyone in power, Democrat as well as Republican, continued into the short interim administration of Gerald Ford, who assumed the presidency in the wake of the Watergate scandal.  And it continued unabated during the administration of Jimmy Carter.

As the Democratic nominee for president, Carter had conferred with Nader during the 1976 campaign and thought he had the support of the public interest movement when he entered the White House in January 1977.  Many members of the movement took positions in the new administration, where they could shape the agencies they had been pressuring.  The new president sought to incorporate the public interest movement’s critiques of government into a “positive vision for government reform,” promoting regulatory approaches that “cut cost and red tape without sacrificing legitimate regulatory goals” (p.186).

Hoping to introduce more flexible regulatory strategies that could achieve environmental and health protection goals at lower economic cost, Carter sacrificed valuable political capital by clashing with powerful congressional Democrats over wasteful and environmentally destructive federal projects.  Yet public interest advocates faulted Carter for his purported lack of will more than they credited him for sacrificing his political capital for their causes.  They saw the administration’s questioning of regulatory costs and the redesign of government programs as “simply ways to undermine those agencies” (p.154).  Their lack of enthusiasm for Carter severely undermined his reelection bid in the 1980 campaign against Ronald Reagan.

Reagan’s victory “definitively marked the end of the New Deal liberal period, during which Americans had optimistically looked to the federal government for solutions” (p.165), Sabin observes.  Reagan and his advisors “vocally rejected, and distanced themselves from, Carter’s nuanced approach to regulation”  (p.172). To his critics, Reagan appeared to be “trying to shut down the government’s regulatory apparatus” (p.173).

But in considering the demise of New Deal liberalism, Sabin persuasively demonstrates that the focus on Reagan overlooks how the post-World War II administrative state “lost its footing during the 1970s” (p.165).    The attack on the New Deal regulatory state that culminated in Reagan’s election, usually attributed to a rising conservative movement, was also “driven by an ascendant liberal public interest movement” (p.166).   Sabin’s bottom line: blaming conservatives alone for the end of the New Deal is “far too simplistic” (p.165).

* * *

Sabin mentions Nader’s 2000 presidential run on the Green Party ticket only at the end and only in passing.  Although the Nader-inspired public interest movement had wound down by then, Nader gained widespread notoriety that year when he drew roughly 97,000 votes in Florida, a state Democratic nominee Al Gore officially lost by 537 votes out of roughly six million cast (with no small amount of assistance from a controversial 5-4 Supreme Court decision).  Nader’s entire career had been a rebellion against the Democratic Party in all its iterations, and his quixotic run in 2000 demonstrated that he had not outgrown that rebellion.  His presidential campaign took his “lifelong criticism of establishment liberalism to its logical extreme” (p.192).

Thomas H. Peebles

Paris, France

May 13, 2022

 


American Polarizer

 

 

 

James Shapiro, Shakespeare in a Divided America:

What His Plays Tell Us About Our Past and Our Future

(Penguin Press, 2020)

In June 2017, New York City’s Public Theater staged a production in Central Park of William Shakespeare’s Julius Caesar, directed by Oskar Eustis, as part of the series known as Shakespeare in the Park.  As in many 21st century Shakespeare productions, non-whites had several leading roles and women played men’s parts.  Eustis’ Caesar, knifed to death in Act III, bore more than passing resemblance to President Donald J. Trump: he had strange blond hair, wore overly long red ties, tweeted from a golden bathtub, and had a wife with a Slavic accent.

A protestor interrupted one of the early performances, jumping on stage after the assassination of Caesar to shout, “This is violence against Donald Trump,” according to The New York Times.  Breitbart News picked up on the story with the headline “‘Trump’ Stabbed to Death.”  Fox News weighed in, expressing concern that the play encouraged violence against the president.  Corporate sponsors pulled out.  Threats were leveled not only against the Public Theater and its actors, but also against other Shakespeare productions throughout the country.  A fierce but unedifying battle was fought on social media, with little regard for the ambiguities underlying Caesar’s assassination in the play.

The polemic engendered by Eustis’ Julius Caesar unsettled Columbia University Professor James Shapiro, one of academia’s foremost Shakespeare experts.  Shapiro also serves as Shakespeare Scholar in Residence at the Public Theater and in that capacity had advised Eustis’ team on some of the play’s textual issues.  His most recent work, Shakespeare in a Divided America: What His Plays Tell Us About Our Past and Our Future, constitutes his response to the polemic; in it he demonstrates convincingly that the frenzied reaction to the 2017 Julius Caesar performance was no aberrational moment in American history.

Starting and finishing with the 2017 performance, Shapiro identifies seven other historical episodes in which a Shakespeare play has been enmeshed in the nation’s most divisive issues: racism, slavery, class conflict, nationalism, immigration, the role of women, adultery, and same-sex love.  Each episode constitutes a separate chapter with a specific year.  Shapiro dives deeply and vividly into the circumstances surrounding all seven, revealing a flair for writing and recounting American history that rivals what he brings to his day job as an interpreter of Shakespeare, his plays, and his age.  Of the seven episodes, the most gripping is his description of the 1849 riot at New York City’s upscale Astor Place Opera House, one of the worst in the city’s history up to that point.  By comparison, the 2017 brouhaha over Julius Caesar seems like a Columbia graduate school seminar on Shakespeare.

* * *

Fueled by raw class conflict, nationalism and anti-British sentiment, the Astor Place riot was described in one newspaper as the “most sanguinary and cruel [massacre] that has ever occurred in this country,” an episode of “wholesale slaughter” (p.49)— all arising out of competing versions of Macbeth, starring competing actors.  The Briton William Macready, performing as Macbeth at Astor Place, and the American Edwin Forrest, simultaneously rendering Macbeth at the Bowery Theatre, only a few blocks away but in a decidedly rougher part of town, offered opposing approaches to playing Macbeth that seemed to highlight national differences between the United States and Great Britain: Forrest, the “brash American, Macready the sensitive Englishman” (p.66).  Macready’s “accent, gentle manliness, and propriety represented a world that was being overtaken by everything that Forrest, guiding spirit of the new and for many coarser age of Manifest Destiny, represented”  (p.66), Shapiro writes.

Shapiro’s description of the riot underscores how theatres in a rapidly growing New York City in the 1840s were democratic meeting points.  They were  “one of the few places in town where classes and races and sexes, if they did not exactly mingle, at least shared a common space. This meant, in practice, that the inexpensive benches in the pit were filled mostly by the working class, the pricier boxes and galleries were occupied by wealthier patrons, and in the tiers above, space was reserved for African Americans and prostitutes” (p.56).  The Astor Place Opera House, built in 1847, was an explicit response of New York’s upper crust to these democratizing tendencies. It did not admit unaccompanied women – there was no place for prostitutes – and it imposed a dress code.  The new rules were seen as fundamentally undemocratic, especially to the city’s large number of recent German and Irish immigrants.

While Forrest opened at the Bowery, Forrest fans somehow obtained tickets to the opening Astor Place performance — who paid for them, Shapiro indicates, remains a mystery — and began heckling Macready, telling him to get off the stage, “you English fool.”  Three days later, the heckling recurred.  But this time a crowd of about 10,000 had gathered outside, an unruly mix of Irish immigrants and native-born Americans, groups that had common cause in anti-English and anti-aristocratic sentiment (many of the Irish immigrants were escaping the Irish potato famine of the mid-1840s, often attributed to harsh British policies; see my 2014 review here of John Kelly’s The Graves Are Walking: The Great Famine and the Saga of the Irish People).  Incited by political leaders and their cronies, the crowd began to throw bricks and stones.  A pitched battle with the police ensued, and when the state militia fired into the crowd, more than twenty people were killed, most of them rioters and bystanders.

There were “no winners in the Astor Place riots,” Shapiro writes. The mayhem “brought into sharp relief the growing problem of income inequality in an America that preferred the fiction that it was still a classless society” (p.76).  But the riots also spoke to an “intense desire by the middle and lower classes to continue sharing the public space [of the theatre], and to oppose, violently if necessary, efforts to exclude them from it.  Shakespeare continued to matter and would remain common cultural property in America” (p.78).

In two other powerful chapters, Shapiro demonstrates how Shakespeare’s plays also intertwined with mid-19th-century America’s excruciating attempts to come to terms with racism and slavery.  One examines the public feud that abolitionist former president John Quincy Adams waged in the 1830s over what he considered the abominable interracial relationship Shakespeare depicts in Othello between Desdemona and the dark-skinned Othello.  In the second, Shapiro shows how, in a twist that was itself Shakespearean, fate linked President Abraham Lincoln, a man who loved Shakespeare and identified with Macbeth, to his assassin, second-rate Shakespearean actor John Wilkes Booth, himself obsessed with both Julius Caesar and what he perceived as Lincoln’s efforts to undermine the supremacy of the white race.

John Quincy Adams, who served as president from 1825 to 1829, found Desdemona’s physical intimacy with Othello, known at the time as “amalgamation” (“miscegenation” did not enter the national vocabulary until the 1860s), to be an “unnatural passion” against the laws of nature.  Adams’ views might have gone largely unnoticed but for an 1833 dinner party at which the 66-year-old former president was seated next to 23-year-old Fanny Kemble, a rising young Shakespearean actress from England.  Adams apparently thrust his views of the Othello-Desdemona relationship upon the unsuspecting Kemble.

Two years later, Kemble published a journal about her trip to the United States, in which she described her dinner conversation with the former president.  A piqued Adams felt compelled to respond, elaborating in print about how repellent he found the Desdemona-Othello relationship. The dinner conversation of two years earlier between the ex-president and the rising British actress thus became national news and, with it, Adams’ anxieties about not only the dangers of race-mixing but also the threat posed by disobedient women.

Yet the ex-president who was so firmly against amalgamation was also a firm abolitionist.  Adams’ abolitionist convictions, Shapiro writes, “seem to have required a counterweight, and he found it in this repudiation of amalgamation” (p.20).  By directing his hostility at Desdemona rather than Othello, moreover, Adams astutely sidestepped criticizing black men, and it “proved more convenient to attack a headstrong young fictional woman than a living one” (p.20).  Although Adams was a prolific writer, his public feud with Kemble represented his sole written attempt to square his disgust for interracial marriage with his abolitionist convictions, and he chose to do so “only through his reflections on Shakespeare” (p.20).

Abraham Lincoln, from humble frontier origins with almost no formal schooling, developed a life-long passion for Shakespeare as a youth.  Shapiro notes that the adult Lincoln regularly asked friends, family, government employees, and relative strangers to listen to him recite, sometimes for hours on end – and then discuss – the same few passages from Shakespeare again and again.  John Wilkes Booth too grew up with Shakespeare, but in altogether different circumstances.

Booth’s father owned a farm in rural Maryland but was also a leading English Shakespearean actor who immigrated to the United States and became a major figure on the American stage.  His three sons followed in their father’s footsteps, with older brothers Edwin and Junius attaining genuine star status, a status that eluded their younger brother John.  Although Maryland was a border state that did not join the Confederacy, John, who had been convinced from his earliest years that whites were superior to blacks, was naturally drawn to the Southern cause.

In 1864, the year of both Lincoln’s re-election and the 300th anniversary of Shakespeare’s birth, Booth was stalking Lincoln and plotting his removal with Confederate operatives.  Lincoln, who had less than six months to live when he was re-elected in November, found himself brooding more and more about Macbeth in his final months, and especially about the murdered King Duncan.  Through his reflection upon the guilt-ridden Macbeth, Shapiro writes, Lincoln felt the “deep connection between the nation’s own primal sin, slavery, and the terrible cost, both collective and personal, exacted by it” (p.113).

After Booth assassinated Lincoln at Ford’s Theatre in Washington in April 1865, many of Lincoln’s enemies likened the assassin, whose favorite play was Julius Caesar, to Brutus as a man who killed a tyrant.  But Macbeth proved to be the play that the nation settled on to “give voice to what happened, and define how Lincoln was to be remembered” (p.116).  Booth had “failed to anticipate that the man he cold-bloodedly murdered would be revered like Duncan, his faults forgotten” (p.118).  For a divided America, the universal currency of Shakespeare’s words offered what Shapiro terms a “collective catharsis” that permitted a “blood-soaked nation to defer confronting once again what Booth declared had driven him to action: the conviction that America ‘was formed for the white not for the black man’” (p.118).

The year 1916 marked the 300th anniversary of Shakespeare’s death, a year in which one of his last plays, The Tempest, was used to bolster the case for anti-immigration legislation.  The Tempest centers on Caliban, who is left behind, rather than on those who immigrate.  But the point is the same, Shapiro argues: a “more hopeful community . . . depends on somebody’s exclusion” (p.125).  This notion resonated in particular with Massachusetts Senator Henry Cabot Lodge, an avid Shakespeare reader who led the early 20th-century anti-immigration campaign.

The unusual number of performances of The Tempest during that tercentenary year meshed with the fierce debate that Lodge led in Congress over immigration.  The legislation that passed the following year curtailed the influx into the United States of immigrants representing “lesser races,” most frequently a reference to Southern and Eastern Europeans. “How Shakespeare and especially The Tempest were conscripted by those opposed to the immigration of those deemed undesirable is a lesser known part of this [immigration] story” (p.124), Shapiro writes.

Closer to the present, Shapiro has chapters on the 1948 Broadway musical Kiss Me, Kate (later a film), about the cast of a production of Shakespeare’s The Taming of the Shrew, which raised the issue of the roles of women in postwar society; and on the 1998 film Shakespeare in Love, by far the most successful film to date about Shakespeare or any of his plays, which began as a film about same-sex love but evolved into one about adultery.

Kiss Me, Kate takes place backstage at a performance of The Taming of the Shrew.  With music and lyrics by Cole Porter, the Broadway musical contrasted the emerging, post-World War II view of the role of women with the conventional stereotyped gender roles in the Shakespeare play itself, thereby featuring “rival visions of the choices women faced in postwar America” (p.160).  In Shakespeare’s play, “women are urged to capitulate and their obedience to men is the norm,” while backstage “independence and unconventionality hold sway” (p.160).  Kiss Me, Kate deftly juxtaposed a “front stage Shakespeare world that mirrored the fantasy of a patriarchal, all-white America” with a backstage one that was “forthright about a woman’s say over her desires and her career” (p.162).

In the earliest version of Shakespeare in Love, in 1992, Will found himself drawn to the idea of same-sex attraction (he was actually attracted to a woman disguised as a man; the point was that Will believed she was a he).  But same-sex love was reduced to a mere hint in the final version, which centers on how the unhappily married Will’s affair with another woman, Viola, helped him overcome his writer’s block, finish Romeo and Juliet, and go on to greatness.  Those creating and marketing Shakespeare in Love, Shapiro writes, “clearly felt that a gay or bisexual Shakespeare was not something that enough Americans in the late 1990s were ready to accept” (p.194).  For box-office success, “Shakespeare could be an adulterer, but he had to be a heterosexual one in a loveless marriage” (p.194).

Shakespeare in Love ends with Viola leaving Will and England for America, reinforcing a myth, persistent from the 1860s through the 1990s, of a direct American connection to Shakespeare — anti-immigration Senator Lodge was one of its most exuberant proponents.  This fantasy, Shapiro writes, speaks to our desire to “forge a physical connection between Shakespeare and America” as the land where his “inspiring legacy came to rest and truly thrived” (p.193).

* * *

While finding no credible evidence for a direct American  connection to Shakespeare, Shapiro sees a legacy in Shakespeare’s plays that should inspire Americans of all hues and stripes.  Pained by the polarization he witnessed at the 2017 Julius Caesar performance, Shapiro expresses the hope that his book might “shed light on how we have arrived at our present moment, and how, in turn, we may better address that which divides and impedes us as a nation” (p.xxix).  The hope seems forlorn in light of the examples he so brilliantly details, pointing mostly in the other direction: a Shakespeare on the cutting edge of America’s social and political divisions, with his plays often doing the cutting.

Thomas H. Peebles

Paris, France

September 19, 2021

[NOTE: A nearly identical version of this review has also been posted to the Tocqueville 21 Blog, maintained in connection with the American University of Paris’ Tocqueville Review and its Center for Critical Democracy Studies]

Converging Visions of Equality

Peniel E. Joseph, The Sword and the Shield:

The Revolutionary Lives of Malcolm X and Martin Luther King, Jr. (Basic Books)

[NOTE: A version of this review has been posted to the Tocqueville 21 blog: https://tocqueville21.com/books/king-malcolm-x-civil-rights/.  Tocqueville 21 takes its name from the 19th century French aristocrat who gave Americans much insight into their democracy.  It seeks to encourage in-depth thinking about democratic theory and practice, with particular but by no means exclusive emphasis on the United States and France.  The site is maintained in connection with the American University of Paris’ Tocqueville Review and its Center for Critical Democracy Studies].

Martin Luther King, Jr., and Malcolm X met only once, a chance encounter at the US Capitol on March 26, 1964.  The two men were at the Capitol to listen to a debate over what would become the Civil Rights Act of 1964, a measure that banned discrimination in employment, mandated equal access to most public facilities, and had the potential to be the most consequential piece of federal legislation on behalf of equality for African-Americans since the Reconstruction era nearly a century earlier.  There wasn’t much substance to the encounter. “Well, Malcolm, good to see you,” King said.  “Good to see you,” Malcolm responded. There may have been some additional light chitchat, but not much more.  Fortunately, photographers were present, and we are the beneficiaries of several iconic photos of the encounter.

That encounter at the Capitol constitutes the starting point for Peniel Joseph’s enthralling The Sword and the Shield: The Revolutionary Lives of Malcolm X and Martin Luther King, a work that has some of the indicia of a dual biography, albeit highly condensed.  But Joseph, a professor at the University of Texas at Austin who has written prolifically on modern African American history, places his emphasis on the two men’s intellectual journeys.  Drawing heavily from their speeches, writings and public debates, Joseph challenges the conventional view of the two men as polar opposites who represented competing visions of full equality for African Americans.  The conventional view misses the nuances and evolution of both men’s thinking, Joseph argues, obscuring the ways their politics and activism came to overlap.  Each plainly influenced the other.  “Over time, each persuaded the other to become more like himself” (p.13).

My final stages of this review on the convergence of the two men’s thinking coincided with the trial of Derek Chauvin for the killing of George Floyd last May, along with the recent killing of still another black man, Daunte Wright, in the same Minneapolis metropolitan area.  Watching and reading about events in Minneapolis, I couldn’t help concluding that the three familiar words “Black Lives Matter”  –  the movement that led demonstrations across the country and the world last year to protest the Floyd killing — also neatly encapsulate the commonalities that Joseph identifies in The Sword and the Shield.

* * *

In March 1964, King was considered the “single most influential civil rights leader in the nation” (p.2), Joseph writes, whereas Malcolm, an outlier in the mainstream civil rights movement, was “perhaps the most vocal critic of white supremacy ever produced by black America” (p.4).    The two men shared extraordinary rhetorical and organizational skills.  Each was a charismatic leader and deep thinker who articulated in galvanizing terms his vision of full equality for African Americans.  But these visions sometimes appeared to be not just polar opposites but mutually exclusive.

In the conventional view of the time, King, the Southern Baptist preacher with a Ph.D. in theology, deserved mainstream America’s support as the civil rights leader who sought integration of African Americans into the larger white society, and unfailingly advocated non-violence as the most effective means to that end.  White liberals held King in high esteem for his almost religious belief in the potential of the American political system to close the gap between its lofty democratic rhetoric and the reality of pervasive racial segregation, discrimination and second-class citizenship, a belief Malcolm considered naïve.

A high school dropout who had served time in jail, Malcolm became the most visible spokesman for the Nation of Islam (NOI), an idiosyncratic American religious organization that preached black empowerment and racial segregation.  Often termed a “black nationalist,” Malcolm found the key to full equality in political and economic empowerment of African American communities.  He considered racial integration a fool’s errand and left open the possibility of violence as a means of defending against white inflicted violence.  He seemed to embrace some form of racial separation as the most effective means to achieve full equality and improve the lives of black Americans – a position that the media found to be ironically similar to that of the hard-core racial segregationists with whom both he and King were battling.

But Joseph demonstrates that Malcolm was moving in King’s direction at the time of their March 1964 encounter.  Coming off a bitter fallout with the NOI and its leader, Elijah Muhammad, he had cut his ties with the organization just months before the encounter.  He had traveled to Washington to demonstrate his support for the civil rights legislation under consideration.  Thinking he could make a contribution to the mainstream civil rights movement, Malcolm sought an alliance with King and his allies.  Although that alliance never materialized, King began to embrace positions identified with Malcolm after the latter’s assassination less than 11 months later, stressing in particular that economic justice needed to be a component of full equality for African Americans.  King also became an outspoken opponent of American involvement in the war in Vietnam, of which Malcolm had long been critical.

Singular events had thrust both men onto the national stage.  King rose to prominence as a newly-ordained minister who at age 26 became the most audible voice of the 1955-56 Montgomery, Alabama, bus boycott, after Rosa Parks famously refused to give up her seat on a public bus to a white person.  Malcolm’s rise to fame came in 1959 through a nationally televised 5-part documentary on the NOI, The Hate That Hate Produced, hosted by then little-known Mike Wallace.  The documentary was an immediate sensation.  It was a one-sided indictment of the NOI, Joseph indicates, intended to scare and outrage whites.  But it made Malcolm and his NOI boss Elijah Muhammad heroes within black communities across the country.  King seemed to buy into the documentary’s theme, describing the NOI as an organization dedicated to “black supremacy,” which he considered “as bad as white supremacy” (p.85).

But even at this time, each man had connected his US-based activism to anti-colonial movements that were altering the face of Africa and Asia.  Both recognized that the systemic nature of racial oppression “transcended boundaries of nation-states” (p.73).  Malcolm made his first trip abroad in 1959, to Egypt and Nigeria.  The trip helped him “internationalize black political radicalism,” by linking domestic black politics to the “larger world of anti-colonial and Third World liberation movements” (p.18-19), as Joseph puts it.  King, whose philosophy of non-violence owed much to Mahatma Gandhi, visited India in 1959, characterizing himself as a “‘pilgrim’ coming to pay homage to a nation liberated from colonial oppression against seemingly insurmountable odds” (p.80).  After the visit, he “proudly claimed the Third World as an integral part of a worldwide social justice movement” (p.80).

After his break with the NOI and just after his chance encounter with King at the US Capitol, Malcolm took a transformative five-week tour of Africa and the Middle East in the spring of 1964.  The tour put him on the path to becoming a conventional Muslim and prompted him to back away from anti-white views he had expressed while with the NOI.  In Mecca, Saudi Arabia, he professed to see “sincere and true brotherhood practiced by all colors together, irrespective of their color.” (p.188).   He went on to Nigeria and “dreamed of becoming the leader of a political revolution steeped in the anti-colonial fervor sweeping Africa” (p.191).  Malcolm’s time in Africa, Joseph concludes, “changed his mind, body, and soul . . . The African continent intoxicated Malcolm X and informed his political dreams” (p.192-93).

By the time of their March 1964 meeting, moreover, the two men had begun to recognize each other’s potential.  After over a decade of forcefully criticizing the mainstream civil rights movement, Malcolm now recognized King’s goals as his own but chose different methods to get there.  Malcolm also had a subtle effect on King.  The “more he ridiculed and challenged King publicly,” Joseph writes, the more King “reaffirmed the strength of non-violence as a weapon of peace capable of transforming American democracy” (p.155).  King for his part had begun to look outside the rigidly segregated South and toward major urban centers in the North, Malcolm’s bailiwick, as possible sites of protest that would expand the freedom struggle beyond its southern roots.

Joseph cites three instances in which Malcolm extended written invitations to King, all of which went unanswered. But in early February 1965, after Malcolm had participated in a panel discussion with King’s wife, King concluded that the time had come to meet with his formidable peer.  Later that month, alas, Malcolm was gunned down in New York, almost certainly the work of the NOI, although details of the assassination remain murky to this day.

In the three years remaining to him after Malcolm’s assassination, King borrowed liberally from the black nationalist’s playbook, embracing in particular the notion of economic justice as a necessary component of full equality for African Americans.  Although he never wavered in his commitment to non-violence, King saw his cause differently after the uprising in the Watts section of Los Angeles in the summer of 1965.  Watts “transformed King,” Joseph writes, making clear that civil unrest in Northern cities was a “product of institutional racism and poverty that required far more material and political resources than ever imagined by the architects of the Great Society” (p.235).  King also began to speak out publicly in 1965 against the escalation of America’s military commitment in Vietnam, marking the beginning of the end of his close relationship with President Johnson.

King delivered his most pointed criticism of the war on April 4, 1967, precisely one year prior to his assassination, at the Riverside Church in New York City, abutting Harlem, Malcolm’s home base.  Linking the war to the prevalence of racism and poverty in the United States, King lamented the “cruel irony of watching Negro and white boys on TV screens as they kill and die together for a nation that has been unable to seat them together in the same schools.” (p.267).  Joseph terms King’s Riverside Church address the “boldest political decision of his career” (p.268).  It was the final turning point for King, marking his formal break with mainstream politics and his “full transition” from a civil rights leader to a “political revolutionary” who “refused to remain quiet in the face of domestic and international crises” (p.268).

After Riverside, in his last year, King became what Joseph describes as America’s “most well-known anti-war activist” (p.271).  King lent a Nobel Prize-winner’s prestige to a peace movement struggling to find its voice at a time when most Americans still supported the war.  Simultaneously, he pushed for federally guaranteed income, decent and racially integrated housing and public schools — what he termed a “revolution of values” (p.287).  During this period, Stokely Carmichael, who once worked with King in Mississippi (and is the subject of a Joseph biography), coined the term “Black Power.”  In Joseph’s view, the Black Power movement represented the natural extension of Malcolm’s political philosophy, post-Malcolm. Although King frequently criticized the movement in his final years, he nonetheless found himself in agreement with much of its agenda.

In his final months, King supported a Poor People’s march on Washington, D.C.  He was in Memphis, Tennessee in April 1968 on behalf of striking sanitation workers, overwhelmingly African-American, who held jobs but were seeking better salaries and more humane working conditions, when he too was felled by an assassin’s bullet.

* * *

After reading Joseph’s masterful synthesis, it is easy to imagine Malcolm supporting King’s efforts in Memphis that April.  And if the two men were still with us today, it is equally easy to imagine both embracing warmly the “Black Lives Matter” movement.

Thomas H. Peebles

La Châtaigneraie, France

April 20, 2021

Digging Deeply Into The Idea of Democracy

James Miller, Can Democracy Work?

A Short History of a Radical Idea, From Ancient Athens to Our World

(Farrar, Straus & Giroux)

and

William Davies, Nervous States:

Democracy and the Decline of Reason

(W.W. Norton & Co.)

[NOTE: A condensed version of this review has also been posted to a blog known as Tocqueville 21: https://tocqueville21.com/books/can-democracy-work.  Taking its name from the 19th century French aristocrat who gave Americans much insight into their democracy, Tocqueville 21 seeks to encourage in-depth thinking about democratic theory and practice, with particular but by no means exclusive emphasis on the United States and France.  The site is maintained in connection with the American University of Paris’ Tocqueville Review and its Center for Critical Democracy Studies.  I anticipate regular postings on Tocqueville 21 going forward.]

Did American democracy survive the presidency of Donald Trump?  Variants on this question, never far from the surface during that four-year presidency, took on terrifying immediacy in the wake of the assault on the US Capitol this past January. The question seems sure to occupy historians, commentators and the public during the administration of Joe Biden and beyond.  If nothing else, the Trump presidency and now its aftermath bring home the need to dig deeply into the very idea of democracy, looking more closely at its history, theory, practice, and limitations, asking what are its core principles and what it takes to sustain them.  But we might shorten the inquiry to a single, pragmatic question: can democracy work?

This happens to be the title of James Miller’s Can Democracy Work? A Short History of a Radical Idea, From Ancient Athens to Our World.  But it could also be the title of William Davies’ Nervous States: Democracy and the Decline of Reason. The two works, both written during the Trump presidency, fall short of providing definitive or even reassuring answers to the question that Miller, professor of politics and liberal studies at New York’s New School for Social Research, has taken for his title.  But each casts enriching yet altogether different light on democratic theory and practice.

Miller’s approach is for the most part historical. Through a series of selected – and by his own admission “Eurocentric” (M.12) — case studies, he explores how the term “democracy” has evolved over the centuries, beginning with ancient Athens.  The approach of Davies, a political economist at Goldsmiths, University of London, is more difficult to categorize, but might be described as philosophical.  It is grounded in the legacy of 17th century philosophers René Descartes and Thomas Hobbes, his departure point for a complex and not always easy to follow explanation of the roots of modern populism, that combustible mixture of nostalgia, resentment, anger and fear that seemed to have triumphed at the time of the 2016 Brexit vote in Great Britain and the election of Donald Trump in the United States later that year.  Davies is most concerned about two manifestations of the “decline of reason,” his subtitle: the present day lack of confidence and trust in experts and democratically elected representatives; and the role of emotion and fear in contemporary politics.

Miller frames his historical overview with a paradox: despite blatant anti-democratic tendencies across the globe, a generalized notion of democracy as the most desirable form of government retains a strong hold on much, maybe most, of the world’s population.  From Myanmar and Hong Kong to the throng that invaded the US Capitol in January, nearly every public demonstration against the status quo utilizes the language of democracy.  Almost all the world’s political regimes, from the United States to North Korea, claim to embody some form of democracy.  “As imperfect as all the world’s systems are that claim to be democratic,” Miller writes, in today’s world the ideal of democracy is “more universally honored than ever before in human history” (M.211).

But the near-universal adhesion to this ideal is relatively recent, dating largely from the period since World War II, when the concept of democracy came to embrace self-determination of populations that previously had lived under foreign domination.  Throughout most of history, democracy was associated with the danger of mob rule, often seen as a “virtual synonym for violent anarchy” (M.59).   Modern democracy in Miller’s interpretation begins with the 18th century French and American Revolutions.  Revolts against the status quo are the heart of modern democracy, he contends.  They are not simply blemishes on the “peaceful forward march toward a more just society” (M.10).  Since the early 19th century, representative government, where voters elect their leaders  — “indirect democracy” – has come to be considered the only practical form of democratic governance for populous nation-states.

* * *

But in 5th and 4th century BCE Athens, where Miller’s case studies begin, what we now term direct democracy prevailed.  More than any modern democracy, a community of near absolute equality existed among Athenian citizens, even though citizenship was tightly restricted, open only to a fraction of the adult male population.  Many of Athens’ rivals, governed by oligarchs and aristocrats, considered the direct democracy practiced in Athens as a formula for mob rule, a view that persisted throughout the intervening centuries.  By the late 18th century, however, a competing view had emerged in France that some sort of democratic rule could serve as a check on monarchy and aristocracy.

In revolutionary Paris in early 1793, in the midst of the bloodiest phase of the French Revolution, the Marquis de Condorcet led the drafting of a proposed constitution that Miller considers the most purely democratic instrument of the 18th century and maybe of the two centuries since.  Condorcet’s draft constitution envisioned a wide network of local assemblies in which any citizen could propose legislation.  Although not implemented, the thinking behind Condorcet’s draft gave impetus to the notion of representative government as a system “preferable to, and a necessary check on, the unruly excesses of a purely direct democracy” (M.86).

The debate in the early 19th century centered on suffrage, the question of who gets to vote, with democracy proponents pushing to remove or lessen property requirements in order to extend the franchise to ever-wider segments of the (male) adult population.  A cluster of additional institutions and practices came to be considered essential to buttress an extended franchise, among them free and fair elections, protection of the human rights of all citizens, and adherence to the rule of law.  But Miller’s 19th century case studies are instances of short-term setbacks for the democratic cause: the failure of the massive popular movement known as Chartism to extend the franchise significantly in Britain in the 1840s; and the 1848 uprisings across the European continent, at once nationalist and democratic, which sought representative political institutions and something akin to universal male suffrage, but failed everywhere but in France to extend the franchise.

In the second half of the 19th century, moreover, proponents of democracy found themselves confronting issues of economic freedom and social justice in a rapidly industrializing Europe.  Karl Marx, for one, whose Communist Manifesto was published in 1848, doubted whether democracy – “bourgeois democracy,” he termed it – could alleviate widespread urban poverty and the exploitation of workers.  But the most spectacular failure among Miller’s case studies was the Paris Commune of 1871, which collapsed into disastrous violence amidst tensions between economic and political freedom.  Ironically, the fear of violence that the Commune unleashed led to a series of democratizing political reforms throughout Europe, with the right to vote extended to more male citizens.  The organization of workers into unions and the rise of political parties complemented extension of the franchise and contributed to the process of democratization in late 19th and early 20th century Europe.

In the United States, a case apart in Miller’s case studies, a genuinely democratic culture had taken hold by the 1830s, as the young French aristocrat Alexis de Tocqueville recognized during his famous 1831-32 tour, ostensibly to study prison conditions.  As early as the 1790s, there was a tendency to use the terms “republic” and “democracy” as synonyms for the American constitutional system, even though none of the drafters of the 1787 Constitution thought of himself as a democrat.  James Madison derided what he termed pure democracies, “which have ever been spectacles of turbulence and contention” (M.99).  The constitution’s drafters envisioned a representative government in which voters would select a “natural aristocracy,” as John Adams put it, comprising “men of virtue and talent, who would govern on behalf of all, with a dispassionate regard for the common good” (M.92).

The notion of a natural aristocracy all but disappeared when Andrew Jackson split Thomas Jefferson’s Democratic-Republican Party in two in his successful run for the presidency in 1828.  Running as a “Democrat,” Jackson confirmed that “democracy” from that point forward would be an “unambiguously honorific term in the American political lexicon” (M. 110), Miller writes.  It was during Jackson’s presidency that Tocqueville arrived in the United States.

Aware of how the institution of slavery undermined America’s democratic pretensions, Tocqueville nonetheless saw in the restlessness of Jacksonian America what Miller describes as a “new kind of society, in which the principle of equality was pushed to its limits” (M.115).  As practiced in America, democracy was a “way of life, and a shared faith, instantiated in other forms of association, in modes of thought and belief, in the attitudes and inclinations of individuals who have absorbed a kind of democratic temperament” (M.7).  Tocqueville nonetheless seemed to have had the Jacksonian style of democracy in mind when he warned against what he called “democratic despotism,” where a majority could override the rights and liberties of minorities.

Woodrow Wilson’s plea in 1917 to the US Congress that the United States enter World War I to “make the world safe for democracy” constitutes the beginning of the 20th century idea of democracy as a universal value, Miller argues.  But Wilson’s soaring faith in democracy turned out to be “astonishingly parochial” (M.176).  The post-World War I peace conferences in 1919 left intact the colonies of Britain and France, “under the pretext that the nonwhite races needed more time to become fully mature peoples, fit for democratic institutions” (M.190-91).

The Covenant of the League of Nations, the organization that Wilson hoped would be instrumental in preventing future conflict, “encouraged an expectation of self-determination as a new and universal political right” (M.191), even as the isolationist Congress thwarted Wilson’s plan for United States membership in the League.  For countries living under colonial domination, the expectation of self-determination was heightened after the more murderous World War II, particularly through the 1948 United Nations’ Universal Declaration of Human Rights.  Although a text without enforcement mechanisms, the declaration helped inspire human rights and independence movements across the globe.

Miller finishes by explaining why he remains attracted to modern attempts at direct democracy, resembling in some senses those of ancient Athens, particularly the notion of “participatory democracy” which influenced him as a young 1960s radical and which he saw replicated in the Occupy Wall Street Movement of ten years ago.  But direct democracy, he winds up concluding, is no more viable today than it was at the time of the French Revolution. It is not possible to create a workable participatory democracy model in a large, complex society.  Any “serious effort to implement such a structure will require a delegation of authority and the selection of representatives – in short the creation of an indirect democracy, and at some distance from most participants”  (M.232-33).

The Trump presidency, Miller argues, is best considered “not as a protest against modern democracy per se, but against the limits of modern democracy” (M.239).  Like Brexit, it expressed, in an “inchoate and potentially self-defeating” manner, a desire for “more democracy, for a larger voice for ordinary people” (M.240) – not unlike the participatory democracy campaigns of the 1960s.  At the time of Trump’s January 2017 inauguration, Miller appreciated that he remained free to “protest a political leader whose character and public policies I found repugnant.”  But he realized that he was “also expected to acknowledge, and peacefully coexist with, compatriots who preferred Trump’s policies and personal style.  This is a part of what it means to be a citizen in a liberal democracy” (M.240)  —  a portentous observation in light of the January 2021 assault on the US Capitol.

Democracies, Miller concludes, need to “explore new ways to foster a tolerant ethos that accepts, and can acknowledge, that there are many incompatible forms of life and forms of politics, not always directly democratic or participatory, in which humans can flourish” (M.234).  Although he doesn’t say so explicitly, this sounds much like an acknowledgement that present day populism is here to stay.  By an altogether different route, Davies reaches roughly the same conclusion.

* * *

Davies is far from the first to highlight the challenges to democracy when voters appear to abandon reason for emotion; nor the first to try to explain why the claims of government experts and elected representatives are met with increased suspicion and diminished trust today.  But he may be the first to tie these manifestations of the “decline of reason” to the disintegration of binary philosophical distinctions that Descartes and Hobbes established in the 17th century — Descartes between mind and body, Hobbes between war and peace.

For Descartes, the mind existed independently of the body.  Descartes was obsessed by the question whether what we see, hear, or smell is actually real.  He “treated physical sensations with great suspicion, in contrast to the rational principles belonging to the mind” (D.xiii).  Descartes gave shape to the modern philosophical definition of a rational scientific mind, Davies argues, but to do so, he had to discount sensations and feelings.  Hobbes, exhausted by the protracted religious Thirty Years’ War on the European continent and civil wars in England, argued that the central purpose of the state was to “eradicate feelings of mutual fear that would otherwise trigger violence” (D.xiii).  If people don’t feel safe, Hobbes seemed to contend, it “doesn’t matter whether they are objectively safe or not; they will eventually start to take matters into their own hands” (D.xvi).

Davies shows how Descartes and Hobbes helped create the conceptual foundation for the modern administrative state, fashioned by merchants who introduced “strict new rules for how their impressions should be recorded and spoken of, to avoid exaggeration and distortion, using numbers and public record-keeping” (D.xiii), not least for more efficient tax collection.  Using numbers in this pragmatic way, these 17th century merchants were the forerunners of what we today call experts, especially in the disciplines of statistics and economics, with an ability to “keep personal feelings separate from their observations” (D.xiii).

The conclusions of such experts, denominated and accepted as “facts,” established the value of objectivity in public life, providing a basis for consensus among people who otherwise have little in common.  Facts provided by economists, statisticians, and scientists thus have what for Hobbes was a peace-building function; they are “akin to contracts, types of promises that experts make to each other and the public, that records are accurate and free from any personal bias or political agenda” (D.124), Davies explains.  But if democracy is to provide effective mechanisms for the resolution of disputes and disagreements, there must be “some commonly agreed starting point, that all are willing to recognize,” he warns. “Some things must be outside politics, if peaceful political disputes are to be possible” (D.62).

Davies makes the bold argument that the rise of emotion in contemporary politics and the inability of experts and facts to settle disputes today are the consequences of the breakdown of the binary distinctions of Descartes and Hobbes.  The brain, through rapid advances in neuroscience, rather than Descartes’ concept of mind, has become the main way we have come to understand ourselves, demonstrating the “importance of emotion and physiology to all decision making” (D.xii).  The distinction between war and peace has also become less clear-cut since Hobbes’ time.

Davies is concerned particularly with how the type of knowledge used in warfare has been coopted for political purposes. Warfare knowledge doesn’t have the luxury of “slow, reasonable open public debate of the sort that scientific progress has been built upon.”  It is “shrouded in secrecy, accompanied by deliberate attempts to deceive the enemy. It has to be delivered at the right place and right time” (D.124), with emotions playing a crucial role.  Military knowledge is thus weaponized knowledge.  Political propaganda has all the indicia of military knowledge at work for political advantage.  But so does much of today’s digital communication.  Political argument conducted online “has come to feel more like conflict” (D.193), Davies observes, with conspiracy theories in particular given wide room to flourish.

The upshot is that democracies are being transformed today by the power of feeling and emotion, in “ways that cannot be ignored or reversed” (D. xvii-xviii).  Objective claims about the economy, society, the human body and nature “can no longer be successfully insulated from emotions”  (D.xiv).  While we can lament the decline of modern reason, “as if emotions have overwhelmed the citadel of truth like barbarians” (D.xv), Davies suggests that we would do better to “value democracy’s capacity to give voice to fear, pain and anxiety that might otherwise be diverted in far more destructive directions”  (D.xvii).

Yet Davies leaves unanswered the question whether there are limits on the forms of fear, pain and anxiety to which democracy should give voice.  He recognizes the potency of nationalism as a “way of understanding the life of society in mythical terms” (D.87).  But should democracy strive to give voice to nationalism’s most xenophobic and exclusionary forms?  Nowhere does he address racism, which, most social scientists now agree, was a stronger contributing factor to the 2016 election of Donald Trump than economic disparity, and it is difficult to articulate any rationale for giving racism a voice in a modern democracy.

In countering climate change skepticism, a primary example of popular mistrust of expert opinion and scientific consensus, Davies rejects renewed commitment to scientific expertise and rational argument – “bravado rationalism,” he calls it  — as insufficient to overcome the “liars and manipulators” (D.108) who cast doubt on the reality of climate change.  But he doesn’t spell out what would be sufficient. The book went to press prior to the outbreak of the Coronavirus pandemic.  Were Davies writing today, he likely would have addressed similar resistance to expert claims about fighting the pandemic, such as the efficacy of wearing masks.

Writing today, moreover, Davies might have used an expression other than “barbarians storming the citadel of truth,” an expression that now brings to mind last January’s assault on the US Capitol.  While those who took part in the assault itself can be dealt with through the criminal justice process, with all the due process protections that a democracy affords accused law breakers, an astounding number of Americans who did not participate remain convinced that, despite overwhelming empirical evidence to the contrary, Joe Biden and the Democrats “stole” the 2020 presidential election from Donald Trump.

* * *

How can a democracy work when there is widespread disagreement with an incontrovertible fact, especially one that goes to democracy’s very heart, in this case the result of the vote and the peaceful transfer of power after an orderly election?  What if a massive number of citizens refuse to accept the obligation that Miller felt when his candidate lost in 2016, to acknowledge and peacefully coexist with the winning side?  Davies’ trenchant but quirky analysis provides no obvious solution to this quandary.  If we can find one, it will constitute an important step in answering the broader question whether American democracy survived the Trump presidency.

 

Thomas H. Peebles

La Châtaigneraie, France

March 17, 2021

 


What Did Chuckie Know?

 

Jack Goldsmith, In Hoffa’s Shadow:

A Stepfather, a Disappearance in Detroit, and My Search for Truth

(Farrar, Straus & Giroux)

Until his conviction on jury tampering charges and subsequent imprisonment in 1967, James R. Hoffa had been head of one of America’s most powerful unions, the International Brotherhood of Teamsters, the union of America’s truck drivers.  On July 30, 1975, Hoffa got into a car in a parking lot of a suburban Detroit restaurant and was never seen again.  To this day, and after more than four decades of investigation by top American law enforcement agencies, led by the FBI, we still do not know precisely what happened to Hoffa or who was responsible for his disappearance.  His body has never been recovered.  But for the better part of four decades, a prime suspect as an accomplice – someone likely to know most of the details behind the disappearance – was one Charles Lenton O’Brien, usually referred to as “Chuckie.”

Chuckie was suspected of having been the driver of the car that picked up Hoffa in the Detroit parking lot and drove him to his death.  Prior to the union leader’s jail sentence, Chuckie had been Hoffa’s assistant and since age seven had been exceptionally close to Hoffa personally, to the point that he considered Hoffa to be his stepfather.  Although his responsibilities to Hoffa varied — he was skilled at hard-knuckle “negotiations” with strikebreakers, for example — Chuckie was often perceived as the union leader’s chauffeur, since he drove Hoffa on much of his official business.  In the FBI’s view, Chuckie was the only person with whom Hoffa would have voluntarily gotten into a car.

Moreover, Chuckie had been in the same parking lot on the morning of the disappearance.  He had delivered a load of fresh salmon that afternoon from the Teamsters’ headquarters in downtown Detroit to the home of a Teamsters official who lived not far from the restaurant.  The car he used to deliver the salmon belonged to Joey Giacalone, the son of Detroit Mafia boss Anthony (“Tony”) Giacalone, the man Hoffa thought he was to meet with on that July afternoon and a leading suspect in the case, along with East Coast underworld figure Anthony Provenzano.  The FBI later detected Hoffa’s hair and scent in the car.   On the afternoon of the 30th,  Chuckie was not seen by anyone else in the crucial period  that encompassed the disappearance.  Further, he was known to have had a falling out with his former boss the previous year.

Despite this powerful circumstantial evidence, Chuckie was never indicted and adamantly maintained his innocence, up to his own death last winter at age 86.  Months before he died, Chuckie received high-powered support for his claim of innocence in the form of Harvard Law School professor Jack Goldsmith’s In Hoffa’s Shadow: A Stepfather, a Disappearance in Detroit, and My Search for Truth, which painstakingly seeks to explain how and why the FBI, the US Department of Justice and American law enforcement got the Hoffa case so wrong for so long by focusing on Chuckie when they should have set their sights elsewhere.  Goldsmith would seem to be a stellar candidate to argue on behalf of Chuckie.

Prior to moving to Harvard in 2004, Goldsmith served as head of the Department of Justice’s prestigious and powerful Office of Legal Counsel, an office that arbitrates legal issues involving power and authority within the executive branch of the United States government.  Goldsmith’s predecessors in that position include former Supreme Court justices William Rehnquist and Antonin Scalia.  Goldsmith came to OLC in 2003 in the immediate aftermath of the United States’ invasion of Iraq earlier that year.  In a short but controversial tenure at OLC of about nine months, Goldsmith dealt with some of the most contentious legal issues generated by the United States’ post-9/11 “war on terror.”  He and then-Deputy Attorney General James Comey famously confronted personnel from the White House in March 2004 at the hospital bed of a seriously ill Attorney General John Ashcroft and talked Ashcroft out of renewing a secret government surveillance program that Goldsmith and Comey had concluded was inconsistent with governing law.

But Goldsmith is more than just a high-powered legal beagle and Chuckie’s de facto lawyer.  He is also Chuckie’s stepson.  The word “stepfather” in Goldsmith’s title could apply loosely to Chuckie’s relationship to Hoffa, but applies in the strict legal sense to Goldsmith’s relationship to Chuckie, who stepped into Goldsmith’s life “seemingly from nowhere” (p.15) when he entered the life of Goldsmith’s mother Brenda in 1974, after Brenda’s first two marriages failed.  Brenda’s first husband, Goldsmith’s father, left her when Goldsmith was young, just as Chuckie’s father had left him and his mother at about the same age.  Brenda then married a doctor, but that marriage did not work out either.  Chuckie and Brenda married in Memphis in June 1975, one month prior to Hoffa’s disappearance.  Brenda had three young sons at the time – author Goldsmith was twelve years old, and his brothers Brett and Steven were nine and seven respectively.

While Goldsmith’s core purpose is to make the case for Chuckie’s innocence in the Hoffa disappearance, his book is in no small part also a heartfelt personal memoir about his relationship to Chuckie, a loving stepfather yet a man Goldsmith describes as a “hapless blabbermouth with famously terrible judgment” and an “uneducated serial lawbreaker” (p.9) who served two prison terms.  The largest segment of the book details Chuckie’s own life story, intertwined with that of Hoffa and the “complex legacy” that Hoffa “bequeathed to the American labor movement and American justice” (p.9).  But the juiciest segment comes at the end, as Goldsmith probes his stepfather to reveal more of what he knew about Hoffa’s disappearance – the “My Search for the Truth” portion of the book’s subtitle.  Goldsmith skillfully weaves these disparate strands into an absorbing, stranger-than-fiction narrative; you can’t make this stuff up.

* * *

Chuckie was seven when his father abruptly left him and his mother, Sylvia Pagano.  Sylvia, whose husband and father both had links to organized crime in Kansas City, moved in the early 1940s from Kansas City to Detroit with her young son. There she met Hoffa, an energetic and fast rising organizational dynamo with Detroit’s Teamster Local 299.  They became personal friends, although Goldsmith rejects the notion that the relationship was ever anything more.  As Hoffa rose within the Teamsters, he took Sylvia’s son under his wing, serving as a surrogate father to young Chuckie.  From the time Hoffa met Chuckie as a boy and continuing into his adult years, he showed Chuckie “solicitude, patience, and affection that he showed no one else in his life except for his daughter, Barbara” (p.82).  Chuckie in turn “loved Jimmy Hoffa more than anyone and would do anything he asked” (p.83).

Sylvia, who is often credited with convincing Hoffa he could do business with organized crime, was also a friend of Anthony Giacalone.  Like Hoffa, Giacalone, too, often stepped in to help Sylvia with her parenting responsibilities by taking young Chuckie under his wing, to the point that Chuckie referred to Giacalone as “Uncle Tony.”  If Hoffa was Chuckie’s stepfather, Chuckie later told his own stepson, Uncle Tony was his godfather.  From Hoffa and Giacalone, the two most significant male figures in his life as a boy and young man, Chuckie absorbed the value of what the Sicilians call Omertà.

Omertà in the organized crime world is a code of silence, an ability to recognize and keep quiet about those matters that, in the Mafia euphemism, “shouldn’t be talked about.”  Chuckie learned early in life to avoid being a “rat,” someone who did not respect the code of silence.  But Omertà is little more than an extreme form of the loyalty that was the “core commitment” (p.249) in his stepfather’s life as he grew from a neglected little boy into a young man.  Chuckie, Goldsmith writes, “yearned for affection and sought it by loving those he cared for with intense fidelity and by doing his all to please them” — the same qualities that made Chuckie “such a great father to me decades later” (p.83).

Goldsmith recalls being much happier, even joyful, after Chuckie’s arrival in his boyhood home, having never previously experienced fatherly attention.  His new stepfather “smothered me in love that he never received from his father, and taught me right from wrong even though he had trouble distinguishing the two in his own life” (p.5).  Chuckie was involved in Goldsmith’s youth sports and almost all his other activities, except homework – that was not his thing.  In 1976, Chuckie formally adopted Brenda’s boys, who changed their last name to “O’Brien.”  Chuckie joined the Goldsmith family about nine months prior to Hoffa’s disappearance, when Goldsmith was twelve years old.  During Goldsmith’s adolescence, his stepfather was fighting for his exoneration – and in some senses for his life — as a prime suspect in his former boss’ disappearance.

While Chuckie was trying to establish his innocence, Goldsmith went off to college and began to see his stepfather differently.  When he was a first-year student at Yale Law School, Goldsmith decided that he wished to change his legal name from O’Brien back to Goldsmith.  It was, Goldsmith explains, part of an effort to “cut Chuckie out of my life completely.”  Chuckie had done “nothing affirmatively to hurt me, and indeed had only ever shown me love.  But ambition augmented by feelings of moral superiority blinded me to my true motives or to the effect of my action on him or my family” (p.31-32).  During his time at Yale, the height of the Reagan era, Goldsmith became more conservative politically.  He took the side of business in labor disputes, and sided generally with the prosecution rather than criminal defendants.  At that point, Goldsmith by his own description was “entirely self-absorbed . . . focused on my [career] prospects, my girlfriend, and not much else” (p.30).

As his legal career took off over the course of the next two decades, Goldsmith had little contact with his stepfather.  It was not until late 2004, after Goldsmith himself had married, had a family of his own, and had left the Department of Justice for Harvard that he sought Chuckie’s forgiveness for the long estrangement.  Chuckie “accepted me back into his life without qualification, rancor, or drama,” Goldsmith writes.  He “acted as if those twenty years hadn’t happened” (p.41).

* * *

Chuckie began work with the Teamsters upon graduation from high school, with ambitions to become a union organizer like his putative stepfather.  But he was continually passed over for the position he was most interested in, the leadership of Detroit’s Teamster Local 299, the local Hoffa had headed before he became the Teamsters’ national president in 1957.  Although Chuckie studied Hoffa’s methods closely, he “didn’t grasp the finer points of labor organizing or union finances, he wasn’t a charismatic speaker, he often didn’t follow through on commitments, and he lacked good judgment” (p.80), Goldsmith writes.  He had a “knucklehead charm and undoubted goodwill, and most people liked him despite his shortcomings.  But when he tried to mimic Jimmy Hoffa, Chuckie often fell on his face” (p.80).  Hoffa was a “deadly serious man who suffered no fools and in labor matters surrounded himself with learned professionals” (p.82).  As much as he loved Chuckie, Hoffa was not prepared to allow him to rise in the union beyond his competence level.

Hoffa combined the “business sense of an industrial tycoon with the political instinct of a big city boss, and the showmanship of a vaudeville entertainer” (p.80), according to one account.  He “identified with struggling workers and possessed an angry intensity about righting power imbalances in the workplace” (p.51).  Hoffa earned the admiration of the union rank and file, some 400,000 truckers by the early 1960s, by securing better hours and equipment for union members and winning impressive health, pension, and vacation benefits —  in essence winning a place for his members in the middle class, giving them what Chuckie described as a “dignity they never imagined possible” (p.75).

But Hoffa found himself in frequent conflict with local and regional leaders.  In part to secure needed support from several East Coast locals that were already controlled by organized crime, Hoffa lent liberally from union pension funds to organized crime figures.  Hoffa was “remarkably candid in defense of these arrangements,” Goldsmith learned from Chuckie.  He “always claimed he was simply adapting his labor goals to the power reality on the ground” (p.88).  Among Chuckie’s multiple duties as Hoffa’s assistant, he often served as a conduit between his boss and Mafia figures — the “union side” and the “Sicilian side” (p.140), as Chuckie put it.

Hoffa’s colossal downfall began in Washington in 1957, when Robert F. Kennedy, then an ambitious United States Senate staff member for what was known as the McClellan Committee, zealously sought to expose links between organized labor and organized crime in contentious and highly publicized hearings. When Kennedy became Attorney General in his brother John’s administration in 1961, his Department of Justice targeted Hoffa for prosecution, a continuation of what Goldsmith terms a personal vendetta “probably without parallel in American history” (p.98).

Kennedy’s single-minded focus on Hoffa constitutes the “paradigmatic case in American history of wielding prosecutorial power to destroy a person rather than pursue a crime” (p.120), as Goldsmith puts it.  Kennedy’s seven-year assault on Hoffa was “more responsible than has been appreciated for the steady decline in union power ever since” (p.9).  The identification of the American labor movement with corruption, violence, and bossism “crystallized with Bobby Kennedy’s singular crusade” and has “never receded, even though the idea was exaggerated at the time and is largely inaccurate today” (p.108).

Hoffa was prosecuted for jury tampering and went to jail in March 1967, after Kennedy had left the Justice Department.  With Hoffa imprisoned, Frank Fitzsimmons stepped into the Teamsters leadership position and the wall that Hoffa had tried to maintain between organized crime and the union and its pension funds collapsed entirely.  The “sharpest irony” of Kennedy’s crusade against Hoffa, Goldsmith concludes, was that it “opened the door for the mob to infiltrate and leech off the union like never before” (p.119-120).

President Richard Nixon, although viscerally anti-union, cynically sought Teamster votes for his 1972 re-election campaign by issuing a conditional pardon to Hoffa in 1971.  Nixon’s pardon released Hoffa from prison but required him to refrain from engaging in union activities, a condition Hoffa agreed to but in Chuckie’s view never intended to honor.  Hoffa’s efforts after his release to regain control of the union he once led included much erratic behavior, along with threats to expose the comfortable relationship between the union and criminal syndicates, threats that likely led to his disappearance in July 1975.  Did Chuckie have a role in that disappearance?

* * *

With the FBI under enormous pressure to make progress in the Hoffa case in the second half of 1975, its early belief in Chuckie’s role in driving Hoffa to his death in July 1975 became, as Goldsmith puts it, “one of the few unquestioned certainties in the case” (p.234).  But, he argues, the FBI “focused on facts that fit its theory and ignored or discounted the many countervailing facts and circumstances that did not fit its theory but should have made it much less confident that Chuckie was involved” (p.234).  Goldsmith’s elaboration of these “countervailing facts and circumstances” reads like a transcript from a defense attorney’s closing argument to a jury.

Goldsmith stresses the unlikelihood  of using the car of Anthony Giacalone’s son: of all the cars available in the Motor City, “why use the car of the son of a leading Detroit mobster and the man who supposedly arranged for the hit or likely knew it was coming” (p.234).  He argues the logistical implausibility of Chuckie driving Hoffa to his death in the time gap when he was unseen on the afternoon of the disappearance.  He emphasizes that in the aftermath of the disappearance, Chuckie alone among the suspects spoke willingly and openly to the FBI, without any apparent repercussions from the perpetrators.  And he highlights numerous examples of post-disappearance conduct inconsistent with guilt.  As to the notion that Chuckie was the only person with whom Hoffa would have gotten into a car voluntarily, Goldsmith notes that the evidence does not show that Hoffa voluntarily got into the car.  The problems with the FBI’s theory of the case, Goldsmith acknowledges, “do not by themselves exonerate Chuckie.  But they stand as mysterious and unexplained counterpoints to the circumstantial evidence against him” (p.237).

By the time Goldsmith became involved in Chuckie’s case, sometime around 2012, these problems had convinced a handful of FBI agents in its Detroit office of Chuckie’s innocence.  At one point, after Chuckie had successfully passed an FBI-administered polygraph test, Goldsmith thought he had brokered an official FBI letter exonerating his stepfather.  Unfortunately for the hapless Chuckie, the idea of a letter was nixed by political appointees at the FBI and the Department of Justice who, in Goldsmith’s view, “didn’t want to take the political heat from admitting the government’s errors during the last four decades” (p.285).  Goldsmith’s book is as close as Chuckie came in his lifetime to an official exoneration.

* * *

Chuckie was not initially enamored of the idea of Goldsmith writing a book about him, but thought his hotshot Harvard Law School stepson represented his best opportunity for the exoneration he so desperately sought.  The stepson pledged that he would do his best, but only if his stepfather told him the truth.  Each was aware of the other’s mixed motives.  Chuckie was “committed to Omertà, I was committed to its opposite,” Goldsmith writes, “but we were both committed to each other.”  Out of affection, both tried hard to “help or accommodate the other . . . I was always on guard for mendacity or deflection.  He was always on guard for forbidden topics, and was brilliant, when he wanted to be, at resisting my probes” (p.301).

From the beginning, Goldsmith hoped that his efforts to clear Chuckie would also solve the Hoffa puzzle.  Goldsmith suspected that Chuckie knew a whole lot more about Hoffa’s disappearance than he was telling him, maybe the full truth.  As Goldsmith tells the story, he came at least closer to the truth.  But Chuckie adhered to the code of Omertà throughout their discussions.  At one point, at a time when Chuckie was in poor health and may well have been aware he didn’t have long to live, Goldsmith told his stepfather that he couldn’t believe he would take to the grave his knowledge of what had been one of the most spectacular unsolved crimes of the 20th century.  Chuckie’s response: “Believe it, Jack” (p.305).

 

Thomas H. Peebles

Paris, France

January 10, 2021

 



German Lessons: Is Mississippi Learning?

 

Susan Neiman, Learning from the Germans:

Race and the Memory of Evil (Farrar, Straus & Giroux) 

Less than two months ago, protests and public demonstrations erupted on an unprecedented scale across the United States and throughout the world over the killing of African American George Floyd at the hands of a Minneapolis, Minnesota, police officer, captured on videotape.  Fueled by the movement known as “Black Lives Matter,” the protests and demonstrations that continue to this day focus most directly on police violence and reform of criminal justice practices.  But at a deeper level the protests also seek to call attention to the endurance of systemic racism in the United States, the subject that hovers over Susan Neiman’s thought-provoking Learning from the Germans: Race and the Memory of Evil,  giving her work a timeliness she probably never imagined when it first appeared last year.

To address systemic racism, Neiman argues, the United States needs to confront more directly and honestly the realities of its racist past: human bondage dating from the early 17th century which plunged the United States into a Civil War in the mid-19th century, followed by an additional century of legally enforced segregation, rampant discrimination, racial terrorism and second class citizenship, with official sanction of racial discrimination not ending until passage of the Civil Rights Act in 1964 and the Voting Rights Act the following year.  Neiman’s title, moreover, gives away her surprising suggestion that Americans can learn much from how Germany finally confronted its own racist past, specifically the Holocaust, Nazi Germany’s project to exterminate Europe’s Jewish population that it perpetrated over a 12-year period, from 1933 to 1945.

When Neiman looked at the contentious issue of monuments honoring Southern Civil War veterans from the perspective of Germany, where she has lived on and off since 1982, she found it hard to “imagine a Germany filled with monuments to the men who fought for the Nazis.  My imagination failed.  For anyone who has lived in contemporary Germany, the vision of statues honoring those men is inconceivable.”  Germans who lost family members during World War II realize that their loved ones “cannot be publicly honored without honoring the cause for which they died” (p.267).  In the United States, by contrast, the president and a substantial if declining portion of the public still support maintaining statues and memorials honoring the cause of the Southern Confederacy, a reflection of the broader differences between the two countries in coming to terms with their racist pasts that Neiman seeks to highlight.

Learning from the Germans is not an attempt to compare the evils of slavery and discrimination against African Americans in the United States to those of the murder of Jews and others during the Holocaust, an exercise Neiman considers fruitless.  Rather, her work revolves around what might be characterized as “comparative atonement,” for which she uses her preferred if foreboding German word, Vergangenheitsaufarbeitung, translated into English as “working off the past.”  The word came into use in German in the 1960s as an “abstract polysyllable way of saying We have to do something about the Nazis” (p.30).  In atoning for its racist past, Germany is markedly further down the path to Vergangenheitsaufarbeitung than the United States, Neiman argues, but she also emphasizes how East Germany, when it existed and despite its many faults, was further along this path than West Germany.  Only after German reunification in 1990 did efforts of the former West Germany to atone for its racist crimes begin to gather serious momentum.

The first part of Neiman’s three-part work, “German Lessons,” outlines Germany’s attempt to come to terms with its crimes of the Nazi period, both before and after unification.  The second part, “Southern Discomfort,” looks at the legacy of racism in the American Deep South, heavily concentrated on the state of Mississippi and on the persistence of the notion of the Lost Cause, a romanticized version of the American Civil War that insists that the war was fought not over slavery but over “states’ rights” — an “abstract phrase that veils the question of what, exactly, Southern states thought they had a right to do” (p.186).  In her third part, “Setting Things Straight,” Neiman considers in broad terms how the American South and the United States as a whole can make strides in coming to terms with a racist past, with the German experience serving as a partial guide.  But this part is more an invitation to debate than a set of definitive answers.

Neiman, a Jewish American with no direct family connection to the Holocaust, was raised in the American South, in Atlanta, Georgia.  A philosopher by training who studied at Harvard under John Rawls and taught at Yale, she is today the Director of the Einstein Forum, a German think tank located in Potsdam, just outside Berlin.  After nearly a quarter century living and working in Germany, Neiman spent a year at the Winter Institute for Racial Reconciliation in Oxford, Mississippi, a forward-looking institution dedicated explicitly to encouraging people to “honestly engage in their history in order to live more truthfully in the present, where the inequities of the past no longer dictate the possibilities of the future” (p.143).  Utilizing these diverse professional and personal experiences, she mixes analysis and anecdote while introducing her readers to an impressive array of Germans and Americans working on what might be described as the front lines of Vergangenheitsaufarbeitung in their respective countries.

Although her analysis of the United States concentrates on the state of Mississippi, Neiman recognizes that Mississippi is hardly representative of the United States as a whole, and not even of the states of the former Confederacy.  But she contends that awareness of history is arguably more acute in Mississippi than anywhere else in the United States.  “Focusing on the Deep South,” moreover, is “not a matter of ignoring the rest of the country, but of holding a magnifying glass to it” (p.17-18), she writes.  Although just about everyone in the United States now accepts that slavery was wrong, the “national sense of shame” which she finds in today’s Germany is “entirely absent” in the United States; shame is “not the American way” (p.268).

During the nearly three years that Neiman worked on her book, many of the Germans  she met with laughed at her proposed title and rejected the idea that Germany had anything to teach Americans about dealing with their racist past.  Most Germans today are defensive about their country’s efforts to work toward Vergangenheitsaufarbeitung, she observes.  They think they  took way too long to transition from looking at themselves as victims, with some adding  that many of their fellow citizens never made the transition.  “Good taste,” she writes, “prevents good Germans from anything that could possibly be construed as boasting about repentance” (p.56).   Neiman sees this widespread defensiveness as “itself a sign of how far Germany has come in taking responsibility for its criminal history” (p.17).  But how Germany arrived at this position is not easy to pinpoint.

* * *

Competitive victimhood, Neiman writes, “may be as close to a universal law of human nature as we’re ever going to get.”   Postwar Germany was no less inclined than the defeated American South to participate in this “old and universal sport”(p.63).   Although 80 years separate the defeat of the American South from that of Nazi Germany, Neiman perceives similar litanies: “the loss of their bravest sons, the destruction of their homes, the poverty and hunger that followed – combined with resentment at occupying forces they regarded as generally loutish, who had the gall to insist their suffering was deserved” (p.63).  For decades after World War II, Germans were “obsessed with the suffering they’d endured, not the suffering they’d caused” (p.40).

In the immediate aftermath of World War II, the United States, Britain and France, the occupying powers in what became West Germany, aimed to institute a process of de-Nazification.  Among its many aims, de-Nazification was supposed to purge former Nazis and Nazi sympathizers from positions of influence.  More broadly, as Frederick Taylor argued in Exorcising Hitler: The Occupation and Denazification of Germany (reviewed here in December 2012), de-Nazification was “perhaps the most ambitious scheme to change a nation’s psyche ever mounted in human history.” But de-Nazification was a failed scheme.  West Germans mocked the Allied attempt to impose a change of consciousness.

The Allies, moreover, lacked the resources to make de-Nazification successful, and Cold War rivalries and realities intruded.  The Allies were “far more interested in securing [German] allies against the Soviet Union than in digging up their sordid pasts” (p.99).  The de-Nazification program was turned over to the West German government, which had “no inclination to pursue it” (p.99).  Well into the 1960s, West German commitments to democratic governance were “precarious, and the possibility of a return to a sanitized Nazism could not be ruled out” (p.55).  The implicit message of Konrad Adenauer, West Germany’s first post-war chancellor, seemed to be: behave yourself, don’t call attention to your past, and we won’t look too deeply into that past.

East Germany worked off its Nazi past differently.  Although its official name was the German Democratic Republic (GDR), there was little that was democratic about East Germany.  Its borders were closed, its media heavily censored, and its elections a national joke.  Yet, East German leaders had been by and large genuinely anti-fascist, anti-Nazi during the war; the same cannot be said of West German leaders.  Proportionately, East Germany put far more old Nazis on trial, and convicted more of them, than the West did.  The West never invited Jewish émigrés to return; the East did.  Overall, Neiman concludes, East Germany quite simply “did a better job of working off the Nazi past than West Germany” (p.81).

It was not until around 1968 that West Germany began to get serious about Vergangenheitsaufarbeitung, embarking on a path out of denial in conjunction with the student protests that roiled Europe and the United States that year.  Because their parents could not “mourn, acknowledge responsibility, or even speak about the war” (p.70), the 68ers, as the generation born in the 1940s was called, felt compelled to confront their parents over their war experiences and their subsequent silence about those experiences.  A decade later, the American TV series “Holocaust” served as a catalyst for “public discussion of the Holocaust that had been missing for decades” (p.370-71).  Then, on May 8, 1985, 40 years after Germany’s surrender, West German president Richard von Weizsäcker made headlines when he termed that day one of liberation.  Up to that point, May 8 in West Germany had been called the Day of Defeat or Day of Unconditional Surrender.  (Even then, Weizsäcker symbolized the ambivalence of West German Vergangenheitsaufarbeitung: his father had been a high-level Nazi, an assistant to Foreign Minister Joachim von Ribbentrop; Weizsäcker defended his father at the post-war Nuremberg trials and always maintained that his father was trying only to make a bad situation better.)

By the time of reunification in 1990, expressions of pro-Nazi sentiment had become “socially unacceptable” and have since become “morally unacceptable” (p.311).  A 1995 exhibit on the Wehrmacht, the Nazi army with 18 million members, demonstrated convincingly that it had systematically committed war crimes, thereby breaking West Germany’s “final taboo” (p.24).  The exhibit traveled to 33 cities in Germany and Austria and “ignited media discussions, filled talk shows, and eventually provoked a debate in parliament” (p.24).

Today, the right-wing Alternative for Germany (AfD) continues to rise in influence on an anti-immigrant platform many consider neo-Nazi.  Germany gained further unwanted attention earlier this month when Nazi sympathizers were revealed to have infiltrated an elite German security unit:

https://www.nytimes.com/2020/07/03/world/europe/germany-military-neo-nazis-ksk.html

But today’s Germany has nonetheless reached the point where “open expressions of racism are politically ruinous,” Neiman concludes, which may be the “best outcome we can hope for and it may also be enough. . . Very often, social change begins with lip service” (p.310-11).

* * *

As in Germany, Neiman observes, “the War” throughout the American South is a singular reference.  “Everybody knows that one was decisive, and its repercussions are with us today.”  This knowledge is “more conscious in a Deep South that was occupied, and almost as devastated as Germany, than in the rest of the United States” (p.37).  But the Lost Cause narrative that arose in the American South was an exercise in Civil War historical revisionism that flourished toward the end of the 19th century and into the early 20th, in which the war was rebranded as a “noble fight for Southern freedom,” with the post-war Reconstruction period becoming a “violent effort by ignorant ex-slaves and mercenary Yankees to debase the honor of the South in general, and its white women in particular” (p.181).

Reconciliation under the Lost Cause mythology was “between white members of the opposing armies” to be achieved by “valorizing the defeated, and ignoring the cause for which they fought” (p.182). Reconciliation between white and black folk was not on the agenda.  Slowly and hazily, the Lost Cause narrative “came to capture the hearts of the North. Weary of war, eager for reconciliation, and keen to get on with the business of industrialization that was changing the American economy, Northerners conceded most of the mythmaking to the South. Not many had been enthusiastic abolitionists anyway” (p.186-87).

The Winter Institute, where Neiman conducted much of the research for this book, has sought to counter the Lost Cause narrative through such institutional reforms as creating and implementing school criteria on human rights, fostering inter-racial dialogue in communities known for racial violence, and promoting academic investigation and scholarship on patterns and legacies of racial inequities.   What keeps the Winter Institute going is the notion that “if you can change Mississippi communities, you can probably change anything” (p.142).  The primary lesson Neiman derived from her time at the Winter Institute: “national reconciliation begins at the bottom. Very personal encounters between members of different races, people who represent the victims as well as those who represent the perpetrators, are the foundation of any larger attempt to treat national wounds . . . It is a long and weary process, but it is hard to see an alternative” (p.301).

Neiman discusses at length two notorious murderous acts in mid-20th century Mississippi: the 1955 murder of Emmett Till, a 14-year-old Chicago boy brutally killed during a summer visit to Mississippi; and the murders of Andrew Goodman, Mickey Schwerner, and James Chaney, three civil rights workers, two young white men from New York and a black man from Mississippi, killed near Philadelphia, Mississippi in 1964 while organizing African-Americans to exercise their right to vote.  The two men tried for the Till murder were promptly acquitted.  Protected by the Double Jeopardy Clause of the United States Constitution, they thereafter took money from Look magazine to confess that they had killed the teenager.  No trial at all ensued in the immediate aftermath of the killing of Goodman, Schwerner, and Chaney.

While the world knew the story of the ghastly Till murder, for decades nobody in the local Mississippi Delta community, black or white, wanted to talk about it.  Neiman sees a similarity to the silence that prevailed in Germany in the first decades after the war, where to both non-Jewish and Jewish families, “anything connected to the war was off-limits.  Neither side could bear to talk about it, one side afraid of facing its own guilt, the other afraid of succumbing to pain and rage” (p.217).

In 1989, the Mississippi Secretary of State issued a public apology to the families of the three slain civil rights workers, becoming the first local white official to publicly acknowledge the crime.  Most Mississippians think that is the reason he lost when he later ran for governor.  With a strong push from the Winter Institute, a trial in the case finally took place in 2005.  The prime suspect, Edgar Ray Killen, then 80 years old, was convicted, but only of manslaughter.  Killen received three 20-year sentences and died in prison in 2018.  Neiman wonders whether the trial has helped a “healing process” or allowed Mississippi to “rest in the self-satisfaction that the horrors that stigmatized the state all belonged to the past” (p.301).

* * *

In her final section, Neiman runs through the most common arguments against reparations to descendants of victims of slavery, and proffers counter-arguments.  She glosses over what in my mind is the most difficult: how to determine who gets what amount.  She notes that West Germany paid Israel what amounted to reparations early in the history of the two states, the “price for acceptance into the Western Community and the price was relatively cheap . . . Reparations were paid in exchange for world recognition and the opportunity to keep silent about the quantity of Nazis, and Nazi thinking, that permeated the Federal Republic” (p.289).  In the United States, she argues, reparations need not take the form of precise compensation to individual African Americans but should be the subject of public debate.

On the current polemic surrounding statues and memorials honoring Confederate war veterans, Neiman reminds her readers that most were erected in the early part of the 20th century with the express purpose of reinforcing and providing legitimacy to the regime of rigid segregation and discrimination.  They should not be seen as “innocuous shrines to history; they were provocative assertions of white supremacy at moments when its defenders felt under threat.  Knowing when they were built is part of knowing why they were built. . . What is at stake is not the past, but the present and the future.  When we choose to memorialize a historical moment, we are choosing the values we want to defend, and pass on” (p.263).

* * *

“Forgetting past evils may be initially safer,” Neiman writes, but in the long run, the “dangers of forgetting are greater than the dangers of remembering — provided, of course, that we use the failures of past attempts to learn how to do it better” (p.373).  Although there is no single pathway to Vergangenheitsaufarbeitung, understanding the distance Germany has traveled in coming to terms with the Nazi era’s racist crimes should benefit Americans yearning to find a better pathway in the turbulent aftermath of the George Floyd killing.

Thomas H. Peebles

La Châtaigneraie, France

July 29, 2020

 


Filed under American Politics, American Society, German History, History, Politics, United States History

Misjudgments and Misdeeds of an Unseen Power Broker

Jefferson Morley, The Ghost:

The Secret Life of CIA Spymaster James Jesus Angleton

(St. Martin’s)

James Jesus Angleton served as the Central Intelligence Agency’s head of counterintelligence — its top spy and effectively the number three person in the agency — from 1954 until he was forced into retirement in 1975.  Although his name is less familiar than that of the FBI’s original director, J. Edgar Hoover, I couldn’t help thinking of Hoover as I read Jefferson Morley’s trenchant biography, The Ghost: The Secret Life of CIA Spymaster James Jesus Angleton.  Both were immensely powerful, paranoid men who repeatedly broke or skirted the law to advance their often-idiosyncratic versions of what United States national security required.  Throughout their careers, both were able to avoid almost all attempts to hold them accountable for their misdeeds.  With the passage of four decades since Hoover’s death in 1972 and Angleton’s departure from the CIA three years later, we can see that the two men embodied what has recently come to be known as the “Deep State,” a nearly independent branch of government in which officials secretly manipulate government policy, as Morley puts it, “largely beyond the view of the Madisonian government and the voting public” (p.xi).

Morley demonstrates that the notorious COINTELPRO operation, associated today with Hoover and arguably his most dubious legacy, actually began as a joint FBI-CIA undertaking that Angleton concocted.  COINTELPRO aimed to infiltrate and disrupt dissidents and included among its targets Dr. Martin Luther King, left leaning organizations, and Vietnam anti-war protestors.  The original idea that Angleton sold to a skeptical Hoover, who considered the CIA a “nest of liberals, atheists, homosexuals, professors, and otherwise feminized men who specialized in wasting the taxpayer dollar” (p.71), was that the Bureau would target subjects within the United States while the Agency would take the lead in targeting subjects outside the United States.

From there, the CIA and FBI collaborated on LINGUAL, an elaborate and extensive program to read American citizens’ mail, which Morley terms perhaps Angleton’s “most flagrant violation of the law” (p.82); and on CHAOS, an operation designed to infiltrate the entire anti-Vietnam war movement, not just people or organizations that engaged in violence or contacted foreign governments.  Post-Watergate hearings brought the existence and extent of COINTELPRO, LINGUAL and CHAOS to light, along with numerous other chilling exercises of authority attributed to the FBI and CIA, leading to Angleton’s involuntary retirement from the agency.

Morley, a freelance journalist and former Washington Post editor, does not make the Hoover comparison explicitly.  He sees in Angleton a streak of Iago, Othello’s untrustworthy advisor: outwardly a “sympathetic counselor with his own agenda, which sometimes verged on the sinister” (p.158).  Angleton served four American presidents with “seeming loyalty and sometimes devious intent” (p.159), he writes (of course, the same could be said of Hoover, who served eight presidents over the course of a career that began in the 1920s).

Writing in icy prose that pieces together short, punchy vignettes with one-word titles, Morley undertakes to show how Angleton was able to elevate himself from a “staff functionary” at the CIA, a new agency created in 1947, to an “untouchable mandarin” who had an “all but transcendent influence on U.S. intelligence operations for two decades” (p.67).  At the height of the Cold War, Morley writes, Angleton became an “unseen broker of American power” (p.158).

But Morley’s biography might better be viewed as a compendium of the misjudgments and misdeeds that punctuated Angleton’s career from beginning to end.  Angleton’s judgment failed him repeatedly, most notoriously when his close friend and associate, British intelligence agent Kim Philby, was revealed to have been a Soviet spy from World War II onward (I reviewed Ben McIntyre’s biography of Philby here in 2016).  The Philby revelation convinced Angleton that the KGB had also planted an agent within the CIA, precipitating a disastrous “mole hunt” that paralyzed the CIA for years and damaged the careers of many innocent fellow employees, yet discovered no one.

The book’s most explosive confluence of questionable judgment and conduct involves Angleton’s relationship to Lee Harvey Oswald, President John F. Kennedy’s presumed assassin.  Angleton followed Oswald closely from 1959, when he defected to the Soviet Union, to that fateful day in Dallas in 1963.  Thereafter, Angleton tenaciously withheld his knowledge of Oswald from the Warren Commission, charged with investigating the circumstances of the Kennedy assassination, to the point where Morley suggests that Angleton should have been indicted for obstruction of justice.  The full extent of Angleton’s knowledge of Oswald has yet to come out, leaving his work laden with fodder for those of a conspiratorial bent who insist that Oswald was something other than the lone gunman the Warren Commission found him to be (in 2015, I reviewed Peter Savodnik’s biography of Oswald here, in which Savodnik argues forcefully for the lone gunman view of Oswald).

* * *

Born in 1917 in Boise, Idaho, Angleton was the son of a prosperous merchant father and a Mexican-American mother (hence the middle name “Jesus”).  At age 16, the young Angleton moved with his family to Milan, where his father ran the Italian-American Chamber of Commerce and was friendly with many leaders in the fascist regime of Benito Mussolini.  For the remainder of his life, James retained a fondness for Italy, Italian culture and, it could be argued, the Italian brand of fascism.

Angleton attended boarding school in England, then went on to Yale as an undergraduate.  At Yale, he demonstrated a keen interest in poetry and came under the influence of the poet Ezra Pound, who later became notorious for his Nazi sympathies (after an investigation led by J. Edgar Hoover, Pound was jailed during World War II).  Poetry constituted a powerful method for Angleton, Morley writes.  He would come to value “coded language, textual analysis, ambiguity, and close control as the means to illuminate the amoral arts of spying that became his job.  Literary criticism led him to the profession of secret intelligence.  Poetry gave birth to a spy” (p.8).

During World War II, Angleton found his way to the Office of Strategic Services, the CIA’s predecessor agency.  He spent the later portion of the war years in Rome, where he developed a friendship with Junio Valerio Borghese, “perhaps the most famous fascist military commander in Italy” (p.21).  Angleton helped Borghese avoid execution at the hands of the same partisan forces that captured and executed Mussolini in 1945.  Thanks to Angleton’s efforts, Borghese “survived to become titular and spiritual leader of postwar Italian fascism” (p.27), and one of the United States’ key partners in preventing a Communist takeover of postwar Italy.

Angleton prepared for his assignment in Rome at Bletchley Park in England, the center of Allied code-breaking operations during World War II.  There, Angleton learned the craft of counter-intelligence under the tutelage of Kim Philby, who taught the young American “how to run double agent operations, to intercept wireless and mail messages, and to feed false information to the enemy.  Angleton would prove to be his most trusting friend” (p.18).  After the war, Philby and Angleton both found themselves in Washington, where they became inseparable buddies, the “closest of friends, soul mates in espionage” (p.41).  Each saw in the other the qualities needed to succeed in espionage: ruthlessness, calculation, autonomy, and cleverness.

The news of Philby’s 1963 defection to Moscow was “almost incomprehensible” (p.123) to Angleton.  What he had considered a deep and warm relationship had been a sham.  Philby was “his friend, his mentor, his confidant, his boozy buddy,” Morley writes.  And “through every meeting, conference, debriefing, confidential aside, and cocktail party, his friend had played him for a fool” (p.124).  Philby’s defection does not appear to have damaged Angleton’s position within the CIA, but it set him off on a disastrous hunt for a KGB “mole” that would paralyze and divide the agency for years.

Angleton’s mole hunt hardened into a “fixed idea, which fueled an ideological crusade that more than a few of his colleagues denounced as a witch hunt” (p.86).  Angleton’s operation was multi-faceted, “consisting of dozens of different mole hunts – some targeting individuals, others focused on components within the CIA” (p.135).  Angleton’s suspicions “effectively stunted or ended the career of colleagues who were guilty of nothing” (p.198).  To this day, even after the opening of significant portions of KGB archives following the fall of the Soviet Union, there is no indication that the KGB ever had a mole burrowed into the CIA.  Angleton’s mole hunt, Morley concludes, “soaked in alcohol” and permeated by “convoluted certitudes,” brought Angleton to the “brink of being a fool” (p.126).

Just as Angleton never gave up his (witch) hunt for the KGB spy within the CIA, he became convinced that Harold Wilson, the British Labour politician and sometime Prime Minister, was a Soviet spy, and never relinquished this odd view either.  And he argued almost until the day he departed from the CIA that the diplomatic sparring and occasional direct confrontation between the Soviet Union and China was an elaborate exercise in disinformation to deceive the West.

While head of counterintelligence at the CIA, Angleton served simultaneously as the agency’s desk officer for Israel, the direct link between Israeli and American intelligence services.  Angleton was initially wary of the Israeli state that came into existence in 1948, in part the residue of the anti-Semitism he had entertained in his youth, in part the product of his view that too many Jews were communists.  By the mid-1950s, however, Angleton had overcome his initial reservations to become an admirer of Israel and especially Mossad, its primary intelligence service.

But Angleton’s judgment in his relationship with Israel frequently failed him just as it failed him in his relationship with Philby.  He did not foresee Israel’s role in the 1956 Anglo-French invasion of Suez (the subject of Ike’s Gamble, reviewed here in 2017), infuriating President Eisenhower.  After winning President Johnson’s favor for calling the Israeli first strike that ignited the June 1967 Six Day War (“accurate almost down to the day and time,” p.181), he incurred the wrath of President Nixon for missing Egypt’s strike at Israel in the October 1973 Yom Kippur War.  Nixon and his Secretary of State, Henry Kissinger, were of the view that Angleton had grown too close to Israel.

Angleton, moreover, was almost certainly involved behind the scenes in a 1968 Israeli heist of enriched uranium, diverted from a Pennsylvania nuclear fuel plant known as NUMEC, to supply Israel’s nuclear weapons program.  A CIA analyst later concluded that NUMEC had been a “front company deployed in an Israeli-American criminal conspiracy to evade U.S. nonproliferation laws and supply the Israeli nuclear arsenal” (p.261-62).  Angleton’s loyalty to Israel “betrayed U.S. policy on an epic scale” (p.261), Morley writes.

* * *

Morley’s treatment of Angleton’s relationship to Lee Harvey Oswald and Fidel Castro’s Cuba raises more questions than it answers.  The CIA learned of Oswald’s attempt to defect to the Soviet Union in November 1959, and began monitoring him at that point.  In the same timeframe, the CIA and FBI began jointly monitoring a pro-Castro group, the Fair Play for Cuba Committee, which would later attract Oswald.  Although Angleton was a contemporary and occasional friend of John Kennedy (the two were born the same year), when Kennedy assumed the presidency in 1961, Angleton’s view was that American policy toward Fidel Castro needed to be more aggressive.  He viewed Cuba as still another Soviet satellite state, but one just 90 miles from United States shores.

The Kennedy administration’s Cuba policy got off to a miserable start with the infamous failure of the April 1961 Bay of Pigs operation to dislodge Castro.  Kennedy was furious with the way the CIA and the military had presented the options to him and fired CIA Director Allen Dulles in the operation’s aftermath (Dulles’ demise is one of the subjects of Stephen Kinzer’s The Brothers, reviewed here in 2014).  But elements within the CIA and the military held Kennedy responsible for the failure because he refused to order air support for the operation (Kennedy had been assured prior to the invasion that no additional military assistance would be necessary).

CIA and military distrust for Kennedy heightened after the Cuban Missile Crisis of October 1962, when the United States and the Soviet Union faced off in what threatened to be a nuclear confrontation over the placement of offensive Soviet missiles on the renegade island.  Although Kennedy’s handling of that crisis was widely acclaimed as his finest moment as president, many within the military and the CIA, Angleton included, thought that Kennedy’s pledge to Soviet Premier Khrushchev of no invasion of Cuba in exchange for Soviet withdrawal of missiles had given Castro and his Soviet allies too much.  Taking the invasion option off the table amounted in Angleton’s view to a cave in to Soviet aggression and a betrayal of the anti-Castro Cuban community in the United States.

In the 13 months that remained of the Kennedy presidency, the administration continued to obsess over Cuba, with a variety of operations under consideration to dislodge Castro.  The CIA was also monitoring Soviet defector Oswald, who by this time had returned to the United States.  Angleton placed Oswald’s name on the LINGUAL list to track his mail.  By the fall of 1963, Oswald had become active in the Fair Play for Cuba Committee, passing out FPCC leaflets in New Orleans.  He was briefly arrested for disturbing the peace after an altercation with anti-Castro activists.  In October of that year, a mere one month before the Kennedy assassination, the FBI and CIA received notice that Oswald had been in touch with the Soviet and Cuban embassies and consular sections in Mexico City.  Angleton followed Oswald’s Mexico City visits intensely, yet withheld for the rest of his life precisely what he knew about them.

From the moment Kennedy was assassinated, Angleton “always sought to give the impression that he knew very little about Oswald before November 22, 1963” (p.140).  But Angleton and his staff, Morley observes, had “monitored Oswald’s movements for four years. As the former marine moved from Moscow to Minsk to Fort Worth to New Orleans to Mexico City to Dallas,” the special group Angleton created to track defectors “received reports on him everywhere he went” (p.140-41).  Angleton clearly knew that Oswald was in Dallas in November 1963.   He hid his knowledge of Oswald from the Warren Commission, established by President Lyndon Johnson to investigate the Kennedy assassination. What was Angleton’s motivation for obfuscation?

The most plausible – and most innocent – explanation is that Angleton was protecting his own rear end in an “epic counterintelligence failure” that had “culminated on Angleton’s watch. It was bigger than the Philby affair and bloodier” (p.140).  Given this disastrous counterintelligence failure, Morley argues, Angleton “could have – and should have – lost his job after November 22 [1963].  Had the public, the Congress, and the Warren Commission known of his pre-assassination interest in Oswald or his post-assassination cover-up, he surely would have” (p.157).

But the range of possibilities Morley considers extends to speculation that Angleton may have been hiding his own involvement in a Deep State operation to assassinate the president.   Was Angleton running Oswald as an agent in an assassination plot, Morley asks:

He certainly had the knowledge and ability to do so.  Angleton and his staff had a granular knowledge of Oswald long before Kennedy was killed.  Angleton had a penchant for running operations outside of reporting channels. He articulated a vigilant anti-communism that depicted the results of JFK’s liberal policies in apocalyptic terms. He participated in discussions of political assassination. And he worked in a penumbra of cunning that excluded few possibilities (p.265).

Whether Angleton manipulated Oswald as part of an assassination plot is a question Morley is not prepared to answer.  But in Morley’s view, Angleton plainly “obstructed justice to hide interest in Oswald.  He lied to veil his use of the ex-defector in later 1963 for intelligence purposes related to the Cuban consulate in Mexico City. . . Whoever killed JFK, Angleton protected them.  He masterminded the JFK conspiracy and cover up” (p.265).  To this day, no consensus exists as to why Angleton dodged all questions concerning his undisputed control over the CIA’s file on Oswald for four years, up to Oswald’s death in November 1963.  Angleton’s relationship to Oswald remains “shrouded in deception and perjury, theories and disinformation, lies and legends” (p.87), Morley concludes.  Even though a fuller story began to emerge when Congress ordered the declassification of long-secret JFK assassination records in the 1990s, the full story has “yet to be disclosed” (p.87).

* * *

The burglary at the Democratic National Committee headquarters in the Watergate complex in June 1972 proved to be Angleton’s professional undoing, just as it was for President Richard Nixon.  The burglary involved three ex-CIA employees, all likely well known to Angleton.  In 1973, in the middle of multiple Watergate investigations, Nixon appointed William Colby as agency director, a man determined to get to the bottom of what was flowing into the public record about the CIA and its possible involvement in Watergate-related activity.

Colby concluded that Angleton’s never-ending mole hunts were “seriously damaging the recruiting of Soviet officers and hurting CIA’s intelligence intake” (p.225).  Colby suspended LINGUAL, finding the mail opening operation “legally questionable and operationally trivial,” having produced little “beyond vague generalities” (p.225).  At the same time, New York Times investigative reporter Seymour Hersh published a story that described in great detail Operation CHAOS, the agency’s program aimed at anti-Vietnam activists, attributing ultimate responsibility to Angleton.  Immediately after Christmas 1974, Colby moved to replace Angleton.

For the first and only time in his career, Angleton’s covert empire within the CIA stood exposed and he left the agency in 1975.  When Jimmy Carter became president in 1977, his Department of Justice elected not to prosecute Angleton, although Morley argues that it had ample basis to do so.  In retirement, Angleton expounded his views to “any and all who cared to listen” (p.256).  He took to running reporters “like he had once run agents in the field, and for the same purpose: to advance his geopolitical vision” (p.266).

* * *

Angleton, a life-long smoker (as well as heavy drinker), was diagnosed with lung cancer in 1986 and died in May 1987.  He was, Morley concludes, “fortunate that so much of his legacy was unknown or classified at the time of his death.”  Angleton not only “often acted outside the law and the Constitution,” but also, for the most part, “got away with it” (p.271).

Thomas H. Peebles

La Châtaigneraie, France

June 10, 2020

 


Filed under American Politics, Biography, United States History

Reading Darwin in Abolitionist New England

 

Randall Fuller, The Book That Changed America:

How Darwin’s Theory of Evolution Ignited a Nation (Viking)

In mid-December 1859, the first copy of Charles Darwin’s On the Origin of Species arrived in the United States from England at a wharf in Boston harbor.  Darwin’s book explained how plants and animals had developed and evolved over multiple millennia through a process Darwin termed “natural selection,” a process which distinguished On the Origin of Species from the work of other naturalists of Darwin’s generation.  Although Darwin said little in the book about how humans fit into the natural selection process, the work promised to ignite a battle between science and religion.

In The Book That Changed America: How Darwin’s Theory of Evolution Ignited a Nation, Randall Fuller, professor of American literature at the University of Kansas, contends that what made Darwin’s insight so radical was its “reliance upon a natural mechanism to explain the development of species.  An intelligent Creator was not required for natural selection to operate.  Darwin’s vision was of a dynamic, self-generating process of material change.  That process was entirely arbitrary, governed by physical law and chance – and not leading ineluctably . . . toward progress and perfection” (p.24).  Darwin’s work challenged the notion that human beings were a “separate and extraordinary species, differing from every other animal on the planet.  Taken to its logical conclusion, it demolished the idea that people had been created in God’s image” (p.24).

On the Origin of Species arrived in the United States at a particularly fraught moment.  In October 1859, abolitionist John Brown had conducted a raid on a federal arsenal in Harper’s Ferry (then part of Virginia, today West Virginia), with the intention of precipitating a rebellion that would eradicate slavery from American soil.  The raid failed spectacularly: Brown was captured, tried for treason and hanged on December 2, 1859.  The raid and its aftermath exacerbated tensions between North and South, further polarizing the already bitterly divided country over the issue of chattel slavery in its southern states.  Notwithstanding the little Darwin had written about how humans fit into the natural selection process, abolitionists seized on hints in the book that all humans were biologically related to buttress their arguments against slavery.  To the abolitionists, Darwin “seemed to refute once and for all the idea that African American slaves were a separate, inferior species” (p.x).

Asa Gray, a respected botanist at Harvard University and a friend of Darwin, received the first copy of On the Origin of Species in the United States.  He passed the copy, which he annotated heavily, to his cousin by marriage, Charles Loring Brace (who was also a distant cousin of Harriet Beecher Stowe, author of the anti-slavery runaway best-seller Uncle Tom’s Cabin).  Brace in turn introduced the book to three men: Franklin Benjamin Sanborn, a part-time school master and full-time abolitionist activist; Amos Bronson Alcott, an educator and loquacious philosopher, today best remembered as the father of author Louisa May Alcott; and Henry David Thoreau, one of America’s best known philosophers and truth-seekers.  Sanborn, Alcott and Thoreau were residents of Concord, Massachusetts, roughly twenty miles north of Boston, the site of a famous Revolutionary War battle but in the mid-19th century both a leading literary center and a hotbed of abolitionist sentiment.

As luck would have it, Brace, Alcott and Thoreau gathered at Sanborn’s Concord home on New Year’s Day 1860.  Only Gray did not attend. The four men almost certainly shared their initial reactions to Darwin’s work.  This get-together constitutes the starting point for Fuller’s engrossing study, centered on how Gray and the four men in Sanborn’s parlor on that New Year’s Day absorbed Darwin’s book.  Darwin himself is at best a background figure in the study.  Several familiar figures make occasional appearances, among them: Frederick Douglass, renowned orator and “easily the most famous black man in America” (p.91); Bronson Alcott’s author-daughter Louisa May; and American philosopher Ralph Waldo Emerson, Thoreau’s mentor and friend.  Emerson, like Louisa May and her father, was a Concord resident, and Fuller’s study takes place mostly there, with occasional forays to nearby Boston and Cambridge.

Fuller’s study is therefore more tightly circumscribed geographically than its title suggests.  He spends little time detailing the reaction to Darwin’s work in other parts of the United States, most conspicuously in the American South, where any work that might seem to support abolitionism and undermine slavery was anathema.   The study is also circumscribed in time; it takes place mostly in 1860, with most of the rest confined to the first half of the 1860s, up to the end of the American Civil War in 1865.  Fuller barely mentions what is sometimes called “Social Darwinism,” a notion that gained traction in the decades after the Civil War that purported to apply Darwin’s theory of natural selection to the competition between individuals in politics and economics, producing an argument for unregulated capitalism.

Rather, Fuller charts out the paths each of his five main characters traversed in absorbing and assimilating into their own worldviews the scientific, religious and political ramifications of Darwin’s work, particularly during the tumultuous year 1860.  All five were fervent abolitionists.  Sanborn was a co-conspirator in John Brown’s raid.  Thoreau gave a series of eloquent, impassioned speeches in support of Brown.  All were convinced that Darwin’s notion of natural selection had provided still another argument against slavery, based on science rather than morality or economics.  But in varying degrees, all five could also be considered adherents of transcendentalism, a mid-19th century philosophical approach that posited a form of human knowledge that goes beyond, or transcends, what can be seen, heard, tasted, touched or felt.

Although transcendentalists were almost by definition highly individualistic, most believed that a special force or intelligence stood behind nature and that prudential design ruled the universe.  Many subscribed to the notion that humans were the products of some sort of “special creation.”  Most saw God everywhere, and considered the human mind “resplendent with powers and insights wholly distinct from the external world” (p.54).  Transcendentalism was both an effort to invoke the divinity within man and, as Fuller puts it, a “cultural attack on a nation that had become too materialistic, too conformist, too smug about its place in history” (p.66).

Transcendentalism thus hovered in the background in 1860 as all but Sanborn wrestled with the implications of Darwinism (Sanborn spent much of the year fleeing federal authorities seeking his arrest for his role in John Brown’s raid).  Alcott never left transcendentalism, rejecting much of Darwinism.  Gray and Brace initially seemed to embrace Darwinian theories wholeheartedly, but in different ways each pulled back once he grasped the full implications of those theories.  Thoreau was the only one of the five who wholly accepted Darwinism’s most radical implications, using Darwin’s theories to “redirect his life’s work” (p.ix).

Fuller’s study thus combines a deep dive into the New England abolitionist milieu at a time when the United States was fracturing over the issue of slavery with a medium-level dive into the intricacies of Darwin’s theory of natural selection.  But the story Fuller tells is anything but dry and abstract.  With an elegant writing style and an acute sense of detail, Fuller places his five men and their thinking about Darwin in their habitat, the frenetic world of 1860s New England.  In vivid passages, readers can almost feel the chilly January wind whistling through Franklin Sanborn’s parlor that New Year’s Day 1860, or envision the mud accumulating on Henry David Thoreau’s boots as he trudges through the melting snow in the woods on a March afternoon contemplating Darwin.  The result is a lively, easy-to-read narrative that nimbly mixes intellectual and everyday, ground-level history.

* * *

Bronson Alcott, described by Fuller as America’s most radical transcendentalist, never accepted the premises of On the Origin of Species.  Darwin had, in Alcott’s view, “reduced human life to chemistry, to mechanical processes, to vulgar materialism” (p.10).  To Alcott, Darwin seemed “morbidly attached to an amoral struggle of existence, which robbed humans of free will and ignored the promptings of the soul” (p.150). Alcott could not imagine a universe “so perversely cruel as to produce life without meaning.  Nor could he bear to live in a world that was reduced to the most tangible and daily phenomena, to random change and process” (p.188).  Asa Gray, one of America’s most eminent scientists, came to the same realization, but only after thoroughly digesting Darwin and explaining his theories to a wide swath of the American public.

Gray’s initial reaction to Darwin’s work was one of unbounded enthusiasm.  Gray covered nearly every page of the book with his own annotations.  He admired the book because it “reinforced his conviction that inductive reasoning was the proper approach to science” (p.109).  He also admired the work’s “artfully modulated tone, [and] its modest voice, which softened the more audacious ideas rippling through the text” (p.17). Gray was most impressed with Darwin’s “careful judging and clear-eyed balancing of data” (p.110).  To grapple with Darwin’s ideas, Gray maintained, one had to “follow the evidence wherever it led, ignoring prior convictions and certainties or the narrative one wanted that evidence to confirm” (p.110).  Without saying so explicitly, Gray suggested that readers of Darwin’s book had to be “open to the possibility that everything they had taken for granted was in fact incorrect” (p.110).

Gray reviewed On the Origin of Species for the Atlantic Monthly in three parts, appearing in the summer and fall of 1860.  Gray’s articles served as the first encounter with Darwin for many American readers.  The articles elicited a steady stream of letters from respectful readers.  Some responded with “unalloyed enthusiasm” for a new idea which “seemed to unlock the mysteries of nature” (p.134).  Others, however, “reacted with anger toward a theory that proposed to unravel . . . their belief in a divine Being who had placed humans at the summit of creation” (p.134).  But as Gray finished the third Atlantic article, he began to realize that he himself was not entirely at ease with the diminution of humanity’s place in the universe that Darwin’s work implied.

The third Atlantic article, appearing in October 1860, revealed Gray’s increasing difficulty in “aligning Darwin’s theory with his own religious convictions” (p.213).  Gray proposed that natural selection might be “God’s chosen method of creation” (p.214).  This idea seemed to resolve the tension between scientific and religious accounts of origins, making Gray the first to develop a theological case for Darwinian theory.  But the idea that natural selection might be the process by which God had fashioned the world represented what Fuller describes as a “stunning shift for Gray. Before now, he had always insisted that secondary causes were the only items science was qualified to address.  First, or final causes – the beginning of life, the creation of the universe – were the purview of religion: a matter of faith and metaphysics” (p.214).  Darwin responded to Gray’s conjectures by indicating that, as Fuller summarizes the written exchange, the natural world was “simply too murderous and too cruel to have been created by a just and merciful God” (p.211).

In the Atlantic articles, Fuller argues, Gray leapt “beyond his own rules of science, speculating about something that was untestable” (p.214-15).  Gray must have known that his argument “failed to adhere to his own definition of science” (p.216).  But, much like Bronson Alcott, Gray found it “impossible to live in the world Darwin had imagined: a world of chance, a world that did not require a God to operate” (p.216).  Charles Brace, a noted social reformer who founded several institutions for orphans and destitute children, greeted Darwin’s book with an initial enthusiasm that rivaled Gray’s.

Brace claimed to have read On the Origin of Species 13 times.  He was most attracted to the book for its implications for human societies, especially for American society, where nearly half the country accepted and defended human slavery.  Darwin’s book “confirmed Brace’s belief that environment played a crucial role in the moral life of humans” (p.11), and demonstrated that every person in the world, black, white or yellow, was related to everyone else.  The theory of natural selection was thus for Brace the “latest argument against chattel slavery, a scientific claim that could be used in the most important controversy of his time, a clarion call for abolition” (p.39).

Brace produced a tract entitled The Races of the Old World, modeled after Darwin’s On the Origin of Species, which Fuller describes as a “sprawling, ramshackle work” (p.199).  Its central thesis was simple enough: “There is nothing . . . to prove the negro radically different from the other families of man or even mentally inferior to them” (p.199-200).  But much of The Races of the Old World seemed to undercut Brace’s central thesis.  Although the book never defined the term “race,” Brace “apparently believed that though all humans sprang from the same source, some races had degraded over time . . . Human races were not permanent” (p.199-200).  Brace thus struggled to make Darwin’s theory fit his own ideas about race and slavery. “He increasingly bent facts to fit his own speculations” (p.197), as Fuller puts it.

The Races of the Old World revealed Brace’s hesitation in imagining a multi-racial America. He couched in Darwinian terms the difficulty of the races cohabiting, reverting to what Fuller describes as nonsense about blacks not being conditioned to survive in the colder Northern climate.  Brace “firmly believed in the emancipation of slaves, and he was equally convinced that blacks and whites did not differ in their mental capacities” (p.202).  But he nonetheless worried that “race mixing,” or what was then termed race “amalgamation,” might imperil Anglo-Saxon America, the “apex of development. . . God’s favored nation, a place where democracy and Christianity had fused to create the world’s best hope” (p.202).  Brace joined many other leading abolitionists in opposing race “amalgamation.”  His conclusion that “black and brown-skinned people inhabited a lower rung on the ladder of civilization” was shared, Fuller indicates, by “even the most enlightened New England abolitionists” (p.57).

No such misgivings visited Thoreau, who grappled with On the Origin of Species “as thoroughly and as insightfully as any American of the period” (p.11).  As Thoreau first read his copy of the book in late January 1860, a “new universe took form on the rectangular page before him” (p.75).  Prior to his encounter with Darwin, Thoreau’s thought had often “bordered on the nostalgic.  He longed for the transcendentalist’s confidence in a natural world infused with spirit” (p.157).  But Darwin led Thoreau beyond nostalgia.

Thoreau was struck in particular by Darwin’s portrayal of the struggle among species as an engine of creation.  The Origin of Species revealed nature as process, in constant transformation.  Darwin’s book directed Thoreau’s attention “away from fixed concepts and hierarchies toward movement instead” (p.144-45).  The idea of struggle among species “undermined transcendentalist assumptions about the essential goodness of nature, but it also corroborated many of Thoreau’s own observations” (p.137).  Thoreau had “long suspected that people were an intrinsic part of nature – neither separate nor entirely alienated from it” (p.155).  Darwin now enabled Thoreau to see how “people and the environment worked together to fashion the world,” providing a “scientific foundation for Thoreau’s belief that humans and nature were part of the same continuum” (p.155).

Darwin’s natural selection, Thoreau wrote, “implies a greater vital force in nature, because it is more flexible and accommodating, and equivalent to a sort of constant new creation” (p.246).  The phrase “constant new creation” in Fuller’s view represents an “epoch in American thought” because it “no longer relies upon divinity to explain the natural world” (p.246).  Darwin thus propelled Thoreau to a radical vision in which there was “no force or intelligence behind Nature, directing its course in a determined and purposeful manner.  Nature just was” (p.246-47).

How far Thoreau would have taken these ideas is impossible to know. He fell ill in December 1860, stricken with influenza exacerbated by tuberculosis, and died in June 1862, with Americans fighting other Americans on the battlefield over the issue of slavery.

* * *

Fuller compares Darwin’s On the Origin of Species to a Trojan horse.  It entered American culture “using the newly prestigious language of science, only to attack, once inside, the nation’s cherished beliefs. . . With special and desolating force, it combated the idea that God had placed humans at the peak of creation” (p.213).  That the book’s attack did not spare even New England’s best-known abolitionists and transcendentalists demonstrates just how unsettling the attack was.

Thomas H. Peebles

La Châtaigneraie, France

May 18, 2020

 


The Power of Human Rights

 

Samantha Power, The Education of an Idealist:

A Memoir 

By almost any measure, Samantha Power should be considered an extraordinary American success story. An immigrant from Ireland who fled the Emerald Isle with her mother and brother at a young age to escape a turbulent family situation, Power earned degrees from Yale University and Harvard Law School, rose to prominence in her mid-20s as a journalist covering civil wars and ethnic cleansing in Bosnia and the Balkans, won a Pulitzer Prize for a book on 20th century genocides, and helped found the Carr Center for Human Rights Policy at Harvard’s Kennedy School of Government, where she served as executive director — all before age 35.  Then she met an ambitious junior Senator from Illinois, Barack Obama, and her career really took off.

Between 2009 and 2017, Power served in the Obama administration almost continually, first on the National Security Council and subsequently as Ambassador to the United Nations.  In both capacities, she became the administration’s most outspoken and influential voice for prioritizing human rights, arguing regularly for targeted United States and multi-lateral interventions to protect individuals from human rights abuses and mass atrocities, perpetrated in most cases by their own governments.  In what amounts to an autobiography, The Education of an Idealist: A Memoir, Power guides her readers through  the major foreign policy crises of the Obama administration.

Her life story, Power tells her readers at the outset, is one of idealism, “where it comes from, how it gets challenged, and why it must endure” (p.xii).  She is quick to emphasize that hers is not a story of how a person with “lofty dreams” about making a difference in the world came to be “educated” by the “brutish forces” (p.xii) she encountered throughout her professional career.  So what then is the nature of the idealist’s “education” that provides the title to her memoir?  The short answer probably lies in how Power learned to make her idealistic message on human rights both heard and effective within the complex bureaucratic structures of the United States government and the United Nations.

But Power almost invariably couples this idealistic message with the view that the promotion and protection of human rights across the globe is in the United States’ own national security interests; and that the United States can often advance those interests most effectively by working multi-laterally, through international organizations and with like-minded states.  The United States, by virtue of its multi-faceted strengths – economic, military and cultural – is in a unique position to influence the actions of other states, from its traditional allies all the way to those that inflict atrocities upon their citizens.

Power acknowledges that the United States has not always used its strength as a positive force for human rights and human betterment – one immediate example is the 2003 Iraq invasion, which she opposed. Nevertheless, the United States retains a reservoir of credibility sufficient to be effective on human rights matters when it chooses to do so.  Although Power is sometimes labeled a foreign policy “hawk,” she recoils from that label.  To Power, the military is among the last of the tools that should be considered to advance America’s interests around the world.

Into this policy-rich discussion, Power weaves much detail about her personal life, beginning with her early years in Ireland, the incompatibilities between her parents that prompted her mother to take her and her brother to the United States when she was nine, and her efforts as a schoolgirl to become American in the full sense of the term. After numerous failed romances, she finally met Mr. Right, her husband, Harvard Law School professor Cass Sunstein (who also served briefly in the Obama administration). The marriage produced a boy and a girl with lovely Irish names, Declan and Rían, both born while Power was in government.  With much emphasis upon her parents, husband, children and family life, the memoir is also a case study of how professional women balance the exacting demands of high-level jobs with the formidable responsibilities attached to being a parent and spouse.  It’s a tough balancing act for any parent, but especially for women, and Power admits that she did not always strike the right balance.

Memoirs by political and public figures are frequently attempts to write one’s biography before someone else does, and Power’s whopping 550-page work seems to fit this rule.  But Power provides much candor – a willingness to admit to mistakes and share vulnerabilities – that is often missing in political memoirs. Refreshingly, she also abstains from serious score settling.  Most striking for me is the nostalgia that pervades the memoir.  Power takes her readers down memory lane, depicting a now bygone time when the United States cared about human rights and believed in bi- and multi-lateral cooperation to accomplish its goals in its dealings with the rest of the world – a time that sure seems long ago.

* * *

Samantha Jane Power was born in 1970 to Irish parents, Vera Delaney, a doctor, and Jim Power, a part-time dentist.  She spent her early years in Dublin, in a tense family environment where, she can see now, her parents’ marriage was coming unraveled.  Her father put in far more time at Hartigan’s, a local pub in the neighborhood where he was known for his musical skills and “holding court,” than he did at his dentist’s office.  Although young Samantha didn’t recognize it at the time, her father had a serious alcohol problem, serious enough to lead her mother to escape by immigrating to the United States with the couple’s two children, Samantha, then age nine, and her brother Stephen, two years younger. They settled in Pittsburgh, where Samantha at a young age set about to become American, as she dropped her Irish accent, tried to learn the intricacies of American sports, and became a fervent Pittsburgh Pirates fan.

But the two children were required under the terms of their parents’ custody agreement to spend time with their father back in Ireland. On her trip back at Christmas 1979, Samantha’s father informed the nine-year-old that he intended to keep her and her brother with him.  When her mother, who was staying nearby, showed up to object and collect her children to return to the United States, a parental confrontation ensued which would traumatize Samantha for decades.  The nine-year-old found herself caught between the conflicting commands of her two parents and, in a split-second decision, left with her mother and returned to Pittsburgh. She never again saw her father.

When her father died unexpectedly five years later, at age 47 of alcohol-related complications, Samantha, then in high school, blamed herself for her father’s death and carried a sense of guilt with her well into her adult years. It was not until she was thirty-five, after many therapy sessions, that she came to accept that she had not been responsible for her father’s death.  Then, a few years later, she made the mistake of returning to Hartigan’s, where she encountered the bar lady who had worked there in her father’s time.   Mostly out of curiosity, Power asked her why, given that so many people drank so much at Hartigan’s, her father had been the only one who died. The bar lady’s answer was matter-of-fact: “Because you left” (p.192) — not what Power needed to hear.

Power had by then already acquired a public persona as a human rights advocate through her work as a journalist in the 1990s in Bosnia, where she called attention to the ethnic cleansing that was sweeping the country in the aftermath of the collapse of the former Yugoslavia.  Power ended up writing for a number of major publications, including The Economist, the New Republic and the Washington Post.  She was among the first to report on the fall of Srebrenica in July 1995, the largest single massacre in Europe since World War II, in which around 10,000 Muslim men and boys were taken prisoner and “seemed to have simply vanished” (p.102). Although the United States and its NATO allies had imposed a no-fly zone over Bosnia, Power hoped the Clinton administration would commit to employing ground troops to prevent further atrocities. But she did not yet enjoy the clout to have a real chance at making her case directly with the administration.

Power wrote a chronology of the conflict, Breakdown in the Balkans, which was later put into book form and attracted attention from think tanks, and the diplomatic, policy and media communities.  Attracting even more attention was A Problem from Hell: America and the Age of Genocide, her book exploring American reluctance to take action in the face of 20th century mass atrocities and genocides.  The book appeared in 2002, and won the 2003 Pulitzer Prize for General Non-Fiction.  It also provided Power with her inroad to Senator Barack Obama.

At the recommendation of a politically well-connected friend, in late 2004 Power sent a copy of the book to the recently elected Illinois Senator who had inspired the Democratic National Convention that summer with an electrifying keynote address.  Obama’s office scheduled a dinner for her with the Senator which was supposed to last 45 minutes.  The dinner went on for four hours as the two exchanged ideas about America’s place in the world and how, why and when it should advance human rights as a component of its foreign policy.  Although Obama considered Power to be primarily an academic, he offered her a position on his Senate staff, where she started working late in 2005.

Obama and Power would then be linked professionally more or less continually until the end of the Obama presidency in January 2017.   Once Obama enters the memoir, at about the one-third point, it becomes as much his story as hers. The two did not always see the world and specific world problems in the same way, but it’s clear that Obama had great appreciation both for Power’s intelligence and her intensity. He was a man who enjoyed being challenged intellectually, and plainly valued the human rights perspective that Power brought to their policy discussions even if he wasn’t prepared to push as far as Power advocated.

After Obama threw his hat in the ring for the 2008 Democratic Party nomination, Power became one of his primary foreign policy advisors and, more generally, a political operative. It was not a role that fit Power comfortably and it threatened to be short-lived.  In the heat of the primary campaign, with Obama and Hillary Clinton facing off in a vigorously contested battle for their party’s nomination, Power was quoted in an obscure British publication, the Scotsman, as describing Clinton as a “monster.” The right-wing Drudge Report picked up the quotation, whose accuracy Power does not contest, and suddenly Power found herself on the front page of major newspapers, the subject of a story she did not want.  Obama’s closest advisors were of the view that she would have to resign from the campaign.  But the candidate himself, who loved sports metaphors, told Power only that she would have to spend some time in the “penalty box” (p.187).  Obama’s relatively soft reaction was an indication of the potential he saw in her and his assessment of her prospective value to him if he prevailed in the primaries and the general election.

Power’s time in the penalty box had expired when Obama, having defeated Clinton for his party’s nomination, won a resounding victory in the general election in November 2008.  Obama badly wanted Power on his team in some capacity, and the transition team placed her on the President’s National Security Council as principal deputy for international organizations, especially the United Nations.  But she was also able to carve out a concurrent position for herself as the President’s Senior Director for Human Rights.  In this portion of the memoir, Power describes learning the jargon and often-arcane skills needed to be effective on the council and within the vast foreign policy bureaucracy of the United States government.  As the official solely responsible for human rights, Power found that she had some leeway in deciding which issues to concentrate on and bring to the attention of the full Council.  Her mentor Richard Holbrooke advised her that she could be most effective on subjects for which there was limited United States interest – pick “small fights,” Holbrooke advised.

Power had a hand in a string of “small victories” while on the National Security Council: coaxing the United States to rejoin a number of UN agencies from which the Bush Administration had walked away; convincing President Obama to raise his voice over atrocities perpetrated by governments in Sri Lanka and Sudan against their own citizens; being appointed White House coordinator for Iraqi refugees; helping create an inter-agency board to coordinate the United States government’s response to war crimes and atrocities; and encouraging increased emphasis upon lesbian, gay, bisexual and transgender (LGBT) issues overseas.  In pursuit of the latter, Obama delivered an address at the UN General Assembly on LGBT rights, and thereafter issued a Presidential Memorandum directing all US agencies to consider LGBT issues explicitly in crafting overseas assistance (disclosure: while with the Department of Justice, I served on the department’s portion of the inter-agency Atrocity Prevention Board, and represented the department in inter-agency coordination on the President’s LGBT memorandum; I never met Power in either capacity).

But the Arab Spring that erupted in late 2010 and early 2011 presented  anything but small issues and resulted in few victories for the Obama administration.  A “cascade of revolts that would reorder huge swaths of the Arab world,” the Arab Spring ended up “impacting the course of Obama’s presidency more than any other geopolitical development during his eight years in office” (p.288), Power writes, and the same could be said for Power’s time in government.  Power was among those at the National Security Council who pushed successfully for United States military intervention in Libya to protect Libyan citizens from the predations of their leader, Muammar Qaddafi.

The intervention, backed by a United Nations Security Council resolution and led jointly by the United States, France and the United Kingdom, saved civilian lives and contributed to Qaddafi’s ouster and death.  But President Obama was determined to avoid a longer-term and more open-ended United States commitment, and the mission stopped short of the follow-up needed to bring stability to the country.  With civil war in various guises continuing to this day, Power suggests that the outcome might have been different had the United States continued its engagement in the aftermath of Qaddafi’s death.

Shortly after Power became US Ambassador to the United Nations, the volatile issue of an American military commitment arose again, this time in Syria in August 2013, when irrefutable proof came to light that Syrian leader Bashar al-Assad was using chemical weapons in his effort to suppress uprisings within the country.  The revelations came 13 months after Obama had asserted that use of such weapons would constitute a “red line” that would move him to intervene militarily in Syria.  Power favored targeted US air strikes within Syria.

Obama came excruciatingly close to approving such strikes.  He not only concluded that the “costs of not responding forcefully were greater than the risks of taking military action” (p.369), but was prepared to act without UN Security Council authorization, given the certainty of a Russian veto of any Security Council resolution for concerted action.  With elevated stakes for “upholding the international norm against the use of chemical weapons,” Power writes, Obama was “prepared to operate with what White House lawyers called a ‘traditionally recognized legal basis under international law’” (p.369).

But almost overnight, Obama decided that he needed prior Congressional authorization for a military strike in Syria, a decision taken seemingly with little effort to ascertain whether there was sufficient support in Congress for such a strike.  With neither the Congress nor the American public supporting military action within Syria to save civilian lives, Obama backed down.  On no other issue did Power see Obama as torn as he was on Syria,  “convinced that even limited military action would mire the United States in another open-ended conflict, yet wracked by the human toll of the slaughter.  I don’t believe he ever stopped interrogating his choices” (p.508).

Looking back at that decision with the passage of more than five years, Power’s disappointment remains palpable.  The consequences of inaction in Syria, she maintains, went:

beyond unfathomable levels of death, destruction, and displacement. The spillover of the conflict into neighboring countries through massive refugee flows and the spread of ISIS’s ideology has created dangers for people in many parts of the world. . . [T]hose of us involved in helping devise Syria policy will forever carry regret over our inability to do more to stem the crisis.  And we know the consequences of the policies we did choose. For generations to come, the Syrian people and the wide world will be living with the horrific aftermath of the most diabolical atrocities carried out since the Rwanda genocide (p.513-14).

But if incomplete action in Libya and inaction in Syria constitute major disappointments for Power, she considers exemplary the response of both the United States and the United Nations to the July 2014 outbreak of the Ebola virus that occurred in three West African countries, Guinea, Liberia and Sierra Leone.  United States experts initially foresaw more than one million infections of the deadly and contagious disease by the end of 2015.  The United States devised its own plan to send supplies, doctors and nurses to the region to facilitate the training of local health workers to care for Ebola patients, along with 3,000 military personnel to assist with on-the-ground logistics.  Power was able to talk President Obama out of a travel ban to the United States from the three impacted countries, a measure favored not only by Donald Trump, then contemplating an improbable run for the presidency, but also by many members of the President’s own party.

At the United Nations, Power was charged with marshaling global assistance.   She convinced 134 fellow Ambassadors to co-sponsor a Security Council resolution declaring the Ebola outbreak a public health threat to international peace and security, the largest number of co-sponsors for any Security Council resolution in UN history and the first ever directed to a public health crisis.  Thereafter, UN Member States committed $4 billion in supplies, facilities and medical treatments.  The surge of international resources that followed meant that the three West African countries “got what they needed to conquer Ebola” (p.455).  At different times in 2015, each of the countries was declared Ebola-free.

The most deadly and dangerous Ebola outbreak in history was contained, Power observes, above all because of the “heroic efforts of the people and governments of Guinea, Liberia and Sierra Leone” (p.456). But America’s involvement was also crucial.  President Obama provided what she describes as an “awesome demonstration of US leadership and capability – and a vivid example of how a country advances its values and interests at once” (p.438).  Yet the multi-national, collective success further illustrated “why the world needed the United Nations, because no one country – even one as powerful as the United States – could have slayed the epidemic on its own” (p.457).

Although Russia supported the UN Ebola intervention, Power more often found herself in an adversarial posture with Russia on both geo-political and UN administrative issues.  Yet, she used creative diplomatic skills to develop a more nuanced relationship with her Russian counterpart, Vitaly Churkin.  Churkin, a talented negotiator and master of the art of strategically storming out of meetings, valued US-Russia cooperation and often “pushed for compromises that Moscow was disinclined to make” (p.405).  Over time, Power writes, she and Churkin “developed something resembling genuine friendship” (p.406). But “I also spent much of my time at the UN in pitched, public battle with him” (p.408).

The most heated of these battles ensued after Russia invaded Ukraine in February 2014, a flagrant violation of international law. Later that year, troops associated with Russia shot down a Malaysian passenger jet, killing all passengers aboard.  In the UN debates on Ukraine, Power found her Russian counterpart “defending the indefensible, repeating lines sent by Moscow that he was too intelligent to believe and speaking in binary terms that belied his nuanced grasp of what was actually happening” (p.426). Yet, Power and Churkin continued to meet privately to seek solutions to the Ukraine crisis, none of which bore fruit.

While at the UN, Power went out of her way to visit the offices of the ambassadors of the smaller countries represented in the General Assembly, many of whom had never received a United States Ambassador.  During her UN tenure, she managed to meet personally with the ambassadors from every country except North Korea.  Power also started a group that gathered the UN’s 37 female Ambassadors together one day a week for coffee and discussion of common issues.  Some involved substantive matters before the UN, but just as often the group focused on workplace issues that affected the women ambassadors specifically as women, issues their male colleagues never had to confront.

* * *

Donald Trump’s surprise victory in November 2016 left Power stunned.  His nativist campaign to “Make America Great Again” seemed to her like a “repudiation of many of the central tenets of my life” (p.534).  As an immigrant, a category Trump seemed to relish denigrating, she “felt fortunate to have experienced many countries and cultures. I saw the fate of the American people as intertwined with that of individuals elsewhere on the planet.   And I knew that if the United States retreated from the world, global crises would fester, harming US interests” (p.534-35).  As Obama passed the baton to Trump in January 2017, Power left government.

Not long after, her husband suffered a near-fatal automobile accident, from which he recovered. Today, the pair team-teach courses at Harvard, while Power seems to have found the time for her family that proved so elusive when she was in government.  She is coaching her son’s baseball team and helping her daughter survey rocks and leaves in their backyard.  No one would begrudge Power this quality time with her family. But her memoir will likely leave many readers wistful, daring to hope that there may someday be room again for her and her energetic idealism in the formulation of United States foreign policy.

Thomas H. Peebles

La Châtaigneraie, France

April 26, 2020


Filed under American Politics, American Society, Politics, United States History

School Girls on the Front Lines of Desegregation

 

Rachel Devlin, A Girl Stands in the Door:

The Generation of Young Women Who Desegregated America’s Schools

(Basic Books)

When World War II ended, public schools in the United States were still segregated by race throughout much of the country.  Segregated schools were mandated by state legislatures in all the states of the former Confederacy (“the Deep South”), along with Washington, D.C., Delaware and Arizona, while a handful of American states barred racial segregation in their public schools.  In the remainder, the decision whether to segregate was left to local jurisdictions.  Racial segregation of public schools found its constitutional sanction in Plessy v. Ferguson, the United States Supreme Court’s 1896 decision which held that equal protection of the law under the federal constitution did not prohibit states from maintaining public facilities that were “separate but equal.”

But “separate but equal” was a cruel joke, particularly as applied to public schools: in almost every jurisdiction which maintained segregated schools, those set aside for African-Americans were by every objective standard unequal and inferior to counterpart white schools.  In 1954, the Supreme Court, in one of its most momentous decisions, Brown v. Board of Education of Topeka, Kansas, invalidated the Plessy “separate but equal” standard as applied to public schools, holding that in the school context separate was inherently unequal.  The decision preceded by a year and a half the Montgomery, Alabama, bus boycott that made both Rosa Parks and Martin Luther King, Jr., household names.  The litigation pathway that led to Brown arguably constituted the opening salvo of what we now term the modern Civil Rights Movement.

That pathway has been the subject of numerous popular and scholarly works, the best known of which is Richard Kluger’s magisterial 1975 work Simple Justice.  In Kluger’s account and most others, the National Association for the Advancement of Colored People (NAACP) and its Legal Defense Fund (LDF), which instituted Brown and several of its predecessor cases, are front and center, with future Supreme Court justice Thurgood Marshall, the LDF’s lead litigator, the undisputed lead character.  Yet, Rachel Devlin, an associate professor of history at Rutgers University, maintains that earlier studies of the school desegregation movement, including that of Kluger, overlook a critical point: the students who desegregated educational institutions – the “firsts,” to use Devlin’s phrase — were mostly girls and young women.

Devlin’s research revealed that only one of the early, post-World War II primary and secondary school desegregation cases that paved the way to the Brown decision was filed on behalf of a boy.  Looking at those who “attempted to register at white schools, testified in court, met with local white administrators and school boards, and talked with reporters from both the black and white press,” Devlin saw almost exclusively schoolgirls.  This disparity “held true in the Deep South, upper South, and Midwest” (p.x). After the Brown decision, the same pattern prevailed: “girls and young women vastly outnumbered boys as the first to attend formerly all-white schools” (p.x).

Unlike Kluger, Devlin does not focus on lawyers and lawsuits but rather on the “largely young, feminine work that brought school desegregation into the courts” (p.xi).  She begins with court challenges to state enforced segregation at the university level, some of which began before World War II.  She then proceeds to a host of post-World War II communities that challenged racial segregation in primary and secondary schools in the late 1940s and early 1950s.  The Brown decision itself, a ruling on segregated schools in Topeka, Kansas, merits only a few pages, after which she portrays the first African-American students to enter previously all-white schools during the second half of the 1950s and into the 1960s.  The pre-Brown challenges to segregated public education that Devlin highlights took place in Washington, D.C., Kansas, Delaware, Texas and Virginia. In her post-Brown analysis, she turns to the Deep South, to communities in Louisiana, Georgia and South Carolina.

Devlin’s intensely factual and personality-driven narrative at times falls victim to a forest-and-trees problem: she focuses on a multitude of individuals — the trees — to the point that the reader can easily lose sight of the forest — how the featured individuals fit into the overall school desegregation movement.  Yet, there are a multitude of lovely trees to behold in Devlin’s forest – heroic and endearing schoolgirls and the adults who supported them, both men and women, all willing to confront entrenched racial segregation in America’s public schools.

* * *

School desegregation, Devlin writes, differed from other civil rights battles, such as desegregation of lunch counters, public transportation, and parks, in that interacting with white people was not “fleeting or ‘fortuitous,’ but central to the project itself.  School desegregation required sustained interactions with white school officials and students. This fact called for a different approach than other forms of civil rights activism” (p.xxiv).   But Devlin also emphasizes that this different approach gave rise to controversy among affected African-Americans.

In almost every community she studied, there was a dissident African-American faction that opposed desegregation of all-white schools, favoring direct pressure and court cases designed to force school authorities to make good on the “equal” portion of “separate but equal.”  Parents who favored this less frontal approach, while “willing to protest unequal schools, simply wanted a better education for their children while they were still young enough to receive it, not a long, hard campaign against a long-standing Supreme Court precedent” (p.167).  Devlin demonstrates that this quest for equalization, however understandable, was at best quixotic. Time and time again, she shows, the white power structure in the communities she studies had no serious intention of equalizing black and white schools.

Why girls and young women predominated in school desegregation efforts is as much a part of Devlin’s story as the particulars of those efforts at the institutions and in the communities she studies.  After WWII, she notes, there was a “strong, though unstated, cultural assumption that the war to end school segregation was a girls’ war, a battle for which young women and girls were specially suited” (p.xvi).  With the example of boys and young men who had gone off to fight in World War II fresh in everyone’s minds, Devlin speculates, girls and young women may have felt an “ethical compulsion to act at a young age” (p.xvi).

Devlin was able to interview several of the female firsts for her book as they looked back on their experience in desegregating schools several decades earlier.  These women, she indicates, had been inspired as school girls “not only by a sense of obligation and individual calling but also by the opportunity to do something important and highly visible in a world and at a time when young women did not often earn much public acclaim” (p.225). The boys and young men she studied, by contrast, manifested a “desire to distance themselves from an overt, individual commitment to desegregating schools” (p.223).  Leaving was more of an option for high school age boys who felt alienated in newly desegregated schools.  They had “more mobility – and autonomy – than young women, and it allowed them to walk away from the school desegregation process when they felt it was not working for them” (p.196).   Leaving for girls “did not feel like a choice, both because they understood their parents’ expectations of them and because they had fewer alternatives” (p.196).

* * *

The pathway to Brown in Devlin’s account starts at the university level with Lucille Bluford and Ida Mae Sipuel, two lesser-known women who were denied admission because of their race to, respectively, the University of Missouri School of Journalism and the University of Oklahoma Law School.  Both saw their court cases overshadowed by those of men, Lloyd Gaines and Herman Sweatt, pursuing university level desegregation in court at the same time.  But while the two men’s cases established major Supreme Court precedents, both proved to be disappointing plaintiffs and spokesmen for the desegregation cause, in sharp contrast to Bluford and Sipuel.

Gaines was the beneficiary of one of the Supreme Court’s first major decisions involving higher education, Gaines v. Canada, in which the Court ruled in 1938 that the State of Missouri was required either to admit Gaines to the University of Missouri Law School or create a separate facility for him.  Missouri chose the latter option, which Gaines refused.  But he thereafter went missing.  He was last seen taking a train to Chicago and was never heard from again.  Bluford, then a seasoned journalist working for the African-American newspaper the Kansas City Call, not only covered the Gaines litigation but also set out to gain admission herself to the University of Missouri’s prestigious School of Journalism.

Both “hardheaded and gregarious” (p.32), Bluford doggedly pursued admission to the university’s journalism school between 1939 and 1942.  In her court case, her lawyer, the NAACP’s Charles Houston, provided the book’s title in his closing argument when he told the court: “A girl stands at the door and a generation waits outside” (p.27).  When Bluford won a victory in court in 1942, Missouri chose to close its journalism school, citing low wartime enrollment, rather than admit Bluford.  But with her uncanny ability to find “significance in small acts of decency and mutual acknowledgement in everyday encounters” (p.11), Bluford turned her energies to reporting on school desegregation cases throughout the country, including both Sipuel’s quest to enter the University of Oklahoma Law School and the Kansas desegregation cases that led to Brown.

Sipuel agreed to challenge the University of Oklahoma Law School’s refusal to admit African-Americans only after her brother Lemuel turned down the NAACP’s request to serve as plaintiff in the case.  In 1946, she refused Oklahoma’s offer to create a separate “Negro law school,” and two years later won a major Supreme Court case when the Court ruled that Oklahoma was obligated to provide her with legal education equal to that of whites.  Sipuel became the near perfect first at the law school, Devlin writes, personifying the uncommon array of skills required in that sensitive position:  “personal ambition combined with an ability to withstand public humiliation, charisma in front of the camera and self-sacrificing patience, the appearance of openness with the black and white press corps alongside an implacable determination” (p.67).

The “girl who started the fight,” as one black newspaper described Sipuel, became “something of a regional folk hero” (p.52) as a role model for future desegregation plaintiffs.  The “revelation that school desegregation was in their grasp came not from the persuasive power of NAACP officials and lawyers,” Devlin writes, but from the “‘young girl’ who would not be turned down” (p.37).  Sipuel went on to become the law school’s first African American graduate and thereafter the first African-American to pass the Oklahoma bar.

Sipuel’s engaging and exuberant public persona contrasted with that of Herman Sweatt, who sought to enter the University of Texas’s flagship law school in Austin.  In a 1950 case bearing his name, Sweatt v. Painter, the Supreme Court rejected Texas’ contention that it could satisfy the requirements of the constitution’s equal protection clause by consigning Sweatt to a “Negro law school” it had established in Houston.  The Court’s sweeping decision outlawed segregation in graduate education in its entirety.  But although Sweatt did not go missing in action like Lloyd Gaines, he never completed his course of study at the University of Texas Law School and proved to be ill-suited to the high-visibility, high-pressure role of a desegregation plaintiff.  He exuded neither Sipuel’s enthusiastic commitment to desegregated higher education, nor her grace under fire.

As the Supreme Court was rewriting the rules of university level education, dozens of cases challenging primary and secondary school segregation were percolating in jurisdictions across America, with Washington, D.C., and Merriam, Kansas, near Kansas City, providing the book’s most memorable characters.  Rigidly segregated Washington, the nation’s capital, had several lawsuits going simultaneously, each of which featured a strong father standing behind a courageous daughter.

First out of the gate was 14-year-old Marguerite Carr.  Amidst much fanfare, in 1947 Marguerite’s father took her to enroll at a newly built white middle school two blocks from her home, where she faced off with the school principal.  When the principal told her, “you don’t want to come here,” Carr smiled, a “sign of social reciprocity, trustworthiness, a willingness to engage,” yet at the same time told the principal respectfully but firmly, “I do want to come to this school” (p.ix).  Carr’s firm but gracious response was pitch perfect, Devlin argues, meeting the “contradictory requirements inherent in such confrontations” (p.ix).

Marguerite’s court case coincided with that of Karla Galaza, a Mexican-American who had been attending a black vocational school with a strong program in dress design until school authorities discovered that she was not black and barred her from the school.  Her stepfather, a Mexican-American activist, filed suit on his daughter’s behalf.  Simultaneously, Gardner Bishop surged into a leadership position during an African-American student strike challenging segregated education in Washington.  Bishop, by day a barber, was an activist who thrust his somewhat reluctant daughter Judine into the strike and subsequent litigation.  Bishop described himself as an outsider in Washington’s desegregation battle, representing the city’s African-American working class rather than its black middle class.  None of these cases culminated in a major court decision.

The NAACP later chose Spotswood Bolling as the lead plaintiff over a handful of girls in the lawsuit that accompanied Brown to the Supreme Court.  The young Bolling was another elusive male plaintiff, dodging all reporters and photographers.  His discomfort with the press “sets in high relief the performances of girl plaintiffs with reporters in the late 1940s” (p.173), Devlin argues.  Girls and young women “felt it was their special responsibility to find ways to address such inquiries. Bolling evidently did not” (p.174).   But the case bearing his name, Bolling v. Sharpe, decided at the same time as Brown, held that segregation in Washington’s public schools was unconstitutional even though, as a federal district rather than a state, Washington was not technically bound by the constitution’s equal protection clause.

In South Park, Kansas, an unincorporated section of Merriam, located outside Kansas City, Esther Brown, arguably the book’s most unforgettable character, led a student strike over segregated schools.  Brown, a 23-year-old Jewish woman, committed radical, and communist sympathizer, cast herself as merely a “housewife with a conscience” — a “deliberately humble, naïve, and conservative image” (p.108) that she invoked constantly in her dealings with the public.  Lucille Bluford covered the strike for the Kansas City Call.  Bluford and the “White Mrs. Brown,” as she was called, subsequently became friends (Esther Brown was not related to Oliver Brown, the named plaintiff in the Brown case).

During the South Park student strike, Esther Brown went out on a limb to promise that she would find a way to pay the teachers herself.  She organized a Billie Holiday concert, but most of her fundraising targeted people of modest means – farmers, laborers, and domestics.  She eventually persuaded Thurgood Marshall that the NAACP should initiate a court case, despite Marshall’s initial reservations — he was suspicious of what he described as a “one woman show” (p.125).  Although the lawsuit was filed on behalf of equal numbers of boys and girls, Patricia Black, then eight years old, was chosen to testify in court — “setting another pattern of female participation for the cases to come” (p.111).  Black, who wore a white bow in her hair when she testified, reflected years later that she had been “taught how to act,” which meant “having manners . . . sitting up straight . . . making eye contact, being erect, and [being] nice” (p.139).

The South Park lawsuit led to the NAACP’s first major desegregation victory below the university level.  Black grade school students successfully entered the white school in the fall of 1949. The South Park case also inspired the challenge to segregated schooling in Topeka that culminated in the Brown decision.  At the trial in Brown, a 9-year-old girl, Kathy Cape, accepted the personal risk and outsized responsibility of testifying, rather than the adult named plaintiff, Oliver Brown.

With the Supreme Court’s ruling in Brown meriting barely more than a page, Devlin turns in the last third of the book to the schoolgirls who entered previously all-white schools in the aftermath of the ruling.  Here, more than in the book’s earlier portions, she describes in stark terms the white opposition to desegregation which, although widespread, was especially ferocious in the Deep South, where the “vast majority of school boards angrily fought school desegregation with every resource available to them” (p.192).  Devlin notes that between 1955 and 1958, southern legislatures passed nearly five hundred laws to impede implementation of Brown.

In New Orleans, three girls, Tessie Prevost, Leona Tate and Ruby Bridges, were chosen to be firsts as eight-year-olds at Semmes Elementary School.  Years later, Tessie described to Devlin what she, Leona and Ruby had endured at Semmes.  Administrators, teachers, and fellow pupils “did everything in their power to break us” (p.213-14), Prevost recounted.  Even teachers incited violence against the girls:

The teachers were no better than the kids. They encouraged them to fight us, to do whatever it took.  Spit on us. We couldn’t even eat in the cafeteria; they’d spit on our food – we could hardly use the restrooms . . . They’d punch you, trip you, kick you . . . They’d push you down the steps . . . I got hit by a bat . . . in the face . . . It was every day. And the teachers encouraged it . . . Every day.  Every day (p.214).

The New Orleans girls’ experience was typical of the young firsts from the other Southern communities Devlin studied, including Baton Rouge, Louisiana; Albany, Georgia; and Charleston, South Carolina.  Nearly all experienced relentless abuse, “not simply violence and aggression but a systemic, all-encompassing, organized form of endless oppression” (p.214). Throughout the South, black schoolgirls demonstrated an extraordinary ability to “withstand warfare within the school when others could not,” which Devlin characterizes as a “barometer of their determination, courage, ability, and strength” (p.218).

* * *

Devlin acknowledges a growing contemporary disillusionment with the Brown decision and school integration generally among legal scholars, historians and ordinary African-Americans.  But the school desegregation firsts who met with Devlin for this book uniformly believe that their actions more than a half-century earlier had “transformed the arc of American history for the better” (p.268).   Even if Brown no longer occupies quite the exalted place it once enjoyed in the iconography of the modern Civil Rights Movement, the schoolgirls and supporting adults whom Devlin portrays in this deeply researched account deserve our full admiration and gratitude.

 

Thomas H. Peebles

La Châtaigneraie, France

April 8, 2020

 


Filed under American Society, United States History