No Hero

David Kertzer, The Pope at War:

The Secret History of Pius XII, Mussolini and Hitler

(Random House)

Pope Pius XII, born Eugenio Pacelli in 1876, assumed the papacy on the eve of World War II, in late winter 1939, and steered the Vatican through history’s most devastating war.  His stewardship during this perilous time has attracted much attention, often polemical, from scholars and the public, making him arguably the 20th century’s most controversial pope.  Two contrasting appraisals of the pope during the war years have emerged.   David Kertzer, Brown University professor and one of the English-speaking world’s leading scholars of 20th century Italy, summarizes them at the outset of his most recent work, The Pope at War: The Secret History of Pius XII, Mussolini and Hitler.

To his defenders, Pius XII was a heroic figure who had to maneuver between belligerents to keep the Vatican and Catholicism afloat amid the tumult of war.  To paraphrase Josef Stalin, the Pope had no legions available to defend the Vatican.  Rather, his institutional defense of the Vatican had to rely upon guile, political maneuvering behind the scenes, keeping a low profile, and remaining on good terms with all the era’s belligerents.

The alternative view is highly critical of Pius XII, especially his steadfast refusal to use the moral authority of his position to speak out against the treatment of Europe’s Jews, even when it was brought to his attention that Nazi Germany’s all-too-evident persecution of Jews had given way to their extermination.  In this view, the pope was easily intimidated and manipulated by Axis dictators Adolf Hitler and Benito Mussolini, and “ever apt to embrace expediency over principle” (p.xxix).   John Cornwell’s biography, Hitler’s Pope, whose title Kertzer characterizes as deliberately provocative, captures the spirit of this perspective.

Kertzer is closer to the second view.  But he adds nuance to both in no small measure through a treasure trove of documents maintained in the Vatican archives on Pius XII’s time as pope, sealed upon his death in 1958 and not opened until March 2020.   Kertzer is the first to take full advantage of the recently released documents, which include internal Vatican memoranda prepared at the pope’s request, along with reports sent to the pope from his nuncios and other church leaders across Europe during the war years.  He skillfully blends these documents with others from Italian, German, French, British, and American archives, especially diplomatic notes and cables, many of which have themselves only recently been declassified.  The recently available documents allow him to tell a “more complete story” (p.xxix-xxx) of the controversial pope’s actions and why he took the decisions he did.  The result is likely to stand for many years as the definitive account of Pius XII during World War II.

In a narrative that revolves around diplomatic documents, especially notes of meetings between Vatican officials and representatives of other governments, Kertzer casts much light on the book’s central area of inquiry, the pope’s reaction to Nazi Germany’s unfolding project to exterminate Europe’s Jews.  Kertzer gives full due to the view of the pope as the heroic guardian of the Vatican’s institutional interests during the war years.  Yet, his circumspect and thorough presentation of evidence leaves little doubt that if speaking out for embattled Jews and prioritizing the saving of Jewish lives was the measure of heroism during World War II, Pius XII was no hero.

From the time war broke out in Europe in September 1939, Kertzer shows the pope besieged – seemingly, almost daily – with entreaties for him to use the moral authority of his position to speak out on behalf of the victims of Nazi atrocities, Jews and non-Jews alike and, in case after case, declining to do so.  The Vatican’s interest in Jews under Pius XII was confined to exempting those who had converted to Christianity from anti-Semitic actions undertaken in both Germany and Italy.  Otherwise, the pope’s priorities were elsewhere, including protecting Catholics in Nazi-occupied countries and shielding Rome from Allied bombing.

Pius XII was no fan of Hitler.  He was alarmed by the efforts of Hitler’s regime to “weaken the church’s influence, diminish its hold on youth, and discredit key aspects of its theology” (p.479).   But the pope also viewed Germany as Europe’s “strongest bulwark against what he regarded as the church’s greatest enemy, Communism” (p.xxxvi).  Above all, Kertzer writes, the pope’s highest priority was to “safeguard the church and thereby protect its God-given mission of saving souls” (p.34).

Kertzer’s account is also a story of how Italy experienced the war years, with the pope’s relationship to Mussolini figuring as much in the story as his relationship to Hitler.  In this sense his work constitutes a natural sequel to The Pope and Mussolini: The Secret History of Pius XI and the Rise of Fascism in Europe, reviewed here in 2016.  There, Kertzer captured the improbable but mutually beneficial relationship between Mussolini and Pacelli’s predecessor, Pius XI.   The Vatican under Pius XI benefitted enormously when Mussolini’s Fascist regime reinstated the church’s privileged position within Italian society.  Moreover, the  Vatican looked upon the Fascist party as the only force that could preserve order in Italy by serving as a bulwark against the existential threat of Soviet communism.  Vatican support under Pius XI in turn played a major role in legitimizing Mussolini’s explicitly anti-democratic regime and allowing it to consolidate power.

The symbiotic relationship between Mussolini and the pope continued under Pius XII.  As Europe lurched closer to war in 1939, the newly elected pope was “committed to maintaining the church’s mutually beneficial collaboration with Italy’s Fascist government and was eager to reach an understanding with Nazi Germany,” Kertzer writes.  The pope considered Mussolini his “best bet” for exercising a “moderating influence” (p.474) on Hitler, who seemingly was bound to determine Europe’s fate and that of the church.   The pope was particularly shaken by rumors that Germany intended to oust him from the Vatican once victory was obtained, or even do away with the Vatican altogether.  He was therefore determined not to present himself as an enemy of Germany.

Kertzer shows Pius XII entertaining doubts about the wisdom of Italy’s belated entry into the war in June 1940, just as France was about to fall.  But Mussolini had surprisingly strong support throughout the church hierarchy in Italy, along with solid initial backing from the Italian people, and the pope’s misgivings over Italy’s decision to go to war on Hitler’s side “largely evaporated” (p.156).  While trying to maintain a public stance of neutrality, the pope did nothing to “discourage Italy’s most prominent churchmen from giving their vocal support to the Axis war” (p.208).

* * *

In 1938, the year before Pacelli became pope, Mussolini’s government instituted the Italian “racial laws,” harsh measures aimed at turning Italy’s relatively small Jewish community into second-class citizens.  Non-Italian Jews were ordered to leave the country.  Jewish children were barred from the nation’s schools, and Jewish schoolteachers and university professors were dismissed.  Jews were also barred from the military and civil service, from working in banks or insurance companies, from owning large businesses or farms, and from employing Christian household help.  No criticism of the Italian racial laws “would ever escape the pope’s lips or pen,” Kertzer writes, “not in 1939, nor over the following years in which they were in force” (p.472).  But while the pope “offered no public sign of displeasure with the anti-Jewish campaign generally,” he lobbied strenuously for “exemptions on behalf of Catholics who had formerly been Jews or were the children of Jews” (p.54).

Through his access to recently released materials, Kertzer sheds much light upon secret negotiations between the Vatican and Nazi Germany that began in August of 1939, as Hitler prepared to invade Poland.  Hitler’s negotiator was Prince Philipp von Hessen, not only the son-in-law of Italian King Victor Emmanuel but also a man who, Kertzer indicates, stood second only to Albert Speer as a Hitler favorite.  Von Hessen was bargaining for the Vatican and the Catholic Church to stay out of politics.   The negotiations continued after the German invasion of Poland.  While the pope remained, as Kertzer puts it, “eager to reach an understanding with Hitler, at the same time he wanted the Führer to know that any agreement depended on a change of those German policies that had harmed the church” (p.91).

The negotiations went nowhere.  As far as we know, there was never an explicit offer on the table under which the pope would give Nazi Germany a free hand to deal with what was euphemistically termed the “Jewish question” as it saw fit in return for the Vatican’s priorities: genuinely deferential treatment of German Catholics, a strict hands-off policy toward the Catholic Church, or guaranteed exemptions for Jews who had converted to Christianity.  But one shudders to think what the pope’s reaction might have been had von Hessen put any of these offers on the table.

In October 1941, more than a year after German aggression had extended west to the Netherlands, Belgium, Luxembourg, and France, Pius XII received one of the first “unmistakably credible accounts of the massacre of Europe’s Jews” (p.214).  In November 1941, the pope learned in even greater detail about the unfolding mass murder of Europe’s Jews in Eastern Europe.   By January 1942, reports of Hitler’s campaign to exterminate Europe’s Jews were coming regularly, not only from Jewish and press sources but also from “churchmen in whom he would have complete faith” (p.224).  At least from that point onward, Kertzer writes, the pope was “well aware of the fate that awaited the Jews being deported to Nazi death camps” (p.276).  Yet, he continued to resist pressures to intervene publicly, arguing repeatedly that his words would “hold little sway with the Germans and any papal criticism risked provoking a backlash against the church in German-occupied Europe” (p.276).

Pius XII hoped his annual Christmas address in 1942 would dispel concerns about his failure to speak out against Nazi atrocities.  The pope’s speeches were typically lengthy, Kertzer writes, with a “level of abstraction and abstruse ecclesiastical language that flew over the heads of all but the most erudite church intellectuals” (p.188).  His 1942 Christmas address was also filled with “words that both sides of the war could interpret as supporting their cause” (p.260).  Although he nowhere mentioned Nazis or Jews, the pope lamented on page 24 of his prepared text the fate of the “hundreds of thousands of people who, through no fault of their own and solely because of the nation or their race, have been condemned to death or progressive extinction” (p.260).  When the Polish Ambassador respectfully requested that he articulate more definitively his opposition to Nazi atrocities, the pope responded that he had already done exactly that and was hurt that he had not received a word of thanks.   Catholics in Poland would pay a heavy price if he spoke out more directly, he added.

Another preoccupation of the pope came to the fore when the Allies began bombing Turin and other major industrial cities shortly after Italy entered the war in June 1940: that the Allies refrain from bombing Rome.  Then, as Hitler’s war fortunes reversed from late 1942 onward and Italian public opinion turned against the war after an unrelieved succession of military disasters, Kertzer cites repeated instances of the pope setting aside his earlier concerns about the church under an all-powerful Nazi regime and becoming “increasingly worried about the impact of a German defeat” (p.269), with a “new fear, the fate of the church following a victory of the Soviet Union” (p.474).

When the Fascist regime fell in the summer of 1943, Italy’s racial laws were up for revision and the Vatican deemed it opportune to press again its point that the laws were being unfairly applied to Jews who had been baptized, along with baptized children of “mixed marriages.”  A Vatican memo made clear that this was the only change it was pushing for, emphasizing that it was not seeking the revocation of all anti-Jewish laws.   Then, in the fall of 1943, with Germany occupying Rome, Nazi SS leader Heinrich Himmler ordered all Roman Jews “transferred.”

Approximately 1,250 Jews were arrested in the roundup operation that followed and held at a detention center only a few hundred meters from the Vatican.  Rumors that the pope had blessed the operation circulated.  There is no evidence supporting the rumors but, as Kertzer observes, the pope “had never spoken out either against Italy’s own racial laws or against the Nazis’ systematic murder of the Jews,” thereby allowing such stories “to be spread among the SS and German troops” (p.363).  Vatican intervention did lead to the release of some former Jews who had been baptized and some Jews married to Christians.

The remaining Jews, including over 100 children under age five, were loaded into trucks and taken to the Rome train station.  The pope by then “could have no doubt about the fate that awaited the Jews” (p.363), Kertzer writes.  Yet, despite numerous pleas to intervene, the pope’s response appears to have been confined to alerting the German authorities that, despite earlier efforts, “some with bona-fide Catholic credentials had been among those forced onto the train” (p.369).  The train that left Rome that October day in 1943 arrived at Auschwitz a week later.  All but sixteen of the Italian Jews on the train perished.

Behind the scenes, Germany’s Ambassador to the Vatican was greatly relieved to be able to inform his superiors in Berlin that the pope had decided not to say anything about the roundup of Rome’s Jews.  The pope had done “everything possible not to strain relations with the German government and the German authorities in Rome” (p.370), the Ambassador reported approvingly.  He went on to note that an oblique reference to the roundup in the Vatican’s newspaper was worded in such a way that it would be “understood by very few people as a specific reference to the Jewish question” (p.371).

It was not until June 1945, after Mussolini’s execution by Italian partisans and Nazi Germany’s capitulation to the Allies, that the pope addressed National Socialism directly.  He highlighted the suffering of Catholics and the Catholic Church during the war and, as Kertzer puts it, “represented the Catholics in Germany as the Nazis’ victims” (p.460).  Yet, the pope “made not even the briefest mention, indeed no mention at all, of the Nazis’ extermination of Europe’s Jews.  If any Jews had been in those concentration camps alongside the valorous Catholic priests and lay Catholics, one would not know it from the pope’s speech” (p.460).  The pope also used his 1945 Christmas Day message to denounce the dangers of totalitarianism.   But, as the British Ambassador to the Vatican pointed out, the pope “waited to denounce totalitarian states until the only one left was the Soviet Union” (p.466).

Implicitly but hardly subtly, Kertzer makes perhaps his most telling point about papal priorities by noting that in contrast to any condemnation of Hitler or the Nazis in the war years, Pius XII had “no trouble condemning the dangers of immorality in the areas of women’s fashion, sport, hygiene, social relations and entertainment” (p.201).  The pope was particularly scandalized by instances of women’s immodest dress, along with inappropriate dancing, theater, books and magazines.  He and the Vatican’s moral arbiters also excoriated Italy’s popular variety shows, “to which large numbers were streaming as a diversion from the rigors of wartime life” (p.254).

* * *

Only in the book’s final pages does Kertzer render an explicit verdict on Pius XII.  There is a good case, Kertzer writes, that the pope was successful in “protecting the institutional interests of the church at a time of war” and, he adds with a hint of irony, equally successful in adhering firmly to his “determination to do nothing to antagonize” (p.480) either Hitler or Mussolini.  As a “moral leader,” however, Pius XII “must be judged a failure” (p.480), Kertzer concludes.  What Kertzer makes clear, but does not need to say in light of the mountain of evidence he presents, is that Jews and saving Jewish lives ranked far from the top of Pius XII’s priorities.

 

Thomas H. Peebles

Paris, France

May 19, 2024


Filed under European History, German History, History, Italian History

The Need to Enlighten the Woke

Susan Neiman, Left is Not Woke (Polity Press, 2023)

In Left is Not Woke, Susan Neiman wades into the ongoing debate on both sides of the Atlantic over the concept of “wokeness.”  Her primary objective in this slim volume is to rescue the thinking of the 18th century Enlightenment from what she sees as an ill-considered attack coming from the “woke left,” which she defines as the “far left, or the radical left”  (p.3).  An American scholar with academic training in philosophy and intellectual history, Neiman has lived mostly in Germany over the last quarter-century, where today she is director of the Einstein Forum, a German think tank located in Potsdam, outside Berlin.  She is the author of Learning from the Germans: Race and the Memory of Evil, reviewed here in 2020, a thought-provoking work examining German and American experiences with historical memory and what she calls “comparative atonement,” the differing ways in which the two countries have tried to come to terms with two of the darkest chapters in their history, the Holocaust in Germany and the United States’ years of chattel slavery and racial discrimination.

The term “woke,” Neiman reminds us, owes its origins to a 1938 song, “Scottsboro Boys,” by the renowned blues singer Lead Belly (Huddie William Ledbetter), written in defense of nine black teenagers who were falsely accused in the 1930s of raping a white woman.  Staying woke in Lead Belly’s song meant staying awake, always watching for signs of discrimination.  In this sense, we should all be woke.

Unfortunately, as she wades into the debate over wokeness, Neiman stays in the shallow end.  Her case against woke thinking is largely undocumented, with few contemporary examples of that thinking.  At a time when some right-wing politicians seem to use “woke” as a shorthand for much that is objectionable to conservatives, Neiman passes up an opportunity to add clarity to the term.

Alarmed by the rise of anti-democratic authoritarianism across the globe, not least in the United States and Germany, Neiman appears to be warning the political left that it needs to get its own house in order if it wants to curb the world’s rising authoritarianism.  While the anti-democratic right may be more dangerous, by denigrating the Enlightenment the woke left has “deprived itself of the ideas we need if we hope to resist the lurch to the right” (p.3).  Ideas that can be traced directly to the Enlightenment provide “much stronger conceptions of progress, justice, and solidarity than those which are dominant today,” (p.109), she writes.

Neiman highlights three such ideas that she contends have been denigrated and discarded by the woke left, all flowing from the Enlightenment, which she hopes to resuscitate: a commitment to universalism over tribalism, a belief in the possibility of progress, and an insistence upon justice as something other than a mere lever of power. To the woke left, these interconnected ideas are “Eurocentric,” little more than thinly veiled justifications for European colonialism in the 19th and 20th centuries and its contemporary vestiges, particularly the racism that accompanied colonialism.

Neiman traces the foundation for the woke attack on the Enlightenment to two 20th century thinkers, French philosopher Michel Foucault (1926-1984) and German jurist Carl Schmitt (1888-1985).  Neiman describes Foucault as a radical with a gloomy and insidiously reactionary message: everything that we might point to as a sign of progress is, on closer examination, a manifestation of “more sinister forms of repression . . . ways in which the state extends its domination over our lives” (p.93).  Poking holes in society’s major institutions was Foucault’s favorite pastime, and he found almost all of them akin to prisons, about which he wrote much.  Foucault proffered few solutions for the pervasive oppression he perceived everywhere.   His relentless deconstruction may have had no other purpose than “subversion as an art form” (p.71), Neiman surmises.

Schmitt shared with Foucault a “hostility toward liberalism” and a “commitment to unmasking liberal hypocrisies”  (p.71).  He became a Nazi party member when Hitler came to power in 1933 and earned the moniker “Crown Jurist” of National Socialism.  Although banned from teaching after World War II, Schmitt continued to write prolifically up to his death in 1985 at age 96, producing searing critiques of the Enlightenment, liberalism and parliamentary democracy which emphasized the importance of will rather than right in organizing society.   Much like “Nietzsche on a bad day” as Neiman colorfully puts it, Schmitt’s ideas add up to a “political theory for war” (p.75).  It is easy to see the influence of Foucault in academic circles.  I for one was surprised to learn that Schmitt enjoys similar stature in some of the same circles.

But Foucault and Schmitt have been gone since the mid-1980s. Neiman notes that many of those now teaching in universities were students themselves when Foucault became the “bedrock of left-wing thought,” the “one philosopher read by anyone who isn’t a philosopher”  (p.63).  These present-day teachers “pass on the texts they learned as exciting new classics.”  But herein lies the main deficiency that runs through Neiman’s analysis: we learn next-to-nothing about who those present-day teachers in the thrall of Foucault and Schmitt are, or the specifics of their case against the Enlightenment. Nonetheless, Neiman defends the Enlightenment and its relevance to present day political aims with passion and eloquence.

* * *

Neiman’s primary argument is that writing off the Enlightenment as “Eurocentric” entails rejecting universalism, arguably its central tenet, in favor of what she terms “tribalism,” sometimes referred to as “identity politics.”  The contemporary woke left considers Enlightenment universalism a sham, “invented to disguise Eurocentric views that supported colonialism” (p.31)  – a “fake universalism” that some cultures use to impose themselves on others “in the name of an abstract humanity that turns out to reflect just a dominant culture’s time, place, and interests” (p.23), as Neiman phrases it.

More than “simply ungrounded,” the arguments that the Enlightenment was Eurocentric turn its notion of universalism “upside down” (p.32).  The Enlightenment was pathbreaking in “rejecting Eurocentrism and urging Europeans to examine themselves from the perspective of the rest of the world” (p.37), Neiman argues.  Enlightenment thinkers insisted that everyone is “endowed with innate dignity that demands respect”  (p.33-34).  Voltaire’s Candide, a “succinct diatribe against fanaticism, slavery, colonial plunder, and other European evils” (p.32), is the most accessible of many 18th century Enlightenment texts espousing universalism.  Montesquieu among others insisted that we learn to view the world from the perspective of non-Europeans.  Diderot criticized repressive European sexual laws from the (more enlightened) perspective of Hurons and Tahitians.  Unlike the great religions, which expressed similar thoughts, Enlightenment thinkers based their universalist views on reason, not revelation.

Those who make the “bewildering” claim that the Enlightenment was Eurocentric, Neiman argues, confuse the historical realities of the 18th century with the Enlightenment thinkers who “fought to change them – often at considerable personal risk”  (p.36).  This was part of a strategy in which Enlightenment writers put their own thoughts in the “mouths of imagined non-Europeans in order to avoid the persecution they would otherwise face for voicing them” (p.36-37).  Nearly all the canonical Enlightenment texts were “banned, burned, or published anonymously.  However different they were, all were seen to threaten established authority in the name of universal principles available to anyone in any culture” (p.40).

To be sure, there were gaps in Europeans’ knowledge of non-Europeans, but the best Enlightenment thinkers,  aware of the limits of their knowledge, “urged caution and skepticism in reading empirical descriptions of non-European peoples” (p.39).  In 1754, Rousseau criticized the “new collections of travels and reports” which Europeans had amassed as they ventured to other parts of the world, pretending to “judge mankind” but “more interested in filling their purses than their heads”  (p.39).  Diderot warned against making judgements about China without a more thorough knowledge of its language and literature.  Kant pointed out the difficulty of drawing conclusions from contradictory ethnographic accounts of non-European peoples.

Yet the same thinkers recognized that differences between people and cultures still matter.  The universalism that the Enlightenment bequeathed to future generations came with a “robust assurance that cultural pluralism is not an alternative to universalism but an enhancement of it” (p.46-47).  Enlightenment universalism rests on the conviction that “behind all the differences of time and space that separate us,” human beings are “deeply connected in a wealth of ways” (p.11), Neiman writes. “Individual histories and cultures put flesh on the bones of abstract humanity” (p.34), she adds.  The best type of universalism is that “learned with and through difference” (p.56).

But Enlightenment universalism has now given way to tribalism, which Neiman describes as the “civil breakdown that occurs when people, of whatever kind, see the fundamental human difference as that between our kind and everyone else” (p.20).  The left-wing turn to tribalism is “particularly tragic because the early civil rights and anti-colonialist movements resolutely opposed tribal thinking in all its forms” (p.26), Neiman argues.  “We do things for other reasons than being members of a tribe” (p.7).  Demographics cannot be the whole answer in explaining people’s values.

Neiman seizes upon a 2019 article in the New York Times to illustrate how tribalism has seeped into mainstream liberalism today.  The article referred to the Indian ancestry of Vice President Kamala Harris (who was born in the United States), and noted that despite her Indian roots, the Biden administration “may prove less forgiving over [Indian Prime Minister Narendra] Modi’s Hindu nationalist agenda” (p.6).  This carelessly written sentence seems to presume that the Vice President’s ethnic background would normally determine administration policy.  Appearing in the New York Times, sometimes called the United States’ “paper of record” but one which Neiman describes as “increasingly, demonstratively woke” (p.6), the sentence may illustrate the pervasive nature of tribalist assumptions in today’s United States.

Neiman also seeks to refute the idea that the key Enlightenment thinkers entertained a naïve belief in the inevitability of progress in human affairs.  With few exceptions, Enlightenment thinkers viewed progress as possible but far from inevitable, she argues.  Their views were the “very opposite of the views ascribed to them today. Over and over, they proclaim that progress is (just barely) possible; their passionate engagement with the evils of their day precludes any belief that progress is assured.  Still, they never stopped working at it” (p.103).  The religious doctrine of original sin, which dominated the institutions of the Christian world in which Enlightenment thinkers operated, shaped the Enlightenment view of the potential for human progress, Neiman emphasizes.

In its many variations, the doctrine of original sin taught that human beings are born inherently sinful, and change could “only come through the hand of God” (p.104).   Rejection of the doctrine was a crucial element of Enlightenment thinking.  Voltaire, Rousseau, Kant, and others strove “not to defend a utopian view that we are all naturally good, but to attack a Christian view that we are all naturally evil”  (p.104).   Voltaire wrote in his Philosophical Dictionary that rather than being born evil, man “becomes evil, as he becomes sick” (p.103; although Neiman notes that Voltaire once quipped that original sin was the only theological doctrine supported by evidence).  In the aftermath of the Enlightenment, the idea that progress is possible has constituted another deep difference between the political left and right.

To stand on the left was once to “stand behind the idea that people can work together to make significant improvements in the real conditions of their own and others’ lives” (p.93), Neiman writes. The United States has bridged racial gaps significantly since the time when slavery was enshrined in law, even if that progress has been largely incremental, with a few steps forward, other steps in the wrong direction, and plenty of room for continued progress.  But to suggest that racism has hardly changed in a century “dishonors the memory of those who struggled to change it” (p.113), she argues.

Going forward, Neiman urges the political left to embrace the Enlightenment notion of progress as always possible, even in the bleakest moments.  If we give up on the prospect of progress, politics becomes “nothing but a struggle for power” (p.93), she writes.  The view of politics as merely a struggle for power is not only another article of faith for today’s woke left; it also neatly encapsulates Foucault’s perspective.  To deflate the notion, Neiman takes on Foucault directly rather than relying upon Enlightenment intermediaries.

In Foucault’s world, and that of Schmitt, power is “only vaguely tied to the actions of particular humans in particular institutions” and is the “driving force of everything” (p.63).  This view can be traced back to the ancient Sophists, for whom might made right, a position that “amounts to no concept of right at all” (p.78).  Foucault belittled the traditional notion of justice as a means of rewarding people according to their merits and punishing them according to their faults.

In writing about prisons, his metaphor for all societal institutions, Foucault came close to denying any social or moral distinction between innocence and guilt.  In one interview, he took the position that the distinction was irrelevant.  This is an over-the-top contention for Neiman, one that should place Foucault and those who purport to be his heirs outside the realm of serious democratic debate.  Denying the moral distinction between innocence and guilt “denies the possibility of moral distinctions at all” (p.65).  When a democracy gives up on moral distinctions — between guilt and innocence, between justice and power – it is on a glide path to fascism, she contends.

* * *

While there are undoubtedly contemporaries on today’s political left who share Foucault’s instrumental view of justice, such persons and their views do not receive a hearing here. Nor do those who reject the universalism of the Enlightenment or its guarded view of progress as ever possible.  Despite a robust defense of the Enlightenment and spirited exhortations to contemporary progressives, these absences render Neiman’s case against the woke left more frustrating than enlightening.

Thomas H. Peebles

Paris, France

April 28, 2024

Filed under Intellectual History, Political Theory

Is the American Constitution Dead?

Erwin Chemerinsky, Worse Than Nothing:

The Dangerous Fallacy of Originalism

(Yale University Press)

In 2010, the United States Supreme Court heard oral arguments in a case involving the constitutionality of a California law that prohibited the sale or rental of violent video games to minors without parental consent.  California legislators, shocked by some games’ graphic violence, credited studies that showed a correlation between such games and violent behavior, a correlation that did not exist for comic books, television programs, or movies.  The law was challenged as a violation of the free speech clause of the First Amendment to the United States Constitution.  At the oral argument before the Court, the late Justice Antonin Scalia pressed the lawyer for the State of California to explain whether the law could be reconciled with the original understanding of the First Amendment, an amendment which became part of the constitution in 1791.  Justice Samuel Alito then jumped into the exchange, trying to help the lawyer: “I think what Justice Scalia wants to know is what James Madison thought about video games” (p.115).

Justice Scalia’s interest in the understanding of the First Amendment in 1791 and Justice Alito’s odd intervention highlighted an approach to constitutional interpretation termed “originalism,” an approach which maintains that all such interpretation should center upon the original understanding of the constitutional provision at issue.  As Erwin Chemerinsky, currently dean of the law school at the University of California, Berkeley, puts it in Worse Than Nothing: The Dangerous Fallacy of Originalism, the central belief of originalism is that the meaning of a constitutional provision is “fixed when it is adopted and can be changed only by amendment” (p.14).

To Chemerinsky, the exchange at the highest court in the United States involving James Madison’s view of video games demonstrates the absurdity of the originalist approach.  Our world today, he notes, is “vastly different from that which existed at the nation’s beginning. There are countless questions for which originalism can provide no answer” (p.116; in 2011, the Court found the California video games law unconstitutional, with Justice Scalia writing an opinion that, in Chemerinsky’s view, had little to do with originalism).  While some of the country’s leading constitutional scholars have advanced sophisticated academic critiques of the originalist approach to constitutional interpretation, Worse Than Nothing appears to be the first book on the subject written for general readers, easily comprehensible to those who are neither lawyers nor constitutional scholars.

For its proponents, the originalist approach is the only theory of constitutional interpretation consistent with majoritarian democracy.  It is neutral and value free, they contend, restraining judicial decision-makers from imposing their personal values under the guise of interpreting constitutional provisions.  Originalism alone prevents unelected judges and justices, who enjoy life tenure, from functioning as an unelected super-legislature.  Chemerinsky dismisses these justifications as high-level sophistry.  Constitutional decision-making based on the original understanding is “only a fig leaf allowing a justice to pretend to adhere to a neutral method while advancing a conservative political ideology” (p.166), he writes.

If there is an undemocratic aspect to unelected justices disallowing laws and acts of government enacted through the political process, a point Chemerinsky concedes, at least those justices die or retire, with replacements appointed by elected officials. How much more undemocratic is it, he asks, “if society is governed by past majorities who cannot be overruled and are never replaced?” (p.78).  Why, moreover, should we “reject all the wisdom and experience gained since a constitutional provision was adopted?  It is hard to fathom why one would prefer such ignorance” (p.169).

Chemerinsky traces the modern version of originalism to Robert Bork, who in 1971 argued in a famous Indiana Law Journal article that the constitution did not recognize a right of privacy, and the Court was wrong to protect it (two years later, in Roe v. Wade, the Court extended the right to privacy to include a woman’s decision to terminate her pregnancy).  When President Ronald Reagan subsequently nominated Bork for the Supreme Court in 1987, the Senate rejected the nomination after a series of contentious confirmation hearings before the Senate Judiciary Committee, presided over by 44-year-old Senator Joe Biden.  In these hearings, Bork’s commitment to originalism was the primary basis for the rejection of his nomination.

Chemerinsky credits Bork with being far more honest about his judicial approach than more recent nominees, but he was unable to escape the clear position he had articulated in his 1971 article.  He failed to win over a majority of senators “not because his positions were mischaracterized but precisely because he had set them out so clearly.  The senators saw his originalist views as too dangerous for constitutional rights” (p.6), Chemerinsky writes.

Today, three members of the Court are explicit proponents of originalism, Justices Thomas, Gorsuch, and Barrett, with Justice Alito and Chief Justice Roberts also leaning in that direction.  Numerous lower court judges, especially those appointed by former President Donald Trump, also consider themselves originalists.  A fringe theory that started out primarily as a vehicle to criticize some of the Warren Court’s decisions of the 1950s and 1960s has become decidedly mainstream.  But for Chemerinsky, that only increases its danger.

The Supreme Court’s 2022 decision in Dobbs v. Jackson Women’s Health Organization, overruling Roe v. Wade on an originalist basis, paves the way for erasing other widely accepted rights that are not enumerated in the constitution’s text, Chemerinsky contends, including the right to marry, to control the upbringing of one’s children, to purchase and use contraceptives, and to engage in same-sex sexual activity.  He is also troubled by several recent cases providing a questionable originalist interpretation of the First Amendment’s religion clauses, which aim to avoid the establishment of religion and protect its free exercise.  These cases see “no constitutional limit on making religion a part of government activities, such as through prayer or religious symbols, provided that there is no coercion” (p.199), and have allowed people to claim exemption from the general application of civil rights law on the basis of their religion – for example, an exemption from laws outlawing anti-gay discrimination for a web designer who contended that being required to make wedding websites for gay couples infringed his religious freedom.  The Court’s application of originalism in these and numerous other cases should make us “very afraid” of where that approach will lead, Chemerinsky warns, “afraid for the future of constitutional rights and equality” (p.xiii).

Chemerinsky’s title can be traced to Justice Scalia, the first proponent of originalism to reach the Supreme Court in the modern era.  A playful linguist who could add light touches to the most serious arguments, Scalia was aware of the many inconsistencies in originalism that academics had underscored.  He was fond of retorting that at least originalism was a theory, whereas non-originalists had no counter-theory – flawed though it may be, originalism was still better than nothing.  Chemerinsky seeks to persuade his readers that contrary to Justice Scalia, originalism is “worse than nothing,” far worse.  Even readers not convinced that he establishes this overall point are likely to admit that he makes some formidable arguments along the way, particularly when he highlights a host of conceptual anomalies that undermine the originalist theory.

* * *

The most glaring anomaly in the originalist theory is what Chemerinsky terms the “coherence problem,” the simple fact that what is commonly termed “judicial review” is not provided for in the United States constitution.  The constitution contains no authority for the judiciary to determine the constitutionality of the acts of other branches of the federal government or of the individual states.  A commitment to originalism should therefore require abandoning judicial review altogether, Chemerinsky argues.  “It is incoherent to seek the original meaning for how the courts should exercise a power when there is no indication that the original meaning of the Constitution was ever to give them that power” (p.77).  Of course, American courts have been practicing judicial review since the landmark 1803 decision Marbury v. Madison established the practice, and no originalist has argued that courts should abandon judicial review altogether.

But the coherence problem has a second dimension: there is no evidence that Article III of the Constitution, which establishes the judicial branch of the United States government and delineates the scope of federal court jurisdiction, “included the understanding that courts should interpret the Constitution based on its original meaning” (p.82).  To the contrary, evidence from the period suggests that “constitutional interpretation was never intended to be a quest for original understanding” (p.86).  There is, Chemerinsky argues, “every reason to believe that the original meaning of the Constitution, if it included judicial review at all, did not embrace originalism as the method for interpreting the document” (p.84).  Chemerinsky cites several constitutional clauses written in broad, open-ended language (“speech,” “taking” of property, “cruel and unusual punishment,” “due process of law”).  There is strong evidence that these provisions were intended to “gain meaning over time” (p.86), he contends.  Jefferson, for one, insisted that constitutions should not be regarded with “sanctimonious reverence” and that law and institutions must develop “hand in hand with the progress of the human mind” (p.86).

Beyond these threshold coherence problems lies what Chemerinsky terms the “inherent epistemological problem”: in most cases, there is no single “original understanding” to be discovered.  “Rarely does any constitutional provision have a clear original meaning or provide an unequivocal basis for deciding a specific case” (p.181), he observes.  Instead, there is almost always a “range of possibilities that allows for exactly the judicial discretion that originalism seeks to eliminate” (p.xi).  Justices of any stripe can “pick and choose the sources that support the conclusion they want, and then declare that that is a constitutional provision’s original meaning” (p.63).

Then there is the “repugnancy problem,” which Chemerinsky considers the gravest deficiency of originalism: the original understanding, when it is discernible, often leads to abhorrent results. The most glaring example is racial segregation under the equal protection clause of the 14th amendment.  The evidence is overwhelming that those who proposed and ratified the amendment did not understand that clause as outlawing racial segregation.  Serious originalism should lead to only one result, that racial segregation is constitutional and the decisions invalidating it have been wrongly decided, starting with the landmark Brown v. Board of Education decision in 1954  that proscribed racial segregation in public schools.  Thankfully, no one on the Supreme Court and none of the academic advocates of originalism have come out publicly for overruling the Brown decision.

Some originalists have justified Brown and other decisions which do not comport with a strict application of the original understanding by defining the understanding in a more general and abstract way.  One academic promotes “inclusive originalism,” which allows judges to consider precedent, policy, or practice but “only to the extent that the original meaning incorporates or permits them” (p.41).  Another espoused “living originalism,” in which each generation must decide “how to make sense of the Constitution’s words and principles” (p.41).   While these two versions of originalism allow the meaning of the Constitution to change over time, they are far from the originalism of Justices Scalia and Thomas, and in Chemerinsky’s view render originalism indistinguishable from non-originalism.  Defining the original understanding in an abstract and general way can justify “literally any result and does nothing to restrain judges from ruling according to their personal convictions” (p.105), he argues.  This is the “central paradox of originalism”:  there is “no middle ground.”  Either originalism “constrains at the price of unacceptable outcomes, or it offers no constraints and is not really originalism at all” (p.11).

Chemerinsky goes on to highlight how originalists often abandon their theory altogether when it does not yield the result they want.  This “hypocrisy problem” undercuts any claim that originalism “actually constrains judging and suggests instead that it is not a theory of judging at all but only a rhetorical ploy to make it appear that decisions are based on something other than political ideology.” (p.139).  Chemerinsky devotes an entire chapter to cases where conservative ideological positions were “clear and strongly held” (p.140), and that ideology, not the original meaning, appeared to control the originalist justices’ rulings.

In Shelby County v. Holder, the Court held unconstitutional a critical section of the Voting Rights Act of 1965 requiring jurisdictions with a history of voting discrimination to obtain preclearance from the Department of Justice or a federal court before making changes to voting procedures. The Court found that the pre-clearance provision violated the principle of “equal state sovereignty,” a principle which the Court “invented” that “appears nowhere in the text and is contrary to the original understanding of the Fourteenth Amendment” (p.147).   In Citizens United v. Federal Election Commission, the Court held that the free speech clause of the First Amendment protects the right of corporations to spend unlimited amounts of money to support or oppose candidates in election campaigns, a “risible” interpretation of the original understanding of that amendment.  “Those who drafted and ratified the amendment could not have imagined campaign spending as it exists in the twenty-first century, let alone the wealth of modern corporations and their ability to spend that wealth to influence elections” (p.157), Chemerinsky writes.

To avoid the anomalies and hypocrisies of originalism, Chemerinsky advocates what he terms “non-originalism.” Rather than an identifiable school of thought, “non-originalism” is best thought of as a shorthand for the way most justices not swayed by originalism have approached constitutional interpretation.  Non-originalism does not exclude examining original understanding, to the extent it can be ascertained, but only rarely does the inquiry stop there.  Additional sources non-originalist judges look to include the constitution’s structure, relevant precedents, traditions, modern social needs and, occasionally, foreign laws and practices.  Chemerinsky considers the expression “living constitution” a useful and evocative way to describe a document whose meaning “changes over time as it is interpreted in specific cases” (p.15).

But the expression “living constitution” is a lightning rod for judicial conservatives.  Justice Scalia liked to counter by saying that the Constitution was “dead, dead, dead” (p.15).  If originalism produces outcomes unacceptable in the modern age, the originalist answer is to point to the constitutional amendment process as the obvious means to rectify unpopular results.  In 1895, for example, the Supreme Court declared a federal income tax unconstitutional, a decision that was in effect overruled with the ratification of the 16th amendment to the constitution in 1913.

Amending the United States Constitution is an arduous process, requiring approval by two-thirds of both houses of Congress and then ratification by the legislatures of three quarters of the states, or the calling of a new convention by Congress on the application of two thirds of the states.  But the problem with relying upon the amendment process is not that it is inconveniently slow and difficult.   Relying upon the amendment process means that the rights of individuals and the protection of minorities from discrimination depend upon the action of a supermajority. The amendment process is an inadequate answer when it comes to “balancing the majority’s values against the values that should be protected from society’s majorities” (p.207), Chemerinsky writes.  Minority rights “should not depend on the willingness of a supermajority to enact an amendment” (p.179).

Balancing of values is an inescapable judicial function which no justice or judge can avoid, Chemerinsky contends, rendering the originalist’s desire for value-neutral judging an “impossible quest.”  A justice or judge’s ideology and life experiences “inevitably determine how he or she – or anyone interpreting the Constitution – strikes the balance” (p.71).  Originalism “only allows conservative justices and judges to pretend that they are following a neutral theory when in reality they are imposing their own values” (p.xi).

* * *

Chemerinsky has no illusions that his lucid, passionately argued work will affect the originalist justices on the Supreme Court and their supporters. But in the court of public opinion, he has presented a remarkably persuasive case.

Thomas H. Peebles

Paris, France

February 17, 2024

Filed under American Politics, American Society, Rule of Law

A Decade of Radical Republican Reconstruction

 

Nicole Hemmer, Partisans:

The Conservative Revolutionaries Who Remade Politics in the 1990s

(Basic Books)

And

Jeffrey Toobin, Homegrown:

Timothy McVeigh and the Rise of Right-Wing Extremism

(Simon & Schuster)

In the United States’ 2020 presidential election, 74 million Americans voted for the incumbent president, Republican Donald Trump, the second highest total ever for a presidential candidate (his opponent Joe Biden set the record with 81 million votes).  Today, the same Trump appears to be on the cusp of receiving the Republican party’s nomination for a third run for the presidency.  He will carry some exceptionally heavy baggage with him throughout the campaign, including criminal indictments in four different jurisdictions, a civil suit that could dismantle his business empire, another civil suit for defamation with a crippling judgement entered against him, and disqualification from the ballot in two American states on the ground that he engaged in an insurrection against the United States on January 6, 2021.  Yet, polls at this still early stage indicate that the former president enjoys at least an even chance of recapturing the White House in the 2024 general election.

What principles were those 74 million Americans voting for when they cast their ballots in 2020?   The Republican party issued no platform that year from which its position on policy issues might be gleaned.  Is today’s Republican party “conservative” in the sense that both Barry Goldwater, the Republican presidential nominee in 1964, and Ronald Reagan, president from 1981 to 1989, used that term?  Has what was once affectionately termed the “party of Lincoln” evolved into a cult operation centered around former and possibly future President Trump, and if so, how did that happen?  Works seeking to provide responses to these and related questions are sufficient to fill a midsize library.  Two recent, representative, and highly illuminating examples are Nicole Hemmer’s Partisans: The Conservative Revolutionaries Who Remade Politics in the 1990s and Jeffrey Toobin’s Homegrown: Timothy McVeigh and the Rise of Right-Wing Extremism.  

In different but complementary ways, each finds the foundation for today’s Republican party in the politics of the 1990s, a decade when Republican George Herbert Walker Bush and after him Democrat Bill Clinton occupied the White House.  The 1990s saw the end of the Cold War struggle between the Soviet Union and the United States, a struggle that had been what Hemmer terms the “central organizing theme” (NH, p.7) of American conservatism, providing glue that held together its various components.  While the internet was in its infancy in the 1990s and today’s social media platforms did not yet exist, the decade saw the media becoming increasingly diverse and influential in politics.  Ranging widely among these broader currents, Hemmer depicts a cantankerous right wing within the Republican party that repudiated Ronald Reagan’s sunny, upbeat approach to politics in favor of what she terms the “politics of resentment” (NH, p.47), a much harder-edged version of conservatism.

The version that emerged in the 1990s leaned into the “coarseness of American culture and brought it into politics,” Hemmer writes, blurring the lines between the party’s mainstream and a “violent far right that was particularly active in the 1990s and was split between acts of domestic terrorism and efforts to worm its way into mainstream politics”  (NH, 13-14).  With white racism never far from the surface, the hard-edged 1990s conservatism blended strains of nativism, isolationism, and anti-democratic authoritarianism.  Its core support came from voters who were “seeking not to conserve but to tear down . . . angry, radical, reactionary, and convinced that they were the victims of a liberal system that included both Democratic and Republican leaders” (NH, p.90).  These voters, overwhelmingly white, and their leaders are the “partisans” of Hemmer’s title.

With each passing year after Reagan left office in 1989, Hemmer writes, American conservatives “looked less and less like Reagan, even as they invoked his name more and more” (NH, p.2).  Hemmer concentrates mostly on the Republican mainstream during the decade, with a narrative revolving around a handful of high-visibility personalities. Her two lead characters are commentator, journalist and White House staffer turned presidential candidate Patrick Buchanan, and syndicated radio host Rush Limbaugh.  The two set the tone for the Republican revolt against Reaganism, a tone which Hemmer describes as “harsh, outrageous, uncompromising” (NH, p.11).

Hemmer also emphasizes how televangelist (and presidential candidate in 1988) Pat Robertson brought evangelical Christians into the Republican party as one of its key constituencies; and how Speaker of the House of Representatives Newt Gingrich gave sanction to the tactical use of incendiary rhetoric to demonize and delegitimize political opponents, encouraging his colleagues to use words like “radical,” “disgrace,” and “traitor” in referring to their opponents. Hemmer also highlights the key if less familiar role of Idaho Congresswoman Helen Chenoweth as a link between the mainstream Republican party and its violent fringes.

Chenoweth demonstrated how a “radical, sometimes violent right merged into the broader conservative coalition, even into the highest reaches of the federal government” (NH, p.142), Hemmer writes.  Arguably the most enthusiastic voice in Washington for the “angry white male” (NH, p.142), Chenoweth championed resistance to any form of government regulation of firearms and promoted the position that armed self-defense was necessary “not just from other people but from the government itself” (NH, p.152).

President Bill Clinton also figures prominently in Hemmer’s analysis as the foil for a Republican party that came to define itself in large measure in opposition to Clinton.  Republicans constructed what Hemmer terms a “scandal infrastructure” (NH, p.233), using “whipped up outrage and implacable opposition” (NH, p.18) to call the president’s legitimacy into question on a continual basis.  She also emphasizes how the Clinton administration pursued a host of conservative policies, thereby occupying traditional Republican ground and pushing the implacable Republican opposition ever further to the right.

Toobin by contrast focuses on a single individual, Timothy McVeigh, perpetrator of the most horrific act of domestic terrorism of the decade, the bombing of the federal building in Oklahoma City, Oklahoma, on April 19, 1995, in which 168 people died; and on the affinities for violence which McVeigh and others like him embodied during the decade.  Toobin describes McVeigh as an “early prototype for what would become known as the aggrieved Trump voter” (JT, p.19), with a virulent anti-government worldview that included an obsession with guns and gun rights and a belief in the necessity of violence to restore the United States to what he convinced himself was its original purpose.  Nearly half of Homegrown concerns the legal proceedings against McVeigh, which culminated in the first federal execution in decades.  After the execution, Stephen Jones, McVeigh’s publicity-hungry lead attorney, donated the entire defense file to the University of Texas.

The donation came to almost a million pages, including a substantial amount of correspondence between Jones and his client and memos detailing the defense team’s trial strategies.  Although ethically dubious (the attorney-client privilege does not generally terminate with the death of the client), the donation allows Toobin to share with his readers many previously unknown behind-the-scenes details of the case, including, crucially, McVeigh’s own ruminations on what led him to his act of terror.  “For such a complete record of a major criminal investigation to be available for examination is unprecedented in American history” (JT, p.7-8), Toobin writes.

Toobin uses the Oklahoma City bombing to make the broader point that McVeigh’s anti-government worldview has been absorbed into today’s Republican party, manifesting itself graphically in the January 6, 2021, assault on the United States Capitol.  McVeigh’s views were “replicated with extraordinary precision in the rioters on January 6,” Toobin writes, “as well as many of the other right-wing extremists who have flourished in the quarter century since the bombing” (JT, p.6).  The “homegrown” forces that motivated McVeigh are not only still with us today but are more dangerous than in the 1990s, he argues, due especially to the presence of the internet and social media, which were not part of McVeigh’s world.

Hemmer asks her readers not to consider her book a “prehistory of Trumpism” (NH, p.14), but she too connects dots between the 1990s and the present.  The party’s “turn toward nativism and a more overt racism” and its “wariness about free trade and democracy,” she writes, meant that its selection of a nominee like Trump in 2016 “had been underway for a quarter of a century” (NH, p.14).  But her account begins in the 1980s, when Ronald Reagan brought traditional American conservatism to the White House for the first time in the post-World War II era.

* * *

Reagan’s conservatism revolved around reduced government intervention in the economy, lower taxes, a belief in the virtues of free trade, and a “muscular anticommunism” (NH, p.16-17).  But Reagan also brought a sunny disposition and an optimistic, upbeat vision of America’s future, two qualities that had hitherto been absent in 20th century American conservatism.   He believed that immigration, like free trade, benefited the United States economically.

While supporting numerous anti-democratic, authoritarian regimes around the world, Reagan used soaring and inspiring rhetoric to trumpet democratic values as the cornerstone of American governance at home and a key to stability and prosperity abroad.  But Hemmer points out that Reagan’s full-throated embrace of democracy alarmed many conservatives.  Traditional conservatism had always been distrustful of democracy.  The United States was a republic, not a democracy, conservatives liked to contend.  To many on the political right, the language of democracy had been coopted by the African American struggle for civil rights, which they viewed skeptically or opposed altogether.

Many “social conservatives” concluded that Reagan had not prioritized their key issues, such as opposition to abortion and support for prayer in public schools.  To them, he was “the god that failed” (NH, p.41).  Near the end of Reagan’s second term, Buchanan wrote that the administration had fallen short by refusing to “take up the challenge from the Left on its chosen battleground: the politics of class, culture, religion and race” (NH, p.48).  The pugnacious Buchanan campaigned aggressively on these issues when he ran for president in 1992, 1996 and 2000.

In his presidential campaigns, Buchanan “fashioned grievance politics into an agenda” (NH, p.11), Hemmer writes.  Although Buchanan is often described as a “populist,” Hemmer contends that his campaigns were “about racism as much as economic displacement” (NH, p.218).  Buchanan appealed to what he euphemistically termed “racial conservatives” (NH, p.31), whites who believed that they had become an “oppressed class, losing economic and political power in American society as a result of civil rights, immigration and economic decline” (NH, p.90-91).  Buchanan envisioned these voters as the future core of the Republican party, rather than one component among several.

Buchanan had little difficulty with right-wing authoritarian governments.  He challenged the idea that democracy was an “unqualified good” (NH, p.75).   He believed that the United States did not need to go abroad to fight foreign enemies — they were “already on America’s shores in the form of cheap goods from Asia and foreign workers from Latin America, Asia and Africa” (NH, p.75).   Buchanan blamed immigrants for a large portion of the nation’s economic and social woes.  Decades before Donald Trump, he favored building walls on America’s southern border.  He portrayed the United States as a nation under siege, “one that had to pull up the drawbridge and reinforce the ramparts” (NH, p.82).

Radio host Rush Limbaugh, one of Buchanan’s most vociferous supporters, pioneered a “new kind of political entertainment” in the 1990s that thrived on the “boundary of offensiveness” (NH, p.101).  Limbaugh regularly attacked feminists, Blacks, and gays while always claiming he was criticizing only the leaders and militants of these groups, not all of them.  Proving how large and potent the right-wing radio audience could be, Limbaugh gave rise to a new source of power within the conservative movement, one that influenced:

an entire political party, so much so that, a quarter century later, more politicians would sound more like Rush Limbaugh than like Ronald Reagan – and more lived in fear of crossing the radio host than of deviating from the former president’s political legacy (NH, p.96).

Timothy McVeigh was among Limbaugh’s regular listeners, absorbing the radio superstar’s messages on long solo drives across the country.  Limbaugh’s incendiary language, Toobin writes, “matched and encouraged McVeigh’s views” (JT, p.56).

In addition to espousing unfettered gun rights, Chenoweth became the “unexpected face” (NH, p.142) for the male-dominated “militia movement” that emerged in the 1990s — paramilitary groups with anti-government, conspiratorial views often buttressed by racism and anti-Semitism.  The movement came to national attention in 1992, when federal agents sought to arrest militia member Randy Weaver, a fugitive from justice, at Ruby Ridge, one of several militia outposts in Chenoweth’s district.  The ensuing gun battle cost the lives of Weaver’s wife, their son and a federal agent.

The Ruby Ridge shootout stoked the militia movement’s “fears of a federal government hell-bent on disarming Americans,” (NH, p.140), as did the 51-day siege the following year near Waco, Texas, between the federal government and a religious sect called the Branch Davidians that ended in a deadly fire in which 76 Branch Davidians, including 28 children, died.  Hemmer’s account gives full due to the two incidents and the role which guns and the proclivity for political violence played in the Republican party’s turn in the 1990s.

Toobin concentrates almost exclusively on these intertwined issues for the simple reason that they constitute close to a full explanation of the thinking that led Timothy McVeigh to the federal building in Oklahoma City in April 1995.  McVeigh’s reaction to Waco, Toobin writes, “exceeded mere political outrage,” and became a “psychological obsession” (JT, p.84).  McVeigh chose April 19th as the date for the bombing because it was the second anniversary of the Waco fire.  But Waco was not the only obsession which Toobin highlights as he wrestles with the question of how an average young man growing up in Buffalo, New York, in the 1970s and 1980s became a homegrown terrorist in the 1990s.

* * *

McVeigh developed an obsessional interest in gun collecting during his last year in high school.  By the time he began working in low-level jobs after graduation, his political views were, Toobin writes, a “cauldron of resentments – against politicians who wanted to take away his guns and people (especially Blacks) who, he believed, had jumped ahead of his kind in the American hierarchy” (JT, p.26-27).  He joined the National Rifle Association (NRA) at a transformative period in the politics of guns, when the organization was becoming, as Toobin puts it, “ferociously partisan, dedicated to stifling any attempts to limit gun ownership” (JT, p.20).  Through an advertisement in the NRA magazine, McVeigh ordered a copy of The Turner Diaries, a novel first published in 1978 under a pseudonym by William Luther Pierce, a man with neo-Nazi leanings.

“It is difficult to overstate the influence of The Turner Diaries on McVeigh” (JT, p.23), Toobin writes, and it is easy to see why.  The novel centers on the bombing of FBI headquarters, which sparks a national rebellion against “the System,” a “sinister Jewish cabal” (JT, p.21) governing an America where privately owned firearms are subject to arbitrary confiscation at the government’s whim and Black people are allowed to attack whites with impunity while whites are prosecuted for defending themselves.  Toobin argues that The Turner Diaries planted in McVeigh’s mind the thought that a truck bomb demolishing a government building could spark the political revolution he considered necessary.

McVeigh enlisted in the Army just in time to be sent to Kuwait to fight in the first Gulf War, where he was awarded four medals, including a Bronze Star.  While in the Army, McVeigh met the two men who not only shared his political views but, in different ways, conspired with him to bomb the federal building in Oklahoma City: Terry Nichols and Michael Fortier (Fortier’s wife was also implicated as a co-conspirator).  After leaving the Army in December 1991, McVeigh spent most of the next two years living out of his car, driving long distances alone while listening to Limbaugh three hours a day.

Limbaugh strengthened McVeigh’s view that the economic pie was continuing to shrink, with the federal government reserving the best jobs for women and Blacks.  “Changes in gender, race, and the economy had undermined McVeigh’s status in the world,” Toobin writes.  He thought it was “maddening that so many people out there couldn’t see what was right in front of them” (JT, p.65).  By late 1994, McVeigh’s plan for bombing the federal building in Oklahoma City the following April 19th had gained a terrible “momentum of its own, and that was ultimately the most important reason for it; it was not the result of strategic or tactical thinking, but rather an expression of rage”  (JT, p.103).

The consequences of McVeigh’s expression of rage remain staggering.  Among the 168 people who died in Oklahoma City were 15 children in a day care center.  Thirty other children were orphaned, and 219 lost one parent.  The bomb damaged 324 buildings in a 50-block area of downtown.  The explosion was felt 55 miles away and registered 6.0 on the Richter scale.  About 7,000 people lost their workplaces.  In a state of about 3.2 million people, over 10% of Oklahoma residents, according to one estimate, personally knew someone who worked in the building.

In the aftermath of the bombing, President Clinton sensed that dots needed to be connected between the bombing and the fiery rhetoric of Limbaugh, Gingrich and others.  Without naming names, he appeared to single them out as bearing some responsibility for the catastrophic event.  “We hear so many loud and angry voices in America today whose sole goal seems to be to try to keep some people as paranoid as possible and the rest of us all torn up and upset with each other,” Clinton said.  These voices “spread hate.  They leave the impression that, by their very words, violence is acceptable” and “must know that their bitter words can have consequences” (JT, p.193-94).

Limbaugh, predictably, took offense.  “Make no mistake about it.  Liberals intend to use this tragedy for their own political gain,” he intoned, terming Clinton’s remarks an “irresponsible attempt to categorize and demonize those who had nothing to do with this . . . There is absolutely no connection between these nuts and mainstream conservatism in America today” (JT, p.194).  Limbaugh’s response, Toobin writes, established a pattern that has “recurred for decades following acts of right-wing domestic terrorism,” up to and including the January 2021 assault on the Capitol, characterized by “dutiful condemnation of the attacks themselves, accompanied by refusals to accept any responsibility for inspiring them” (JT, p.194).

From the moment the bombing took place, it was portrayed as the “work of outsiders, of individuals who were sinister anomalies from American norms” (JT, p.8).  Limbaugh said that “his gut” (JT, p.9) had told him that Middle Eastern terrorists were responsible.  Those looking for an international conspiracy latched on to Nichols’ personal contacts with the Philippines, his second wife’s native country, where he traveled on at least one occasion for family matters.  Toobin indicates that the radical right never fully abandoned the idea of an international connection.  The FBI, after looking far and wide for possible co-conspirators, concentrated its massive investigation on McVeigh, Nichols and Fortier (Michael Fortier entered a guilty plea, and the Fortiers cooperated with the government in separate trials against McVeigh and Nichols).

McVeigh’s attorney Stephen Jones was among those who pursued the possibility of a wider conspiracy, doggedly espousing a “rush to judgment” theory in which he suggested that the government had prematurely cut off its inquiry into a possible wider conspiracy when further digging might have revealed far more persons involved.  Jones was “never specific or consistent about who participated in this grander conspiracy, but he didn’t have to be” (JT, p.238); as defense counsel, he did not need to prove or disprove anything at trial.  His idea was to “sow doubt and uncertainty, not to prove an alternative” (JT, p.238).

We now know, thanks to Jones’ decision to make his full file on the case public, that he and McVeigh differed vigorously and bitterly on this very issue.  McVeigh was adamant that he had acted in concert with only Nichols and Fortier, and no one else.  Yet Jones would not let go of the conspiracy notion (raising its own set of ethical issues).  Toobin argues convincingly that McVeigh was right: apart from Nichols and Fortier, he had acted on his own.  But Toobin is equally interested in dispelling an idea that ran side by side with the notion of a broader conspiracy: that McVeigh was an isolated, lonely and eccentric “survivalist.”

The prosecution’s case strengthened this notion by focusing narrowly on McVeigh and Nichols and actively discouraging speculation that the bombing represented anything broader than their “own malevolent behavior” (JT, p.205).  The impression lingered after the trial that McVeigh was an aberration, a “lone and lonely figure who represented only himself and his sad-sack co-defendant” (JT, p.297), an impression which Toobin characterizes as “dangerously misleading” (JT, p.205).  McVeigh’s act was no aberration, he contends.  Many at the time shared his anti-government worldview and obsession with gun rights.  But McVeigh was unable to reach them.

Although Limbaugh convinced McVeigh that there was an “army” of fellow believers out there somewhere, the fledgling internet had not yet created “places where those of similarly extreme views could convene and plot together, as they did before January 6” (JT, p.56).  The internet, more than any other factor, “accounts for the difference between McVeigh’s lonely crusade and the thousands who stormed the Capitol on January 6, 2021” (JT, p.59), Toobin writes.

But Toobin points to another unsettling difference between April 1995 and January 2021: in 1995 no high-level politician, not even Limbaugh, defended the attack, whereas many Republicans did just that in 2021.  In the quarter century since the Oklahoma City bombing, Toobin writes, the United States “took an extraordinary journey – from nearly universal horror at the action of a right-wing extremist to wide embrace of a former president (also possibly a future president) who reflected the bomber’s values” (JT, p.373).

The hard truth, Toobin writes, is that from McVeigh’s time to the present, a “meaningful part of the conservative movement in the United States has engaged in violence” (JT, p.241).  Both the insurrection on January 6, 2021, and much else in today’s conservative movement show how McVeigh’s values, views, and tactics have “endured and even flourished in the decades since his death,” and now come “close to the conservative movement norm” (JT, p.317).  This makes the story of McVeigh and the Oklahoma City bombing “not just a glimpse of the past but also a warning about the future” (JT, p.12).

Hemmer’s bottom line is close to that of Toobin:  American conservatism and today’s Republican party have been taken over fully by the partisans of the 1990s and their 21st century heirs.  Trump’s 2016 campaign “recreated the Buchanan agenda with relatively few updates,” she notes.  Both men “relied on media platforms to shape their political personas, and both left outside observers – and some Republican insiders – worried about a turn toward fascism in the party” (NH, p.299).  We will see this November if these same partisans and their heirs take over the country.

 

Thomas H. Peebles

Paris, France

January 30, 2024

 

 

 


Waging War on American Democracy

 

Adam Hochschild, American Midnight:

The Great War, A Violent Peace and Democracy’s Forgotten Crisis

(Mariner Books)

“The world must be made safe for democracy.”  That was American president Woodrow Wilson’s most prominent argument when he asked the United States Congress for a declaration of war on Germany in April 1917.  But Wilson, who had won re-election only months earlier while campaigning on a pledge to keep the country out of the European conflict, might just as well have asked Congress for a declaration of war on American democracy.  Wilson finished his address to Congress by stressing ominously that “if there should be disloyalty,” a veiled reference to opposition to the war effort, it would be “dealt with with the firm hand of stern repression” (p.33).  As Adam Hochschild demonstrates convincingly in American Midnight: The Great War, A Violent Peace and Democracy’s Forgotten Crisis, Wilson proved all too true to his word.

A progressive Democrat when first elected to the presidency in 1912, Wilson presided over the “greatest assault on American civil liberties in the last century and a half,” Hochschild writes, with “few regrets over that contradiction” (p.12).  The assault targeted not only anti-war dissenters but also members of such overlapping, disfavored groups as unions, socialists, immigrants, and anarchists – along with Americans of German ancestry.  It included “mass imprisonments, torture, vigilante violence, censorship, killings of Black Americans, and far more” (p.2).  This is the “forgotten crisis” of Hochschild’s sub-title, a side of American history “not marked by commemorative plaques, museum exhibits or Ken Burns documentaries” (p.2), he notes acidly.

In April 1917, the United States was “startlingly unprepared” (p.55) for war.  Although it had by far the world’s largest economy, its army was smaller than that of Portugal, ranking 17th in the world.  Nearly a year passed before large numbers of Americans were fully ready for combat.  Then, months later, in November 1918, the fighting in Europe ended with an armistice.  But the domestic war within the United States continued into 1921, the year Wilson’s second term ended.  Shining light on the full period 1917 to 1921, Hochschild provides roughly equal treatment to the war and post-war years.

During this time, what Hochschild terms the “raw underside of our country’s life” (p.4), including such phenomena as racism, anti-immigrant hostility, vigilante justice, and contempt for the rule of law, had never been more revealingly on display.  But Hochschild makes clear that these phenomena had been part of American history for decades.  When the United States joined the conflict in Europe, however, America’s long-standing racism and nativist hostility to immigrants blended seamlessly with an equally deep-seated hatred of “any challenge to the power of business and industry” (p.88-89), particularly the challenge of labor unions and their Socialist backers.  American entry into World War I “provided business with a God-given excuse to stop workers from organizing” (p.4), Hochschild writes.

Hochschild’s best-known work is the award-winning King Leopold’s Ghost, the story of how Belgian King Leopold II used violence and coercion to gain control of the Congo in the late 19th and early 20th centuries.  He is also the author of two other works reviewed here, To End All Wars, dealing with the reaction to World War I within the belligerent European countries, and Spain in Our Hearts, on American participation in the Spanish Civil War.  Returning to the World War I era in American Midnight, Hochschild again shows himself to be a formidable storyteller, delivering a steady stream of anecdotal evidence in sharp, almost bullet-point form.  The result, a withering indictment of widespread governmental abuse at all levels, revolves around a colorful collection of villains and victims, along with a few heroes who resisted the massive assault on civil liberties and democratic values.  But the central figure in American Midnight is the inscrutable President Wilson.

* * *

Although Wilson had initially hoped to keep the United States out of the conflict that broke out in Europe in 1914, he became convinced that only through United States participation in what was then known as the “Great War” could the country shape the post-war order.  Wilson hoped that an international organization to ensure a lasting peace – what came to be known as the League of Nations – would emerge from the carnage of the war.  Hochschild suggests an element of pretext in the decision to go to war.  Although Germany was blamed, “with some reason” (p.17), for having started the war, “no one had attacked the United States” (p.39) – there was no analogue to the German invasion of Belgium three years earlier or to Pearl Harbor a quarter century later.  German submarines interfered with American shipping and killed American citizens, most dramatically in the sinking of the British passenger liner Lusitania in 1915, costing 128 American lives among the nearly 1,200 who died.

But, as Hochschild points out, Germany had tried to warn potential passengers of the risk of traveling on the Lusitania, a ship also carrying “173 tons of munitions, including artillery shells and 4.2 million rifle bullets” (p.19).  Yet, in his appeal to Congress, Wilson argued that the role of belligerent had been thrust upon the United States, in keeping with what Hochschild terms the “pretense that the United States was an innocent victim drawn into the conflict against its will” (p.40).  Across the country, people were “thrilled by the idea that the country was somehow defending itself” (p.39-40).  But if the United States was defending itself in Europe, it was simultaneously being “fatefully transformed” (p.55) at home.

One prominent member of Wilson’s administration giving shape to that fateful transformation was Postmaster General Albert Burleson, who became “America’s chief censor,” with powers “seldom wielded by any single government official before or since” (p.61-62).  Burleson’s powers emanated from the Espionage Act, passed in June 1917, which had little to do with spying.  The act defined almost any sort of opposition to the war as criminal, giving the Postmaster General authority to declare any newspaper or magazine “unmailable” at a time when “there was no other way to distribute publications nationally” (p.61).  Burleson’s first target was what he termed “offensive negro papers that constantly appeal to race and class prejudice” (p.64), but he went on to use the Espionage Act primarily as a “club to smash left-wing forces of all kinds” (p.60).

Burleson went after publications that depended entirely upon the mail, “foreign-language papers, journals of opinion, and Burleson’s prime target, the socialist press” (p.63-64).  He declared one socialist publication, The Rebel, unmailable after it exposed exploitation on a cotton farm his wife had inherited.  He went after the best known socialist monthly, The Masses, “one of the liveliest journals the United States has ever seen” (p.65-66).  A trial against The Masses ended in an acquittal but still led to the magazine’s permanent disappearance.

The Wilson administration also benefited from the services of Major Ralph Van Deman, who led a newly established surveillance unit within the Department of the Army set up to spy on American citizens.  The unit recruited Pinkertons and other private detectives with experience working for corporate clients against labor organizers.  It kept files on hundreds of thousands of Americans, tapped phones, compiled ethnic breakdowns of the groups the unit placed under surveillance, and maintained informants and secret infiltrators.

Eager to believe the “wildest of rumors” (p.116), Van Deman became convinced that Germany was inciting the Black population of the United States.  Among the African American targets of his surveillance was the Reverend A.D. Williams of Ebenezer Baptist Church in Atlanta, whose grandson, Martin Luther King, Jr., became a preacher in the same church, the subject of another era’s government surveillance under FBI Director J. Edgar Hoover, and, much later, an iconic national hero.  But Van Deman’s most obsessive target was the Industrial Workers of the World, known as the “Wobblies,” a relatively small labor group to which Hochschild returns repeatedly throughout American Midnight.

Even though their numbers never exceeded 5% of American union members, the Wobblies were a “convenient bogeyman” for anti-labor politicians and businessmen, with the war presenting a “welcome chance to crush them” (p.85).  To that end, Van Deman benefited from the work of one Leo Wendell, an infiltrator who went by the name of Louis Walsh.  Reporting directly to Van Deman, “Walsh” posed as a mechanic and Wobblie activist, particularly in Pittsburgh-area steel mills.  Taking pride in his many Wobblie friends and how much they trusted him, he was elected to the strike committee when the steelworkers went on strike, and he was arrested several times, often to great fanfare, only to be released shortly thereafter.

The Wobblies were subjected to the largest criminal trial in the history of the United States in the summer of 1918, with 112 defendants accused of seditious speech; none was charged with violent acts.  After about an hour of deliberations, the jury rendered guilty verdicts for all defendants, permanently crippling the union.  The IWW “would never again be a significant force in American life” (p.170), Hochschild writes.  When Wendell provided the Department of Justice with insider information on the Wobblies in Pittsburgh, the information went to a young J. Edgar Hoover, then a mere 24 years old, whom Attorney General A. Mitchell Palmer had just appointed to head a new, internal “Radical Division” within the department.

In that position, Hoover engineered a nation-wide series of raids and arrests that have gone down in history as the “Palmer Raids” but in Hochschild’s view would better be called the “Hoover Raids” — a “domestic war the likes of which the United States had never seen” (p.240).  The raids were ostensibly a response to a rash of bombings that occurred across the country in June 1919, targeting men in power, including Attorney General Palmer, none of whom was hurt.  The bombings may have been the work of members of a “tiny, shadowy sect of Italian American anarchists” (p.240), but the actual perpetrators have never been identified.  Hoover and Palmer used the raids for broader purposes, to arrest and deport a wide variety of Socialists, union members, suspected anarchists, and other subversives for “political advantage . . . regardless of the lack of evidence connecting them to the bombings” (p.241).

Hoover, acting as Palmer’s “determined deputy, quietly wielding influence beyond his years” (p.280), engineered the raids, which began on November 7, 1919, the second anniversary of the Bolshevik coup in Russia.  It was the time of the “First Red Scare” in the United States, when business and political leaders feared that Bolshevism was spreading to American shores.  The raids took place in more than a dozen cities in the Northeast and Midwest.  They enhanced Hoover’s stock within the Department of Justice, as well as providing a political boost to Palmer, a Democrat who had his eyes on the presidency after Wilson’s second term.

Hoover and Palmer’s strategy of deporting the raids’ targets ran into an unanticipated roadblock in the person of Louis Post who, as Acting Secretary of Labor, had the final say on deportation.  Unintimidated by Hoover and Palmer, Post was “one of the most courageous figures of this grim time” (p.80), Hochschild writes.  He took the unfashionable position that any non-citizen subject to deportation was entitled to constitutional safeguards.  Of approximately 2,500 cases forwarded to him, Post approved deportation in less than 20%, throwing out the majority and asking for further investigation of the remainder.  Renowned radical Emma Goldman presented Post with his highest-profile case.

A “larger-than-life celebrity with a fierce gaze and fiery energy” (p.75), Goldman had, by the time the United States went to war, arguably “enraged the country’s establishment more than any other American of her time” (p.76).  An immigrant from Russia, Goldman had become an American citizen through her marriage to a naturalized US citizen.  When the government found that her husband had lied to gain his citizenship, her citizenship fell with his, making her a deportation target of Palmer and Hoover.  But the petition to deport Goldman fell to Post, who thwarted Palmer and Hoover by applying to Goldman’s case his view that no one should be expelled from the United States for opinion alone.

But Post could not save Goldman from prosecution under the Espionage Act.  She was charged with espionage on the day the act went into effect.  At trial, the prosecutor described her as dangerous because she was such an eloquent orator, capable of holding spellbound “the minds of ignorant, weaker and emotional people” (p.79), an echo of the Salem witchcraft trials, Hochschild observes.  She spent most of the war years in jail.  Once the war ended, Hoover moved again for Goldman’s deportation.  With Post no longer in office, Goldman was sent back to her native Russia.

None of the victims Hochschild portrays experienced anything approaching what he aptly terms the “sadistic fury” (p.107) to which African Americans continued to be exposed during the World War I era.  Racial segregation and overt discrimination were the norm throughout the country in 1917, along with the gruesomely unrelenting practice of lynching.  Even the Wobblies failed to attract many Black members, and many unions barred them altogether.  President Wilson, a Southerner with sympathy for the Confederate cause and views of Black Americans fully in line with the deep-seated racism of the times, was resegregating the federal workforce.  Openly racist politicians were in power throughout the American South.

Despite this dispiriting environment, most Black organizations “did not oppose the war, and encouraged young men to serve” (p.115).  Even the militant philosopher and journalist W.E.B. Du Bois, who became a special target of Hoover, urged his readers to “close ranks” with white Americans in the fight against Germany.  Du Bois fervently hoped that “if Black soldiers fought bravely, the country would treat them more fairly once the war was over” (p.116).  This proved to be a forlorn hope, with the era witnessing numerous outbreaks of organized violence against entire African American communities.

A deadly riot took place in East St. Louis, Illinois on July 1, 1917, less than three months after official US entry into the war.  After a crowd of angry whites invaded a Black neighborhood and opened fire, Black men responded, killing two whites who turned out to be plainclothes policemen.  It was the most severe outburst of racial violence in decades but hardly the last.  Competition for a limited number of jobs in urban centers after the war gave rise to a wave of anti-Black violence in many cities during the Red Scare of 1919.  The year Wilson left office, 1921, was also the year of the now-infamous riot in Tulsa, Oklahoma, when white mobs effectively wiped out the city’s African American community, looting Black homes and businesses and setting fire to scores of buildings.  These outbreaks of violence, almost always described in the press as race riots, should in Hochschild’s view be termed “white riots” (p.252).  Not surprisingly, President Wilson remained largely silent on the era’s pervasive anti-Black discrimination and violence.

Throughout American Midnight, moreover, Hochschild cites repeated instances where Wilson entertained reservations about some of the assaults on civil liberties which his administration had undertaken or acquiesced in, yet failed to take action.  Just before Wilson left for Europe in early 1919 to sell his vision of lasting peace to the United States’ allies, for example, he wrote to Postmaster Burleson suggesting that the time to curtail censorship might have come. But it was not an order.  He did not follow up while abroad and Burleson simply ignored the president’s message.

* * *

Much of the latter portion of American Midnight concerns Wilson’s trip to Europe, where he was able to convince his skeptical British and French Allies to include his cherished notion of a League of Nations within the peace treaty they adopted; his failing health due to a series of strokes suffered during and after his time in Europe, which came close to incapacitating him; and his unsuccessful efforts upon his return to convince the United States Senate and the American public of the virtues of a League of Nations.  When the isolationist Senate refused in November 1919 to ratify the peace treaty he had negotiated in Europe, Wilson’s own isolation was complete, leaving him in the “deepest despair” (p.303).  It “never seems to have occurred to Wilson,” Hochschild observes, that the “censorship, political imprisonments, and harsh crackdown on antiwar dissidents he had presided over for nearly two years had not nurtured a climate of enthusiasm for a peace-oriented, internationalist ideal like the league” (p.213).

The 1920 presidential election constituted a major repudiation of Wilson’s party, with Republican Senator Warren Harding, running on the slogan that it was time to return to normalcy, easily defeating Democratic nominee James Cox (Cox’s running mate was the young Assistant Secretary of the Navy, Franklin Roosevelt).  But  a third candidate on the ballot, charismatic Socialist Party leader Eugene V. Debs, “America’s most beloved leftist” (p.181), received nearly one million votes, 3.4% of the overall total, even though he was then in jail, serving a 10-year sentence after conviction under the Espionage Act for his outspoken opposition to the war (although he had been careful not to advocate avoiding the draft).

Debs, a founder of the Wobblies, had broken with the organization by the time World War I broke out.  He ran as the Socialist Party candidate for president in 1908 and 1912 but refrained from running in 1916 out of respect for the Wilson campaign’s pledge to keep the United States out of war.  Wilson refused to commute Debs’ sentence after the war ended despite considerable pressure to do so.  That task fell to Harding, who first met with Debs at the White House, found him surprisingly likable, then released him, commuting the remainder of his sentence.

Although generally given low marks in American history books, Harding “undid much of the harsh repression still in place from the war years and the Red Scare” (p.340), Hochschild writes.  He moved slowly to free most of the nation’s federal political prisoners.  Those remaining were freed by his successor, Calvin Coolidge; all were out by June 1924.  “For the first time in seven years, no American was in federal prison because of something he or she had written, said, or believed” (p.345).

* * *

But did the extreme repression that the country endured between 1917 and 1921 really end after Wilson left office?  Hochschild’s answer is a qualified yes.  The very excesses of the period gave Americans:

a greater appreciation for the Bill of Rights, something gradually reflected over later decades in school curricula, Supreme Court decisions, and much more.  No political mass arrests on the scale of the Palmer Raids happened again . . . Never would the government censor news media and put publications out of business the way Albert Burleson had done.  . . . [T]he long battle between business and organized labor rarely again would become as violent as it was more than a century ago  (p.352, 357).

But Hochschild cites numerous instances in which the forces that blighted the United States during the World War I era have resurfaced, most recently during the presidency of Donald Trump, 2017 to 2021 — exactly 100 years after the second Wilson term — when “rage against immigrants and refugees, racism, Red-baiting, fear of subversive ideas in schools, and much more” once again became “dramatically visible” (p.356).

For the United States to avoid slipping back into the darkness that befell the country from 1917 to 1921 requires much from its citizenry, Hochschild advises, above all precisely what was then missing: a “vigilant respect for civil rights and constitutional safeguards” (p.358).   With another Trump presidency looming on the political horizon, as one respected commentator recently opined, today’s United States stands perilously close to slipping back into a darkness reminiscent of that which Hochschild describes so vividly in this chilling yet instructive volume.

Thomas H. Peebles

Paris, France

December 16, 2023

 


In Search of a Realistic Utopia

 

Daniel Chandler, Free and Equal:

What Would a Fair Society Look Like?

(Penguin Books)

In 1971, Harvard political philosopher John Rawls published A Theory of Justice, a book frequently described as the 20th century’s greatest work of political theory.  There, Rawls re-examined notions of justice and fairness in modern democratic society, offering a vision of a society where equality for the least well off was compatible with maximum individual freedom for all.  “It is almost impossible to overstate Rawls’ influence within academia” (p.6), writes Daniel Chandler in Free and Equal: What Would a Fair Society Look Like?  Rawls’ work provided a “model of constructive and systematic political thinking,” Chandler explains, which “inspired a new generation” and led to an “‘outpouring of philosophical literature on social, political, and economic justice unmatched in the history of thought’” (p.6).

But as influential as A Theory of Justice was within academia, its influence in the world outside has been negligible — and it is not difficult to see why.  While Rawls’ seminal work described what he often termed a “realistic utopia,” it said infuriatingly little about the granular details of that utopia, or how we might someday come closer to realizing it.  That task, Rawls maintained, was best left to social scientists; it is the task that Chandler, both an economist and a political philosopher, undertakes in this rigorously argued volume.

Chandler begins with an illuminating summary, in the first third of the book, of Rawls’ complex and interlocking theories.  Over the last two thirds, he offers recommendations as to how Rawlsian principles could be put into place in the real world, particularly in the United States and Great Britain, the two countries he focuses upon most intently.  His recommendations are not intended to be a “rigid blueprint,” he writes, but rather a “contribution to a more open and imaginative public conversation about how we should organize our society” (p.271).  These recommendations assume the existence of a well-functioning state and bureaucratic infrastructure.  Countries that lack these basic institutions will “inevitably need to focus on putting them in place first” (p.103).

 Chandler concedes that his attempt to defend and reimagine political liberalism, “both as a set of values and as a way of organizing society” (p.10), is likely to appeal most to the “progressive” or “left wing” side of the political spectrum.  His recommendations include a more progressive tax policy, a more generous allocation of social welfare, more workplace democracy, more freedom for workers to organize, and more public and less private financing for both political parties and education.  Readers of the progressive persuasion are likely to view these recommendations as eminently sensible solutions to some of the most intractable problems afflicting contemporary America, Britain, and other prosperous modern societies.

But in Chandler’s view, Rawls’s ideas are capable not only of providing an overall conceptual coherence to progressive politics but also of appealing to people across the political spectrum.  With cooperation and reciprocity as their cornerstones, Rawls’s ideas can “help us find common ground even where there appears to be none” (p.105), Chandler writes.  His vision, grounded in the “‘common sense’ of democratic life,” should be fully palatable to “citizens irrespective of their wider religious and moral beliefs” (p.68).  My gut reaction: good luck with that; try pitching Rawls at a MAGA rally (“Make America Great Again,” usually used to refer to supporters of Donald Trump).

We will probably never have the opportunity to learn whether Rawls’ ideas could withstand the scrutiny of a MAGA rally.  But after making my way through Chandler’s recommendations, I remained skeptical whether Rawls’ theory is vital to advancing progressive reform.  Chandler’s faith in the power of Rawlsian ideas to unite a polarized polity likewise left me behind as a non-believer.  But it is difficult to resist Chandler’s core point that we need a vigorous defense of liberal democracy in today’s world, given widespread global dissatisfaction with democratic governance and the accompanying rise of authoritarianism.  To restore faith in democracy, “we need to redesign it from the bottom up” (p.138), Chandler contends, starting with a vision of a “better society that people will stand up and fight for” (p.2).

Rawls spun out the implications and ramifications of his thinking over some 600 dense pages.  Summarizing these ideas succinctly in understandable terms represents a daunting challenge, which Chandler deftly meets in the first third of his book.  He sets  out Rawls’s body of thought in an almost Cartesian format, breaking down major  principles  into component sub-parts.  But he starts by explaining why A Theory of Justice shook up the ivory tower world of academic political philosophy when it first appeared in 1971.

At the heart of Rawls’s complex work lies a “strikingly simple and powerful idea,” Chandler writes: “that society should be fair” (p.8) – precisely why Rawls called his theory “justice as fairness.”  Rawls devoted his life to thinking through what it means to live together in a democracy on “terms that everyone could accept as fair” (p.20); and to identifying a clear set of principles that could guide us in designing fair democratic institutions.  The result was a philosophy “committed to both freedom and equality at the deepest level” (p.4; the terms “freedom” and “liberty” were synonymous for Rawls, as they are for most political theorists, and  Rawls often substituted “rights” for one or the other).

Rawls staked out ground that set him apart from both the classical liberal tradition that “prized individual freedoms above all else” (p.4), and a socialist tradition “often willing to sacrifice these freedoms in the name of equality” (p.4).  He also emphatically rejected the “harsh neoliberalism” that was to gain ascendancy in the 1980s and in Chandler’s view still dominates political discourse today, characterized by an “almost religious faith in markets and an overriding focus on economic growth” (p.2).  Rawls further rejected the deference to tradition that is one of the defining features of conventional conservative political thought.  Communitarians, for their part, argued that he failed to take seriously enough the importance of family, religion, and community, while many on the left found him insufficiently attentive to questions of race and gender.

Rawls distanced himself from all these perspectives and defined his thought in terms of what Chandler terms a “humane and egalitarian liberalism” (p.4), primarily through three far-reaching, interdependent principles that are key to understanding Rawls: the “basic liberties” principle, the “difference” principle, and the “just savings” principle.  Taken together, Rawls’s three principles provide what Chandler describes as a “unified and comprehensive framework for reimagining society, moving us beyond vague platitudes about freedom, equality and sustainability” (p.50).

* * *

Basic liberties are “those rights and freedoms that we need in order to live freely and to play our part in society” (p.27), which for Rawls were personal, political, and economic.  Personal liberties, including freedom of expression, association, and religious belief, constitute the “liberal core of Rawls’s theory, and the foundation on which everything else is built” (p.103).  They reflect his deep commitment to the “liberal ideal of a society in which people with different mores and religious values can live side by side, and where we agree not to use the power of the state to impose our beliefs on anyone else” (p.23).  We are “probably closer to achieving this aspect of Rawls’s ideal than any other,” Chandler argues.  Citizens in today’s rich liberal democracies “enjoy more wide-ranging personal freedoms than has been true of almost any society in history” (p.103-04).

Political liberties include the right to vote and otherwise participate in the political process, but also the right to “scrutinize and criticize the government and to form political parties and campaign groups” (p.23), based on a commitment to political equality in which all citizens have the same opportunity to influence and participate in the process – “substantively equal opportunities to exercise [their] rights and to influence collective decision-making, irrespective of wealth, race, [and] gender” (p.23).  This commitment to equal political opportunities distinguishes democracies from those forms of government in which decision-making is the province of a single individual or a minority.  But Rawls’s notion of political equality remains far removed from current reality, Chandler writes, primarily because the preferences of the wealthy have a “much greater influence on government policy than those of average and poorer voters” (p.142).

Rawls considered only two economic rights to be basic: the right to own personal property and to choose one’s profession, rights without which it would be “impossible to live freely and to express ourselves” (p.29).   Rawls thus rejected most of the economic freedoms favored by neoliberal thinkers, such as freedom from “excessive” taxation or regulation, which he argued were not necessary to live freely or engage in democratic debate.  If economic freedom is defined too widely, he cautioned, it “severely limits what the state can do to address poverty or inequality, or even to regulate markets in order to promote more economic growth” (p.27).  Distinguishing the economic freedoms that are truly essential from those that are not constituted one of Rawls’s most significant achievements, Chandler contends.

As the complement to the basic liberty principle, the difference principle is based on a “strongly egalitarian idea of ‘shared prosperity’” (p.39), where all citizens enjoy an equal chance to develop and employ their talents and abilities within an economy organized to maximize the life chances of the least well off.  Although the difference principle addresses the broad notion of economic inequality, Chandler stresses that it is concerned “not just with the distribution of income and wealth, but with the concentration of economic power and control, and with the extent to which people have opportunities for self-respect, including through work” (p.199).

Economic inequalities can be justified “only if everyone ultimately benefits from them” (p.198), Rawls posited, and therefore must be of “greatest benefit to the least advantaged members of society, consistent with the just savings principle” (p.21).  We should “only want some people to have more than others if this also benefits those who have less” (p.39).  In terms of the familiar economic pie analogy, the difference principle aims to make the slice that goes to the least well off as big as possible.  It contrasts with simply growing the size of the pie, as a market capitalist might prefer, or rigidly insisting that everyone’s slice be about the same size, the end goal of some socialists.  This led Rawls to a relatively benign, pragmatic approach to market capitalism, another reason many on the political left remained unimpressed by Rawlsian theory.

From the perspective of the difference principle, if free markets can “increase the total wealth of society, then, with the right institutions in place, they can also help to raise the living standards of the least well off” (p.206).  When properly regulated, capitalist markets have “proven themselves to be the most effective system for generating economic prosperity” (p.198).  Market capitalism offers individuals their best chance to exercise their basic freedom of occupational choice. The problem with capitalism, Rawls argued, is not the existence of private ownership, but ownership that is “so heavily concentrated in the hands of a wealthy elite” (p.84).

The just savings principle involves our environmental obligations toward future generations.  Already in 1971, Rawls argued that we have an “overriding duty to maintain the material wealth and vital ecosystems on which society depends” (p.9), as Chandler puts it.  Whatever we do to increase prosperity and raise the living standards of the least well off “must be consistent with this basic commitment to social and environmental stewardship” (p.9).  Today we are “dangerously failing to live up to this commitment: the world is facing the prospect of catastrophic and irreversible ecological and environmental breakdown” (p.49-50).  Addressing this crisis is the “most urgent and important policy priority of our times” (p.50).

* * *

After setting out with admirable clarity the complex interplay between these basic Rawlsian principles, Chandler picks up where Rawls left off in the latter portions of Free and Equal, looking at “how far our actual societies fall short of his inspiring ideal and, crucially, developing a bold practical agenda that would make it a reality” (p.9).  This agenda needs to be about more than alleviating poverty, he argues; it must begin by tackling the economic inequality that is “tearing apart our societies today” (p.205).  Gaps between the wealthiest few and those in the bottom half of the income scale have increased dramatically since the 1980s in all the world’s prosperous countries, Chandler notes, reaching heights not seen since before World War II in the United States and Great Britain.  But if we are to move beyond the neoliberal laissez-faire dogmatism of the 1980s, we need to transcend vague platitudes about addressing economic inequality.

Tackling inequality necessitates rethinking the taxation system and, inevitably, increasing taxes, especially on the highest incomes, while closing avenues for tax avoidance and evasion.  Chandler likes Thomas Piketty’s idea of a “universal minimum inheritance,” a “one off transfer of wealth to every citizen when they reach adulthood, funded by taxes on large inheritances and the very wealthy” (p.226-27).  Without raising taxes, governments can increase the minimum wage and strengthen trade unions.  More public funding for education is imperative: there must be a universal, legal entitlement to high-quality early education, starting as early as the end of parental leave, and we must raise the quality and status of vocational education for those who do not attend university.

But the two priorities on Chandler’s agenda are a system of Universal Basic Income (UBI), regular cash payments paid to every citizen at all income levels; and what he and others term “workplace democracy,” where workers have the legal right to participate in workplace decision-making on much more equal terms. UBI is a system that is both “universal” and “unconditional,” paid without any requirement to be in the workforce or looking for work. A UBI could replace most existing welfare transfers and benefits, Chandler contends.  The justification for a UBI “rests on the importance not just of income but of dignity and self-respect,” meeting individuals’ needs “in a way that supports rather than undermines the independence and self-respect of the least well off” (p.212-213).

Chandler downplays the possibility that a UBI would allow some people to stop working altogether.  The number is likely to be small, he asserts, and would be more than balanced by the enormous potential of the UBI to bolster the self-respect of the least well off by ending the need for intrusive and humiliating assessments of eligibility and the stigma associated with receiving means-tested benefits.  Critically, a UBI could fundamentally alter the balance of power at work: everyone would be able to say “no” to demeaning and degrading work.  That ability is also one of the potential benefits of workplace democracy, which seeks to expand the opportunities for meaningful work and enhance individuals’ self-respect in an area of their lives too often characterized by “subservience and powerlessness quite unlike any other domain of life in a modern democratic society” (p.244).

Workplace democracy challenges the “shareholder primacy” model of corporate ownership, in which shareholders retain final control over the organization of economic activity and the conditions of work.  Although there is no single formula for effective workplace democracy, workers’ councils have emerged across continental Europe as a feasible form of co-management power sharing between owners and workers.  By limiting shareholders’ exclusive rights over corporate direction and policy, workers’ councils serve as a mechanism through which employees can shape decisions at both the strategic and day-to-day level. Germany has the continent’s most extensive co-management system, with workers enjoying the right to a certain percentage of seats on the board of directors, thereby sharing control rights within a given corporation.

An extensive body of academic literature on co-management in Europe reveals that co-managed companies generally have better work conditions, greater job security, and more family-friendly policies, such as flexible working hours, parental leave and childcare provisions.  Further, although co-management has proven to have little or no impact on corporate productivity and profits, and only minimal overall impact on wages, it “tends to reduce inequality within companies, mostly by increasing the wages of the lowest paid” (p.254), precisely the objective of Rawls’s difference principle.

* * *

Political progressives, even those who do not acknowledge the need for a Rawlsian conceptual framework, are likely to embrace most of Chandler’s recommendations for a truly fair society.   But Chandler raises what might be termed the MAGA problem with his qualification that there are boundaries to legitimate democratic debate.  Rawls recognized  “limits to what we can achieve through the power of reason.”  If we want to persuade others, we “always need to find some common starting point . . . [E]very argument has to begin somewhere, from some shared premise” (p.71-72).   Racist or authoritarian views are “incompatible with any reasonable interpretation of fairness, freedom and equality that rightly sets them outside the boundaries of acceptable political discourse” (p.71).

With the MAGA movement retaining its firm grip on one of America’s two major political parties, it is difficult to see where the shared premises might come from that could lead to a genuine Rawlsian debate in today’s United States.  Overt racism is never far from the movement’s surface and one of its core articles of faith is that the 2020 presidential election was “fraudulent” and “stolen” — a notion for which there is near-zero empirical evidence, rejected repeatedly by courts at all levels throughout the United States.  Numerous surveys have indicated that MAGA voters are far more likely than other voters to advocate violence in support of political objectives.  Rawls’s theories unsurprisingly have little to say to such people, Chandler indicates.

Of course, the United States is not the only country where a Rawlsian transformation could take place.  But it is Rawls’s home country and one of the two countries on which Chandler concentrates, along with Great Britain.  Despite the formidable political obstacles facing a Rawlsian transformation in today’s United States, Chandler’s work constitutes a useful reminder that A Theory of Justice can be a template for the realistic utopia which Rawls imagined in his day, and which progressives continue to imagine in ours.  But it is a sobering reminder: the chances of applying Rawls’s notions of fairness and justice in the real political world seem weaker today than when Rawls first articulated them over a half-century ago.

 

Thomas H. Peebles

Paris, France

November 4, 2023

 

 


Filed under American Politics, American Society, Political Theory

Battling for Hong Kong’s Soul

 

Louisa Lim, Indelible City:

Dispossession and Defiance in Hong Kong

(Riverhead Books)

When the United Kingdom turned control of Hong Kong over to the People’s Republic of China in 1997, China pledged that the island city would retain a high degree of autonomy for fifty years: its robust capitalist system would remain intact and its governmental system, based on something approaching democracy and the rule of law, would continue.  That pledge, reduced frequently to the mantra “One Country, Two Systems,” was underpinned in 1984 by a Joint Declaration hammered out between Britain and China after two years and twenty-two rounds of negotiations.  But as Louisa Lim details in her elegantly written, heartfelt, yet altogether dispiriting account, Indelible City: Dispossession and Defiance in Hong Kong, the “One Country, Two Systems” pledge now seems like a bad joke.  China has turned this once free society into an authoritarian one.

The “One Country, Two Systems” pledge, although resting entirely on Beijing’s good faith, with no mechanisms to monitor or ensure compliance, nonetheless held precariously through the first decade of the 21st century.  But in the century’s second decade, as Hong Kongers asserted their desire for greater democracy, China reneged on its pledge.  Ever harsher assertions of Chinese control over what is known euphemistically as a “Special Administrative Region of China” have seen pro-democracy activists, lawmakers and journalists jailed, major political parties shut down, voting rights curbed and freedom of the press and media undermined.  China’s endgame, Lim argues, is now “total dominance” (p.252): it views Hong Kong’s systems and their attendant freedoms as a “threat to its own security” (p.259).

At the core of Indelible City is Lim’s account of resistance to the power of the Chinese Communist Party  in the 2010s, especially in 2014 and 2019, when Hong Kongers turned out in massive numbers to demand democracy, human rights, and the principle that Hong Kongers should rule Hong Kong.  Lim witnessed these demonstrations first-hand, as both a journalist and a participant. After growing up in Hong Kong and working there as a journalist for many years, she found herself not only writing about the collapse of the familiar world she had known since her earliest years but also working actively to prevent it.

Lim recognizes that participating in events as an individual and detailing them as a professional journalist require tricky balancing.  Although she tried earnestly to preserve her professional neutrality and objectivity, she found herself continually asking whether she had fallen short of the standards of her profession. Yet, gradually, the feeling that she had to “choose between being a journalist and being a Hong Konger” (p.207) gave way to the uneasy realization that she was both, not one or the other.

The first mass demonstrations that Lim covered, in 2014, known as the “Umbrella Movement” (because many demonstrators used umbrellas as shields against the police), were organized as a protest against proposed reforms which would have limited Hong Kongers’ choice of their city’s Chief Executive to candidates approved by Beijing.  The Umbrella Movement was followed by even more significant demonstrations in 2019 against a proposed extradition provision that would have allowed Hong Kongers suspected of or charged with criminal acts to be tried on the Chinese mainland under the very different Chinese system of justice, threatening Hong Kong’s traditional role as a refuge for mainland dissidents and activists.

Beijing imposed a new National Security Law on Hong Kong in the following year, 2020, which effectively criminalized dissent, with broad and vague definitions of sedition, subversion, terrorism, and collusion with foreign forces (a similar law had been proposed but not adopted in 2003).  Since 2020, this law has been used to further tighten China’s grip on Hong Kong.  The organization of demonstrations on the scale of 2014 and 2019 is now far more perilous, and none appear likely in the foreseeable future.

Although there were multiple causes underlying the 2014 and 2019 demonstrations, in Lim’s view they all “revolved around one core issue: identity” (p.151), an issue that Lim digs into deeply.  Throughout Hong Kong’s colonial history, Great Britain competed with China to define the city’s identity, but, as Lim shows, Hong Kongers defied both as they sought to define their own identity.  These battles “for Hong Kong’s soul” (p.231), as Lim terms them, cast critical light on Hong Kong’s turbulent recent years.

* * *

About one third of Indelible City concentrates upon how Great Britain controlled the territories that make up Hong Kong from the 1840s up to 1997.  In a century and a half of colonial rule, Britain did not establish genuinely democratic institutions in Hong Kong.  It did not endow its subjects with full citizenship, the right of abode in Britain, or universal suffrage.  But it nonetheless provided a framework within which democratic institutions could emerge and potentially flourish – more by accident than by an “imperial masterstroke,” Lim writes, driven by a “confluence of misplaced personal initiative, misunderstanding, and overreach” (p.70-71).  Hong Kong was born “in fits and starts,” and its current plight is rooted in what Lim describes as “century old acts of piecemeal acquisitions that bear revisiting” (p.71).

In the first such acquisition, in 1842, Hong Kong became an official British Crown Colony when the Qing dynasty ceded Hong Kong Island through the Treaty of Nanjing.  The treaty put an end to what was known as the First Opium War between Britain and China, fought over the right to import opium into China.  In 1860, Britain acquired the Kowloon peninsula at the conclusion of the Second Opium War with China.  Finally, in 1898, through what was known as the Second Convention of Peking, Britain acquired the third and largest tranche of land that makes up contemporary Hong Kong, known as the New Territories, 92% of present-day Hong Kong’s land mass.  Rather than granting a permanent concession, the Chinese agreed to a 99-year lease, terminating in 1997.  The British minister in Beijing surely thought this was forever, Lim speculates, probably never imagining a China strong enough to demand its return.

By the 1850s, Britain had a flourishing trading station free from Chinese jurisdiction. Many who were low on the social totem pole in mainland China came to Hong Kong and made their fortunes.  In the 19th and early 20th century, Hong Kong allowed “opportunistic and hardworking migrants, both British and Chinese, to reimagine their lives, though few saw it as a permanent home” (p.81).  British colonial administration was mostly about “control and regulation of the local population, rather than governance” (p.71).  While the British “preoccupation with justice brought order to the new colony,” Lim writes, it also “cemented in British minds a view of the incoming Chinese migrants as deplorable” (p.78).

From the 1840s onward, moreover, two competing historical narratives shaped Hong Kong.  For the British, Hong Kong was a “moneymaker, a ‘future Great Emporium of Commerce and Wealth’” (p.78), as Hong Kong’s first British Governor put it.  But in Beijing, the Nanjing Treaty of 1842 was seen as the first “unequal treaty,” imposed by what the Chinese considered the gunboat diplomacy of imperial aggressors.  Hong Kong’s loss marked the start of China’s century and a half of “humiliation by foreign powers, a matter of national shame that could only be eased with the island’s return to its rightful owner” (p.78-79).  For both Britain and China, the competition between the two contrasting historical narratives was part of the effort to define Hong Kong identity.

Throughout British rule, Hong Kongers were told that they were “purely economic actors” (p.154).  Lim writes that she herself was “shaped by Hong Kong values,” in particular a “respect for grinding hard work and stubborn determination” (p.4).  But, somehow, latent civic values also became part of the Hong Kong mindset, including an “almost religious respect for freedom, democracy, and human rights” (p.16).  Over time, Hong Kong became a “place of refuge and free thinking… a sanctuary for Chinese dissidents and revolutionaries, a place where taboo topics could be discussed, and forbidden books sold in tiny bookshops tucked up narrow staircases”  (p.159).

Extensive business and family ties to mainland China also led many Hong Kongers to identify culturally with the mainland.  The Cantonese language spoken in Hong Kong was another identity marker.  Although often dismissed as a mere dialect of Chinese, Cantonese is incomprehensible to most Mandarin speakers, Lim indicates.  Hong Kong identity, she suggests, should be considered “plural, more like a constellation of evolving and overlapping self-images rather than one fixed point of light” (p.154).

Through what Lim terms “gritty perversity” (p.64), Hong Kongers also sought to forge their own identity in ways that fit neither the British nor the Chinese historical narrative.  With no great war heroes or statesmen or even heroic acts capable of becoming the stuff of myths, Hong Kongers invented icons that tended to be “antiheroes in the form of discriminated-against outsiders and bullied misfits, people who resisted and continued to do so despite the overwhelming forces allied against them” (p.64).  Antihero number one for Lim is the so-called King of Kowloon, Tsang Tsou-choi, an eccentric calligrapher whose drawings appeared repeatedly at key points in the demonstrations of the 2010s, long after his death.  Lim begins with the King, as everyone called him, and circles back to him throughout.

A “toothless, often shirtless, disabled trash collector with mental health issues” (p.6), the King was obsessed with a jutting prong of the Kowloon peninsula that he believed had originally belonged to his family and had been stolen by the British in the 19th century.  In the 1950s, he began a furious graffiti campaign in which he accused the British of stealing his family’s land.  Viewed then as a “crank and a vandal” (p.19), the King continued his campaign against the British until the 1997 handover, when he turned it against China.

The King gained fame and notoriety for his idiosyncratic Chinese calligraphy, easily recognizable by its misshapen, unbalanced characters.  Proficiency in calligraphy was a critical measure of one’s cultivation in traditional China – “both the apogee of all art forms and a tool of power” (p.17).  The King was known and loved for breaking calligraphic rules with “brushstrokes that screamed his illiteracy” (p.18).  For Lim and Hong Kongers of her generation, the King’s work was a “celebration of originality and human imperfection… He broke all the rules, repudiating traditional Chinese behavior” (p.7).  In the early 2000s, the King “seemed to be everywhere” (p.165).  He was the subject of a rap group’s song and appeared in a commercial for a household cleaning product.

The King died in 2007, but in death became a symbol of the city’s “radical search” for its distinct “social and cultural identity through endless negotiations with its colonial past and neocolonial present” (p.160).  He had raised issues of territory, sovereignty, and loss “at a time when no one else dared to think about them” (p.11).  In their continued acts of defiance, “no matter how small,” the Hong Kong protestors of the 2010s were “following the lead of their dead King” (p.11).  Those acts of defiance “helped to define what it meant to be a Hong Konger” (p.154).  At the heart of that definition was an intensified commitment to democratic values and what Lim terms an “appetite for autonomy” (p.218).

* * *

The method of choosing Hong Kong’s chief executive precipitated the 2014 Umbrella Movement, an eleven-week “explosion of discontent, desire and above all hope” (p.181).  After the departure of the last British governor in 1997, Hong Kong’s chief executive was chosen by a committee composed of representatives of Hong Kong’s business elite.  The long-term goal was for direct nomination and election by Hong Kong voters by 2017.  But in 2014, the Chinese National People’s Congress proposed limiting the ballot to two or three candidates nominated by a 1,200-member Selection Committee.  Universal suffrage remained the long-term goal, the Committee added, but could be achieved only through a “steady and prudent path” (p.182).

In an early demonstration challenging the proposed nomination procedures, protestors were met with tear gas, the first time Hong Kong police had used this heavy-handed tactic since the 1997 takeover.  Seen as a huge betrayal, the tactic prompted a previously unimaginable number of people to take part in a subsequent demonstration, later estimated at 1.2 million, roughly one-sixth of Hong Kong’s population.  But despite the massive demonstrations, the Umbrella Movement ended without even a symbolic victory.  While the Chinese proposal was withdrawn, the chief executive continued to be chosen by the Hong Kong committee of business elites, not the people.

Lim sensed at the time that the fight for democracy in Hong Kong had been lost and the movement crushed, with its leaders imprisoned and marginalized. Many young people were trying to find a way out. Hong Kong’s rights and freedoms were being “salami-sliced away with accelerating speed,” she writes. “One Country, Two Systems was tilting to favor One Country over Two Systems” (p.188-89).  But the 2014 Umbrella Movement can now be seen as “only a dress rehearsal for the carnival of discontent that would explode onto Hong Kong’s streets five years later” (p.183), when the hope for a more democratic Hong Kong once again surged, only to be snuffed out even more ruthlessly.

The 2019 “carnival of discontent” was triggered by a proposal by the Hong Kong government, acting at the behest of Beijing, to alter its extradition laws to permit the rendition of criminal suspects to mainland China, where they would be subject to arbitrary detention and even torture in a legal system with no presumption of innocence.  The proposed legislation threatened the independence of the city’s judiciary, its rule of law, and its status as a political refuge for mainland dissidents and activists, the “very things to which Hong Kong attributed its success” (p.16), Lim writes.

On June 9, 2019, over a million demonstrators, more than one-seventh of Hong Kong’s population, took to the streets to protest the government’s plans to change the existing extradition law, drawing an eclectic crowd that included “all who elevated principle over pragmatism, hope over experience” (p.211).  Putting her journalistic hat aside, Lim took part in the demonstration with her teenage son.  The police used tear gas on the crowd and deployed rubber bullets, with Lim and her son nearly caught up in the police riot.  One week later, a still more massive protest took place, with an estimated 2 million people turning out.

The second protest, which Lim covered as a journalist, targeted the extradition changes but also gave voice to a broader set of demands that had been added to the protestors’ agenda: investigation into police brutality, amnesty for those arrested, retraction of the “riot” label the government used to describe the protests, and genuine universal suffrage. As she witnessed the protest, Lim felt a surge in her love for Hong Kong and her identification with its doomed democracy movement.  She considered the moment a “triumph of idealism” for a people “long stereotyped by their colonial masters as motivated only by the pursuit of money” (p.211).

In early September 2019, Hong Kong Chief Executive Carrie Lam withdrew the extradition proposal.  But it was “far too late,” Lim contends.  The battle for Hong Kong’s soul had “already spilled over its borders, causing clashes between Hong Kong supporters and mainland Chinese on campuses around the world” (p.231).  Lim, who by this time had taken a teaching position in Australia, found that she couldn’t discuss the Hong Kong situation safely in her new location.

Protests and violence continued through much of the rest of 2019.  At one point, Lam called the protestors enemies of the people, a “chilling phrase straight from the Chinese Communist Party lexicon” (p.235).  The November 2019 elections resulted in a substantial boost to the protest movement, with 17 of the 18 district councils flipping from pro-Beijing to pro-democracy majorities.  But this “did not change a government that had never been answerable to the people” (p.23), Lim observes.

The National Security Law of the following year is anticlimactic in Lim’s account.  In addition to proscribing sedition, subversion, terrorism, and collusion with foreign forces in the vaguest terms, the law appeared to allow, if not require, those suspected of violating it to be tried on the mainland.  In August 2021, a law allowing the government to impose exit bans went into effect.  “Even the freedom to leave was being denied” (p.260), Lim writes.  Historically a place of refuge, Hong Kong had become a “place that people were fleeing from” (p.257).  China’s maneuvers were now producing Hong Kong boat people.

* * *

In the 2010s, the Hong Kong of Lim’s exhilarating yet ultimately disheartening story was on the “front line of a global battle between liberal democratic values and an increasingly totalitarian Communist regime,” with a tiny dot on the map somehow managing to “unsettle the world’s newest superpower with the power of its convictions” (p.34).  Although Lim retains hope that China has not fully snuffed out Hong Kong’s pro-democracy aspirations, her story leaves little reason to share her optimism.  At the heart of Indelible City is a cruel irony: a full-throated pro-democracy movement did not gain widespread public support in Hong Kong until China intensified its efforts to exert its control over the island.  The movement was anything but too little, yet we can now see that it came far too late.

Thomas H. Peebles

Paris, France

October 19, 2023

 

 

 

 


Filed under Politics, Rule of Law, World History

Weaving Together Humanism’s Multicolored Threads

Sarah Bakewell, Humanly Possible:

Seven Hundred Years of Humanist Freethinking, Inquiry, and Hope

(Chatto & Windus, 2023)

British freelance writer and journalist Sarah Bakewell is a proven master at presenting serious questions of philosophy and intellectual history to general readers.  In her delightful 2016 work At the Existentialist Café, she examined the philosophical way of thinking termed existentialism, using an historical approach grounded in the actual lives of existentialist philosophers (I reviewed At the Existentialist Café here in 2017).  That study concentrated on about a half century of philosophical history.  Her most recent book, Humanly Possible: Seven Hundred Years of Humanist Freethinking, Inquiry, and Hope, sweeps far more broadly.

Bakewell’s objective in this ambitious, erudite, and thoroughly captivating volume is to trace the strain of thought known as humanism from its roots in 14th century Italy to the present, primarily by looking at a broad selection of figures whose thinking in her view comprises the humanist tradition — even though almost none used the term to refer to themselves, and there is little agreement as to who those figures are.  There is no firm consensus, moreover, even among committed humanists, as to what the term humanism means in all its dimensions and ramifications.  The term itself did not come into common usage until the early 19th century, when it was used to describe an educational approach centered upon the study of Greek and Roman classics.

Humanism is often thought of as a form of opposition to religion and specifically to the Christian theological view of humans as inherently sinful beings who need to subordinate their lives to the will of God and position themselves for a good place in the hereafter.  But for Bakewell humanism is far more complex than simple anti-religious thinking. There have always been humanist currents within Christianity and other faiths, she notes, where the focus has been “mostly on the lives and experiences of people here on Earth, rather than on institutions or doctrines, or the theology of the Beyond” (p.2).  Further, the terms “humanist” and “humanism” often arise in contexts outside the realm of religion, including architecture, philosophy, medicine, literature, photography, and film.

Today there are identifiable humanist organizations throughout the world whose members meet, discuss, disagree, and issue manifestoes, focusing frequently on what the word humanism itself means.  While the word sometimes seems ensnared in a “semantic cloud of meanings” (p.7), as Bakewell puts it, her title suggests that, at a minimum, humanism is a perspective based on freethinking, enquiry, and hope.  Freethinking denotes a preference for allowing our lives to be guided by moral consciousness and evidence, “rather than by dogmas justified solely by reference to authority” (p.22); enquiry reflects a belief in the value of study, education and the use of reason as a means to a more virtuous and civilized life; and hope is just that, the feeling that, despite recurring and dispiriting evidence of human failings, it is possible to “achieve worthwhile things during our brief existence on Earth”  (p.22).

 The diverse strands and manifestations which freethinking, enquiry, and hope have taken over the centuries constitute a “coherent, shared humanist tradition,” Bakewell argues, linked by what she terms “multicolored but meaningful threads” (p.7).  All “look to the human dimension of life” (p.3), she writes with emphasis. The views of humanists “now permeate many societies, whether recognized as such or not” (p.8).  Almost by definition, “everything we do can seem a bit humanistic” (p.4).  Although many earlier works have addressed aspects of humanism, Humanly Possible appears to be the first to attempt to weave the multicolored threads into a comprehensive historical overview of the humanist tradition.

Bakewell presents this tradition in approximate but not exact chronological order, with the 14th to 18th centuries constituting roughly the first half of her narrative.  From the 14th to the 17th century, humanism overlapped with what we term the Renaissance, centered initially on the Italian peninsula; in the 18th century, it was nearly congruent with the Enlightenment, centered in France. The second half of Bakewell’s narrative is dedicated to what might be termed modern humanism, from the 19th century to the present.

In the 19th century, when the term humanism came into common usage, humanist thinking supported the rise of liberal democracy and came to embrace science and the scientific method as keys to humanist understanding.  But the 20th century presented new and frightful challenges to that understanding.  Two world wars, the rise of totalitarian governments, the Holocaust, and the dropping of atomic bombs on Japanese cities in the first half of the century caused many to see in them an “unanswerable refutation of the entire humanist worldview… The idea that humans somehow oozed evil took up residence in the cultural atmosphere” (p.326-27).

Despite widespread questioning of the entire humanist project after World War II, the post-war years saw ever-wider acceptance of core humanist values.  Post-war humanism began to take into account a wider range of issues, including racism, colonialism, cultural diversity and, more recently, climate change and damage to the planet.  Across the seven centuries, humanists of course had to confront anti-humanist thinking and action, confrontations which helped produce many of the multicolored threads that are central to Bakewell’s narrative.

That narrative is about humanists as much as it is about humanism.  Centered around short biographical sketches of about 50 thinkers who can be considered representative of the humanist tradition across the centuries, Humanly Possible is a distinctive blend of biography and intellectual history.  Some of the figures Bakewell features are well-known, many less so.  Among the 50, the following jumped out at me as Bakewell’s favorites, her most consequential humanists: Francesco Petrarch (1304-1374), Giovanni Boccaccio (1313-1375), Desiderius Erasmus (1469-1536), Michel de Montaigne (1533-1592), David Hume (1711-1776), Wilhelm von Humboldt (1767-1835), John Stuart Mill (1806-1873), Charles Darwin (1809-1882), Thomas Huxley (1825-1895), and Bertrand Russell (1872-1970).

* * *

Together, Petrarch and Boccaccio “more or less invented the way of life that would be, for the next two centuries, the humanist one – not that they used this label of themselves” (p.26), Bakewell writes.  Neither attacked the Catholic Church directly, but each sought to revive the classical Roman tradition which they thought early Christianity had tried to suppress, especially the works of Virgil and Cicero.  Boccaccio also became the first serious Dante scholar. Both were known for their prolific writing, and each engaged in book collecting, translating, and editing.

Bakewell describes the Dutch scholar Desiderius Erasmus as one of the “most many-faceted of humanists, the author of translations, dialogues, diatribes, theological tracts, writing manuals, study guides, proverb collections, amusing diversions, and astonishing quantities of letters” (p.140).  Indeed, Erasmus almost singlehandedly gave humanism a Northern European face in the 16th century (today, many university students across the European Union know the name Erasmus through the university exchange program that bears his name, which allows them to study at institutions in other EU countries).  Michel de Montaigne, the subject of Bakewell’s first major work, How to Live, was the other major non-Italian humanist giant of the 16th century, whose most significant contribution to humanism in Bakewell’s view was his belief that “all people share an essential, common humanity” (p.159).

Along with the usual personalities of the French Enlightenment, including Voltaire and Diderot, Bakewell highlights the contribution of Mary Wollstonecraft (1759-1797), whose Vindication of the Rights of Woman, appearing at the height of the French Revolution in 1792, pushed the idea that to be fully humanized in matters of virtue, women too must have a humanizing education.  But Bakewell gives special attention to the Scottish philosopher David Hume, whom she describes as the “most intellectually merciless thinker of his time” (p.186), making him both the perfect Enlightenment figure and the perfect humanist.  Hume located the basis for morality in “sympathy,” or fellow feeling, then used the idea of sympathy to produce a comprehensive theory of ethics.

Among the many 19th century thinkers who could be considered humanists, Bakewell focuses upon Prussian educator Wilhelm von Humboldt and English philosopher John Stuart Mill.  The thinking of both helped lay the foundation for modern liberal democracy.  Each arrived at the idea that the state should not impose any particular religion or dogma on society.  For both, the role of the state was to step in only when one person’s pursuit of freedom and experience damaged others.  Humboldt also developed ideas on education that were picked up by subsequent humanists, while Mill became a 19th century champion of women’s rights.

Charles Darwin’s contribution to humanism lies mostly in a work that appeared in 1871, The Descent of Man, and Selection in Relation to Sex, rather than his more famous 1859 work, On the Origin of Species.  The latter said little about man and nothing about human morality.  In his 1871 work, by contrast, Darwin developed what Bakewell considers a quintessentially humanistic view that morality entered the human world from social feelings and behavior, with no need to rely upon God.

The zoologist, educator, essayist and polemicist Thomas Huxley did more than anyone else in his time to promote Darwin’s ideas and, more than Darwin, was responsible for the rise of scientific humanism.  Good scientific training, Huxley contended, “protects us against a tendency to go storming off into foolish interpretations based on misunderstandings of the facts, or how matters of evidence or experiment work” (p.254), a perspective that Bakewell suggests could have prevented some of the waves of misinformation and superstition that the recent Covid-19 pandemic engendered.

The English philosopher and mathematician Bertrand Russell stands out as arguably the 20th century’s most consistent and persistent proponent of humanist values.  Born in Victorian England, Russell lived long enough to protest not only World War I but also nuclear weapons and the Vietnam war.  Russell’s strongest conviction was that accepting assertions based solely on authority is never good enough.  Despite the traumas of the 20th century, Russell urged his fellow human beings to “remember your humanity and forget the rest” (p.339), maintaining up to the end of his long and varied career that progress in happiness, knowledge and wisdom was still possible.

In addition to these luminaries, Bakewell includes biographical sketches of approximately 40 others across the seven centuries who contributed to the humanist tradition.  As in At the Existentialist Café, the sketches, written in her inimitable breezy style, dig into the “adventures, quarrels, efforts and tribulations” (p.7) of the key humanist figures.  But readers may wish to challenge Bakewell’s omissions, asking why any number of luminaries were not among those meriting a biographical sketch.

My list includes two of the German language’s leading lights, poet and writer Johann Wolfgang von Goethe and philosopher Immanuel Kant, who receive only scant attention.  Novelists are almost by definition humanists in that they shed light on the human condition.  Bakewell includes biographical sketches of E.M. Forster, Thomas Hardy, and Thomas Mann but mentions such consequential novelists as Charles Dickens, Victor Hugo, Leo Tolstoy and Marcel Proust only in passing, or  not at all.  Although Bakewell does not ignore the role of sexuality in all its dimensions in shaping humanist thought, Sigmund Freud and his thinking are curiously absent.  It is not difficult to imagine various humanist societies around the world protesting vehemently to Bakewell about the omission or slighting of a particular figure or figures.

The 50 or so figures who anchor Bakewell’s narrative, moreover, are all European or North American, and almost all male. Recognizing that her narrative has a Eurocentric slant, Bakewell periodically includes non-Western contributions to the humanist tradition. Although Johannes Gutenberg famously produced the first European printed book in 1455, for example, she makes clear that printing techniques had been pioneered in China and Korea long before.

Bakewell is also keen to mention the contributions of women throughout the seven centuries and includes an entire chapter dedicated to female contributions to the humanist tradition, along with biographical sketches of both Enlightenment figure Mary Wollstonecraft and Mary Augusta Ward (1851-1920).  Ward’s late-19th century novel Robert Elsmere shows a clergyman transitioning to religious skepticism on the way to becoming a humanist who sets up an alternative organization to a church, “based on meliorism and reform of conditions for the poor” (p.267).

Bakewell also mounts a vigorous response to the criticisms that emerged in the aftermath of World War II, when the anti-humanist disasters of the first half of the 20th century prompted some writers to dismiss humanism as antiquated, hopelessly naïve, and irrelevant.  Among them, French philosopher Michel Foucault nonetheless offered what Bakewell considers a “valuable overhaul service” to conventional humanist thinking by highlighting structural inequities such as racism and colonialism that European humanists had hitherto been “inclined to think too little about” (p.330).

The post-war reevaluation of traditional humanism was undoubtedly healthy and useful, Bakewell acknowledges.  But if critiques such as those of Foucault forced humanists to take more serious account of cultural diversity and structural inequities, they were all too easily flipped into “something more like a total rejection of liberal, humanistic and Enlightenment values, as if these values were to blame for their own negation” (p.328).  This “bizarre twist,” she writes, is comparable to the notion that “car crashes will occur despite traffic lights, therefore traffic lights are to blame” (p.328).

One early example of how cultural diversity could shape humanism was the United Nations’ 1948 Universal Declaration of Human Rights, which benefited from input and support from all corners of the globe.  By bringing the perspectives of non-Western as well as Western countries to bear on its task of balancing rights and duties, individualism and community, the declaration offered an answer to the question whether one could speak of “anything ‘universal’ in humanity at all” (p.333).    The resulting text, both humanist and practical, was far more inclusive and culturally sensitive than any similar document of its time.

After the Second World War, moreover, humanist organizations emerged in many parts of the world.  In India, M.N. Roy, who in good humanist style rejected the austere lifestyle of Indian independence leader Mahatma Gandhi, led a particularly strong organization.  Today, sub-groups of Black, Latina and LGBTQ+ humanists often form part of larger humanist groups.  More than ever before, today’s humanist organizations work to become more approachable, building “better connections with wider communities – including some that may have a high level of distrust or dislike of humanism” (p.347).  The wider range of traditions that nourish contemporary humanism is captured in a 2022 version of the Humanist Manifesto, a text Bakewell considers comprehensive yet appropriately modest.

Along with its emphasis on human diversity, the recent manifesto rejects racism and ethnic prejudice far more explicitly than any of its predecessors. It further makes clear that humanism does not define itself in opposition to organized religion, recognizing that if humanists are “perceived mainly as anti-religious, they may be thought of as opposing the validity not just of specific beliefs but of the whole principle of meaning and identity” (p.347).  But humanist values are anything but firmly entrenched in today’s world.

Bakewell notes  that humanism is outlawed as a form of blasphemy in Pakistan, with Pakistani humanists often killed by vigilante mobs, most recently in 2017.  Human rights abuses recur with discouraging regularity across the globe, while democratic governments struggle to withstand nationalist and populist authoritarianism that seeks to undermine basic democratic – and humanist – principles.  Contemporary humanists must also acknowledge the many ways that humans are “hardly a good influence on the planet, wrecking its climate and ecosystem, obliterating species with our crops and livestock” and “redirecting every resource to the production of more and more humanity” (p.363).

* * *

Although anti-humanist thinking and actions remain very much part of our daily human landscape, Bakewell counsels us not to despair.  History and the human world are “neither stable and good on the one hand, nor hopelessly tragic on the other.”  They are, she writes with emphasis, “our own work, so if we want it to proceed well, we have to exert ourselves to make it happen” (p.332).  Keeping the humanist perspective vital and relevant, she concludes, requires “all the ingenuity we can muster” (p.368).   In her illuminating account of that rich and complex perspective over the course of seven centuries, Bakewell’s own ingenuity is on full display.

 

Thomas H. Peebles

Paris, France

September 16, 2023


Filed under Biography, European History, History, Intellectual History

Can an Adjective Transform Our Politics?

 

Michael Walzer, The Struggle for a Decent Politics:

On “Liberal” As An Adjective

(Yale University Press)

Michael Walzer is among America’s most distinguished political philosophers.  Currently a professor emeritus at the Institute for Advanced Study in Princeton, New Jersey, he had previous teaching stints at both Harvard and Princeton universities and is the author of books on a wide range of subjects, such as economic justice, just and unjust wars, nationalism, and Zionism.  For many decades, he was an editor of Dissent magazine, a broadly leftist political voice that was (and still is) often at odds with mainstream American liberalism.  Throughout his long career, Walzer has pondered and written about American liberalism, mainstream and otherwise, from a perspective well to the left of the political center.

Now in his late 80s, Walzer seeks in his most recent work, The Struggle for a Decent Politics: On “Liberal” As an Adjective, to provide what he terms a “newly imagined” (p.ix) perspective on the slippery word “liberal” by focusing upon its role as an adjective rather than a noun, thereby side-stepping the need to define “liberalism” comprehensively.  In seven chapters, Walzer examines the implications of the adjective “liberal” as applied to what he terms “commitments”: democrats, socialists, nationalists and internationalists, communitarians, feminists, professors and intellectuals, and Jews — the seven commitments that have defined Walzer’s personal and professional life.

The book’s main argument, stated simply, is that the adjective “liberal” cannot “stand by itself as it is commonly made to do (by adding the ‘ism’); it needs its nouns” (p.5).  If nouns like socialist, nationalist, and so forth are political “commitments” — political ends — the adjective “liberal” denotes one of the means of meeting those commitments, in Walzer’s view by far the best means.  A substantial portion of the book addresses the illiberal version of each commitment, and Walzer sees illiberalism coming from both the left and the right in most of them.

The Struggle for a Decent Politics thus constitutes an argument for a centrist perspective on the major issues of our time, leaning left but avoiding radicalism.  Its perspective echoes the one that 20th century arch-liberal Arthur Schlesinger, Jr., staked out in 1949 in The Vital Center, where he outlined the case for liberal democracy and robust but regulated capitalism as the best if not only workable alternative to the political extremes — both Communism and Fascism, Schlesinger’s main targets, but also extreme left- and right-wing propensities within the United States.  Like Schlesinger in the late 1940s, Walzer nearly 75 years later stakes out a position firmly rooted in the American political center, looking left.

Walzer crafts exceptionally tempered arguments as he discusses some of the most polarizing issues in the United States and beyond today, including populism and the current Republican party, the #MeToo movement, free speech on campus and contemporary Israel.  In addressing the illiberal tendencies he strongly opposes, he rarely names names.  This measured tone, never shrill or polemical, appears designed to win over those not usually inclined to look favorably on anything labeled “liberal,” and thereby help make our politics more “decent,” as his title suggests.  Sounding much like the civil rights and anti-war activist he once was, Walzer describes the battles for decency and truth within his commitments as “among the most important of our time,” with the adjective “liberal” being “our most important weapon” (p.151).

The Struggle for a Decent Politics has the flavor of a memoir — “in part a personal testament” (p.xii), Walzer writes.  He acknowledges at the outset that he may not have much time left for additional works, and this book may be about as close as he comes to defining who he is and what he stands for in the 21st century’s tumultuous third decade.  While his stratagem of considering “liberal” as a means toward various ends may avoid defining “liberalism” comprehensively, a picture of the liberal temperament nonetheless emerges across the book’s seven substantive chapters.

Walzer liberals are committed to restrained political power, individual rights and pluralism.  Open-minded and tolerant, they are willing to accept change, including changing their own views, and are guided in shaping those views by scientific and other empirical evidence.  Whatever their religion, Walzer liberals are not dogmatic.  But neither are they relativists: “We recognize moral limits; above all, we oppose every kind of bigotry and cruelty” (p.3).  Unlike many radicals on both the left and right, Walzer liberals recognize the crucial role of the modern nation-state in forging a type of civil religion with the potential to bind together a diverse polity.

If The Struggle for a Decent Politics is something less than a comprehensive analysis of liberalism, it nonetheless constitutes an extended argument for liberal democracy, the subject of the book’s first substantive chapter and the centerpiece of Walzer’s political credo.  It is difficult to imagine any of the subsequent commitments functioning effectively outside a liberal democratic framework.

* * *

Modern democracy is an “extraordinary project,” Walzer writes, based on majority rule, a “political order where the greater number of the people, when everyone is counted, actually govern the country” (p.8).  But majority rule in a liberal democracy needs to be constrained, in part by such institutional checks as a constitution that guarantees individual rights, an independent judiciary and a free press.   Competition for political power in a liberal democracy is wide-open, with much – but, crucially, not everything – at stake.  Built on the expectation of a peaceful transfer of power, “without death or prison for the losers” (p.12), liberal democracy is “transformative but not revolutionary” (p.17).  Right-wing populists consider themselves democrats, representing “the people,” but theirs is an illiberal form of democracy, a triumph of majoritarianism over liberal constraints. “Populist demagogues,” Walzer argues, “are wrong to claim that once they have won an election, they embody the ‘will of the People’ and can do anything they want” (p.9).

Sounding much like a 21st century Alexis de Tocqueville, Walzer argues that liberal democracy thrives when it encourages a vibrant civil society which includes “religious, ethnic, economic, philanthropic, and cultural organizations” (p.92) of all stripes.  Institutions like churches, synagogues, mosques, unions, professional groupings, political clubs and even extended families allow for “close, intense, personal relationships,” where we “talk, argue, negotiate, come together” (p.92).  A pluralist and inclusive civil society constitutes an additional restraint on majority rule in a liberal democracy, where every individual is a political agent, “able to join any and all meetings and movements and free to stay home – the equal of all the others” (p.30).

This is Walzer’s argument for liberal communitarianism.  Communitarians emphasize the value of membership in private and voluntary groups, focusing on the “close connection of a group of people who share a strong commitment to a religion, a culture, or a politics” (p.85).  Walzer, considered a leading communitarian in the 1980s and 1990s, insists that to be liberal, communitarians must be pluralist.  The distinct and separate activities that form the backbone of civil society should not separate us from each other.  Liberal communitarians oppose the “exclusivist passions and fierce partisanship” (p.151) of some identity groups, which Walzer traces to the illiberal communitarianism of Jean-Jacques Rousseau.  For Rousseau, participation in the civic republic was a “commitment that excludes all others” (p.85).  Rousseau viewed secondary associations, churches and above all political parties as a “threat to the integrity of the republic,” Walzer writes, although he notes that, if there are to be any such associations, Rousseau considered it “better if there are a lot of them so that no single one competes with the republic for the loyalty of its citizens” (p.85).

Walzer considers citizenship to be the pluralist glue holding together a liberal democracy’s wide range of independent and voluntary associations.  Citizenship constitutes a formal recognition that we all are “equal members of a political community” – that we are “all in the same boat” (p.22-23).  Through common citizenship, we share what might be termed a civic religion, without theology but with its own rituals, holidays and authoritative texts.  A recognition that we are all in the same boat also captures the spirit of democratic socialism, for Walzer the most crucial sub-set of liberal democracy itself.

The adjective “liberal” in front of “socialism,” Walzer writes, means that a socialist society can be achieved “only with the consent of the people as they are here and now with all their differences of character, belief, and ability, and it must be fought for democratically” (p.39).  Liberal socialism is pluralist, with ample room for disagreement – disagreement over the “strategy and tactics of the struggle, and about which compromises are necessary, and which are ‘rotten’” (p.45).  We should expect to find many versions of liberal socialism, with “parties, unions, and magazines of different sorts competing for members (subscribers, too) and influence within a liberal democratic framework” (p.45).

Walzer’s liberal socialism seems indistinguishable from what we today term “social democracy,” reformist rather than revolutionary, like liberal democracy itself.   Unlike many of its predecessors, liberal socialism accepts regulated market capitalism, with Walzer offering a relatively benign view of capitalism.  Liberal  socialism should leave room for entrepreneurs, he notes.  We need to “make sure there is space for them to do what they do, space for individual innovation and risk-taking” (p.49).

The breathing room provided for pluralism and market capitalism distinguishes liberal socialism from its conspicuously illiberal predecessors, particularly Bolshevism and its fanatical progeny.  Joseph Stalin and Mao Zedong considered themselves socialists, but there was nothing remotely liberal about either.  Today’s liberal socialists must acknowledge the murderous socialist past, “memorialize the victims, and set themselves against any return of authoritarian or totalitarian politics” (p.39).  Two steps forward, one step back is always better than “three steps forward over the bodies of our opponents” (p.39), Walzer wisely observes.

Liberal socialists’ “central creedal commitment” (p.47) should be to combat unjust economic inequalities. They need to do so within the modern nation-state, Walzer counsels, because socialists have not been successful anywhere except within a nation-state in which they recognize the “national loyalties of their people and build strong national parties” (p.60).  Liberal socialists thus need to be liberal nationalists.  And nationalists are simply people who put the interests of their own nation first.

History is of course punctuated by illiberal nationalists who expound and act upon xenophobic, aggressive forms of nationalism – Adolph Hitler in his day, Vladimir Putin in ours.  But liberal nationalism is no oxymoron. As Walzer reminds us, at the time the term came into popular use in the 19th century, nationalism was among the preeminent manifestations of liberalism – Mazzini working to unite the Italian peninsula, for example.  While putting the interests of their own nation first, liberal nationalists also recognize the right of other people to “do the same and seek accommodation, cooperation and solidarity with nationalists across borders” (p.59).

Liberal nationalism thus requires a “political struggle against illiberal nationalists at home and complicated diplomatic dealings with self-regarding nation-states abroad . . . The goal is peaceful coexistence” (p.58).  Although they do not support entirely open borders, liberal nationalists take a benign view of immigration, unlike most of their illiberal counterparts, making room wherever possible for asylum seekers and refugees.  They “resist contemporary xenophobic nationalisms, including those that are anti-Muslim and anti-Semitic” (p.151).  They further recognize the rights of minorities within the states that nations create.  Being a liberal nationalist thus necessitates a commitment to pluralism.

Being a liberal Jew also necessitates a commitment to pluralism.  Liberal Judaism evolved from Jewish emancipation in Western Europe and North America in the mid-19th century, Walzer explains, making it “possible for many Jews to abandon all the denominations and become unaffiliated and nonobservant Jews, identified simply as members of the Jewish people” (p.125). Liberal Jews like himself are easy to caricature, he writes. They are “very moderately committed to one way of being Jewish; look benignly, or maybe indifferently, on all the other ways; congratulate themselves on their broad-mindedness; and avoid as much as they can Jews who are more strongly committed” (p.128-29).

Walzer sees overly committed, illiberal Jews, in control in today’s Israel and visible elsewhere, rejecting Jewish pluralism in favor of allocating power only to those who live as they do.  But he also sees liberal Catholics, Protestants, Muslims, Hindus and Buddhists who “stand against the unexpected return of religious zealotry” (p.151).  For Walzer, a commitment to liberal Judaism entails a commitment not only to pluralism but also to secularism, as manifested in the United States through the famous “wall of separation” between church and state.  But he also recognizes that there are illiberal forms of secularism – there have “certainly been secular zealots” (p.141).

In his chapters on Liberal Feminists and Liberal Intellectuals and Professors, Walzer sees illiberal versions at work to his left.  Men can of course be feminists, he recognizes. But if you really want to be a feminist, you “have to join the arguments about what feminism means, and if you are a male outsider, you need help from the inside” (p.98).   Walzer is no outsider when he addresses liberal professors and intellectuals, having been both for many decades.  His major concerns in this chapter are free speech and its limits on university campuses, focusing upon the tendency of some professors to avoid offending students.

This tendency arose out of the legitimate anger of early Black students at predominantly white universities, Walzer explains, an anger which somehow transformed into a “plea for comfort,” with sensitivities encouraged to become “ever more sensitive” (p.118).  Decidedly illiberal academic practices have resulted: lectures cancelled, speakers shouted down, professors reprimanded and disciplined, and students harassed by fellow students, “all because of ‘offensive’ opinions” (p.118).  Too many professors hide in such instances, “self-censoring, reluctant to say anything that might make them a target” (p.118).  There is a crying need for liberal professors who defend free speech on campuses, especially speech which might offend some students.

Where Walzer would close the liberal circle is the subject of his intriguing final chapter, entitled “Who Is and Who Isn’t?”  Here, he identifies nouns for which the adjective “liberal” does constitute an oxymoron: liberal racists or liberal Nazis are two obvious examples.  “Bigotry and hate don’t have liberal versions” (p.147), he writes.  The adjective also seems out of place if used before ultra-orthodox Jews or fundamentalist Christians. “Religious dogmatists, whatever the dogma, probably can’t be liberal” (p.146).   Nor can there be a liberal imperialism, “although there are certainly more and less brutal versions of imperial rule” (p.62).

Even today, we can imagine liberal Republicans, although there are not many left.  There can also be liberal conservatives, “most obviously those who try to conserve or rescue liberal democracy when it comes under attack” (p.148), with former Representative Liz Cheney being an obvious example, although Walzer does not mention her by name.  But in deciding which conservatives deserve to be inside the liberal circle, we “always have to ask what is being conserved.  The effort to defend or revive hierarchical regimes can be a romantic project but not a liberal one” (p.148), he writes.

But what about historical figures whose reputations are currently under attack, such as Voltaire, who had anti-Semitic tendencies, and slaveholder Thomas Jefferson?  We are “endlessly reminded of the sins of our forefathers and of our own failure to acknowledge the sins and to repair the damage they caused” (p.145).  The failures of such figures may be obvious, but Walzer urges a more generous, less iconoclastic view: Voltaire in his time was indeed a liberal philosophe and Jefferson a liberal republican, he insists.

* * *

 Walzer finishes with a cheeky question: can there be illiberal liberals?  Of course, he says, there will always be “absolutists of different sorts who believe that their own version of liberalism is the last word” (p.149).  Walzer certainly does not intend the capacious versions of his commitments to “decent politics” to be the last word on these subjects.  And let us hope, fervently, that this timely, deftly argued work is not the last word from Walzer.

 

Thomas H. Peebles

Paris, France

July 15, 2023

 


Filed under American Politics, Political Theory, Politics

Testing Britain’s Commitment to Decolonization and the Rule of Law

 

 

Philippe Sands, The Last Colony:

A Tale of Exile, Justice and Britain’s Colonial Legacy (Weidenfeld & Nicolson, 2022)

For many across the globe, the 1960s were above all the decade of decolonization.  In 1960, the United Nations for the first time directly addressed the legality of colonization under international law when the General Assembly approved Resolution 1514, “Declaration on the Granting of Independence to Colonial Countries and Peoples.” The resolution characterized foreign rule as a violation of human rights, affirmed the right to self-determination, and called for an end to colonial regimes. The decade saw over 30 colonies, mostly in Africa and Asia, gain their independence from Great Britain, France and other European powers. But the Cold War confrontation between the Soviet Union and the United States was also a global fact of life throughout the 1960s, reaching the brink of nuclear war in the October 1962 Cuban Missile Crisis.

Decolonization and the realities of the Cold War converged in 1968 when the Indian Ocean island nation of Mauritius gained its independence from Britain. In granting independence, Britain split off an area known as the Chagos Archipelago, located about 2,000 kilometers north of Mauritius’ capital city, Port Louis, from the rest of the newly independent nation to form a new colonial entity, the British Indian Ocean Territory (BIOT).  Two years earlier, in 1966, Britain and the United States had secretly concluded an agreement to locate an American naval base on the Chagos Archipelago’s largest island, Diego Garcia, to support US military operations across the Indian Ocean.

Part of the agreement for the Diego Garcia base involved the “resettlement” – forcible deportation – of the entire local population of the Chagos Archipelago from what was in most cases the only homeland its residents had ever known.  The deportations took place between 1968 and 1973.  Although Britain came under criticism immediately for the deportations, it was not until a half century later, in 2019, that an international tribunal squarely determined that Britain’s detachment of the Chagos archipelago had been contrary to international law under UNGA 1514 and that Britain could not legitimately claim sovereignty over the archipelago.

That tribunal was the International Court of Justice (ICJ), the United Nations’ principal judicial organ, located at The Hague in the Netherlands. In addition to resolving contentious issues between member states, the ICJ is also empowered to give advisory opinions on “any legal question” at the request of either the UN’s General Assembly or its Security Council.  An advisory opinion requested by the UNGA was the route that Mauritius pursued in the Chagos case.

Mauritius was represented before the ICJ by Philippe Sands, a London-based international human rights lawyer who has litigated high-profile cases involving Chile, Congo, Rwanda, and the former Yugoslavia, among others, and who also writes prolifically on, and teaches, international law. In his most recent work, The Last Colony: A Tale of Exile, Justice and Britain’s Colonial Legacy, Sands walks his readers through the Chagos case, allowing us to see the strategies, thinking, and legal maneuvering required to get the case to The Hague and present it effectively before the ICJ. He uses the litigation as a springboard to demonstrate how the international justice system operates at the ground level in a case that in his view goes to “the heart of any system of justice, how the rule of law protects the weak and vulnerable from the excesses of the powerful” (p.130).

The major issues at The Hague, involving the applicability and scope of UNGA Resolution 1514 and conflicting claims of sovereignty, may sound abstract and coldly legal.  But they are significantly less so in Sands’ account because he explains them through the eyes of Madame Lisby Elysé, whom he describes as his book’s “beating heart” (p.x).  Like most in her community of 1,500, located on an island within the Chagos archipelago, Madame Elysé is Black, a descendant of enslaved plantation workers. She dropped out of school early to assist her family and can neither read nor write.  In 1973, when she was 20, recently married and pregnant, British authorities informed her with almost no advance notice that she had to leave her island home.  She was allowed to bring one suitcase.  She and the other members of her community have never been allowed to return permanently, although they have been accorded the option of occasional subsidized returns, euphemistically termed “heritage visits.”

The path to The Hague for Madame Elysé and her fellow Chagossians was long, with many preliminary stops.  Along the way, they brought numerous cases in courts in London challenging the legality of the detachment and subsequent forced deportations; took a trip to the European Court of Human Rights in Strasbourg; and engaged in an arbitration proceeding in Istanbul, all before convincing the UNGA to refer the case to the ICJ for an advisory opinion.  As Sands takes his readers along this path, he never loses sight of how the case affected Madame Elysé and her fellow islanders.

Tying together the diverse strands of Sands’ narrative yields a withering account of Great Britain’s relationship with its former colony.  A state policy of forced deportations as recently as the late 1960s and early 1970s now seems shocking.  But even more shocking in Sands’ account is the degree to which 21st century Britain, backed by the United States, continues to this day to defend the deportations and assert sovereignty over the Chagos Archipelago, despite the ICJ decision and a subsequent, nearly unanimous, UNGA resolution which had the effect of affirming the court’s decision.

* * *

To understand the Chagos case’s long journey to The Hague, Sands provides a useful textbook overview of the basic principles and institutions of the post-war international legal order, connecting them to the era’s decolonization movement and to modern notions of human rights and self-determination.  The major documents and instruments creating that legal order, such as the initial UN Charter of 1945, the 1948 Universal Declaration of Human Rights, and the 1949 Geneva Convention, avoided directly addressing the future of colonial regimes, an indication of British and French influence on the drafting process.

UNGA Resolution 1514 redressed the evasions and omissions contained in the early post-war documents and instruments. Passage of the resolution in 1960 rendered colonial domination “illegitimate for the first time in modern international society,” Adom Getachew wrote in Worldmaking After Empire: The Rise and Fall of Self-Determination, reviewed here last year.  Under Resolution 1514, self-determination became a human right, with colonialism itself becoming an international crime. The resolution was adopted by an 89-0 vote, with 9 countries abstaining, including Britain and France.  The United States was on the cusp of voting yes until President Eisenhower overruled his diplomats and ordered abstention, purportedly upon the personal request of British Prime Minister Harold Macmillan.  Britain’s official position on Resolution 1514 was that it accepted self-determination as a “principle” although not as a legal “right” (p.34).

But the internal deliberations over Mauritian decolonization which Sands has unearthed suggest that Britain had difficulties accepting self-determination even as a principle. In discussions leading up to independence, Sands indicates, Britain informed the surprised Mauritians that it intended to retain the Chagos archipelago but did not mention the plan for the naval base at Diego Garcia. Facing international condemnation when the detachment arrangement came to world attention, Britain’s Colonial Secretary warned that Britain needed to move quickly before the Mauritians and the world at large learned of the United States’ place in the arrangement, lest Britain “lay ourselves open to an additional charge of dishonesty” (p.44).

To avoid an “additional charge of dishonesty,” the Foreign Office instructed its Ambassador in New York to tell the UN that the Chagos islands “have virtually no permanent inhabitants” (p.44), a “big lie” (p.47) in Sands’ words. But the British Ambassador to the UN was uncomfortable with the word “virtually,” fearing it might raise questions over what that qualification was meant to suggest, and advised that it would be preferable to proceed on the basis that there were “no permanent inhabitants” (p.4) on the islands. The word “virtually” was excised, resulting in an even bigger lie.

When one of the Foreign Office’s legal advisors expressed reservations about this approach, which he considered fraudulent, another countered that there was nothing wrong in law or principle with the forced deportations because Britain could “make up the rules as we go along” (p.45).  The Foreign Office stressed that Britain needed to be “very tough” in managing the public relations fallout from Mauritius and that Chagos should become a place with “no indigenous population except seagulls” (p.47).  Several suits were filed in London in the 1970s and 1980s challenging the forced deportations from Chagos, none of which provided the primary relief Madame Elysé and her fellow Chagossians sought: the right to return to their home islands.

* * *

The major breakthrough in the Chagossians’ quest to reach The Hague occurred decades later, in 2010, when British Foreign Secretary David Miliband announced the creation of a vast “Marine Protected Area” around the Chagos archipelago.  The Marine Protected Area was intended to protect marine biodiversity, burnish Britain’s environmental credentials and, not incidentally, cast its policy toward the Chagossians in a more favorable light.  Diego Garcia was excluded from the special area. Although the proposal was warmly received by environmental groups, no one in Mauritius was consulted about British plans for the area.

Sands’ direct involvement in the case began a few months after Miliband’s announcement, when he conferred with Mauritius’ then-Prime Minister Navin Ramgoolam, a member of the English bar, who wanted to find a way to challenge the lawfulness of the Marine Protected Area.  The two focused upon the novel idea of seeking relief through the United Nations Convention on the Law of the Sea (UNCLOS), which had been finalized in 1982 and ratified by more than 150 countries after more than 10 years of negotiations.

Although many of its terms addressed technical matters like fishing rights and the delimitation of sea boundaries, the UNCLOS also contained new rules on the protection of the marine environment and on the “common heritage of mankind,” giving all states rights to mineral resources under the seabed.  The treaty can thus be considered a “post-colonial instrument” that “sought to give effect to the principle of self-determination” (p.62), Sands writes. He and the Prime Minister settled upon attacking Britain’s new policy on several technical grounds, including that it violated Mauritius’ fishing rights around Chagos, coupled with a broader challenge that Britain was not Chagos’ “coastal state” under the UNCLOS – a direct challenge to the legitimacy of both the detachment of Chagos at the time of independence and Britain’s continued assertion of sovereignty over the archipelago.

The UNCLOS provided for arbitration of disputes, along with dispute resolution at the ICJ and at a new International Tribunal for the Law of the Sea.  For strategic reasons, the Prime Minister and Sands decided to pursue the arbitration route, with Mauritius launching proceedings in December 2010. A single document supported its application: a United States cable obtained and published by WikiLeaks, which quoted a British official telling the United States that we, Britain, “do not regret the removal of the population” (p.88-89), and suggesting that Britain intended to harness the Marine Protected Area to “extinguish forever the Chagossians’ ability to return” (p.89), a positive side effect for a project that had already won the approval of environmentalists.

The arbitration panel’s decision, delivered in March 2015, produced a limited victory for Mauritius. The panel unanimously ruled in Mauritius’ favor on the technical issues it had raised. But it declined to rule on which of the two countries was the “coastal state” under the UNCLOS, the broader challenge to British sovereignty, or on the effect of UNGA Resolution 1514 on the case.  Two dissenting arbitrators, however, agreed with Mauritius’ position on both sets of issues. This partial victory led Mauritius to conclude that the time was ripe to petition the UNGA for a referral of the Chagos case to the ICJ for an advisory opinion, which it did in June 2017.

The June 2016 Brexit referendum, when the United Kingdom voted to leave the European Union, provided Mauritius with an unexpected boost at the UNGA.  Sands notes how British ministers were “waxing lyrically about a new Empire 2.0” (p.101), enough by itself to scare many UNGA member states, less lyrical about British Empire 1.0.  But the “brutal reality” was that Britain “could no longer rely on the unqualified support of EU members and their networks across the UN”  (p.101).  Both within and beyond the EU, Britain’s authority had “suffered a major collapse” (p.102), Sands writes.  Britain fell far short in its effort to defeat the referral resolution, which passed the UNGA by a comfortable margin, with 94 member states voting in favor, 16 against, and 65 abstentions.

There were nine factual and legal points that Sands considered essential to the Chagos case at the ICJ, and he explains how these had to be tailored to appeal to judges from a wide variety of legal systems and cultures. But he and his legal team also wrestled with how to present the human side of the case to the judges. They opted for a video statement from Madame Elysé.  In an intense address of less than 4 minutes, delivered in her native Créole and translated into English and French, she conveyed the circumstances surrounding her forcible uprooting from her home in 1973 with “clarity, force and passion” (p.4), revealing that she had lost her baby during the passage out of her native island.  She finished by telling the court that as she reached her last years, she had one overwhelming desire:  to return home, to the island where she was born.

The United Kingdom’s legal representative urged the court to dismiss the case as a “bi-lateral sovereignty dispute” outside the court’s authority, although he provided assurances that Britain supported the court and the international rule of law. He expressed “deep respects” to the Chagossians, conceding that the manner – although not the fact – of their removal had been “shameful and wrong” (p.126-27).  He expressed no commitment to allow the Chagossians to return. That Britain had paid compensation over the years was “amends enough” (p.127).

In its decision, announced in early 2019, the ICJ rejected Britain’s argument that the case was simply a bilateral territorial dispute. The court found that the Chagossians had been “forcibly removed” and “prevented from returning” (p.132), actions contrary to UNGA Resolution 1514.  Rather than creating a new rule, Resolution 1514 had declared an existing rule of customary law, “one that no state voted against” (p.132).  Because the detachment of Chagos had not been based on the “free and genuine expression of the will of the people concerned” (p.133), it followed that Britain’s continued assertion of sovereignty over the archipelago was a “wrongful act” which should end “as rapidly as possible” (p.133). The resettlement of Mauritian nationals, the ICJ concluded, involved issues “relating to the protection of human rights” (p.133), but those were for the UNGA to address.

A few months later, the UNGA adopted a near-unanimous resolution (116 nation-states in favor, 55 abstentions, and 5 no votes) which amounted to an affirmation of the ICJ decision, stating that the Chagos archipelago “forms an integral part of the territory of Mauritius” and demanding that Britain “withdraw its colonial administration … unconditionally within a period of no more than six months.”  To date, that withdrawal has not happened. Rather, Britain continues to cling to the notion that it retains sovereignty over the Chagos Archipelago and has not recognized Madame Elysé’s right to reinhabit the island of her birth.

Although Mauritius’ Prime Minister Pravind Jugnauth and British Prime Minister Theresa May met in the aftermath of the ICJ decision and UNGA resolution, those meetings ended with May’s defiant written response that sovereignty over the Chagos archipelago “will be ceded when the British Indian Ocean Territory is no longer needed for defense purposes” (p.136), flatly rejecting return of the Chagossians. The stream of diplomatic notes, press statements and answers to parliamentary questions on Mauritius, Sands indicates, almost invariably begins with the same words:

The United Kingdom has no doubt about its sovereignty over the Chagos Archipelago, which has been under continuous British sovereignty since 1814.  Mauritius has never held sovereignty over the Archipelago and we do not recognize its claim  (p.146).

This absence of doubt is particularly striking, Sands writes, “since the British have never been able to persuade any international judge – not even one – to express support for its claim to the archipelago. This raises serious questions about the country’s purported commitment to the rule of law. Two Prime Ministers and five Foreign Secretaries have embraced lawlessness, for reasons that are unclear, hoping to tough it out and make the problem go away” (p.146).

* * *

Sands’ work as a whole raises these questions about Britain’s commitment to the rule of law.  It is also a piece of advocacy, in which Britain’s consistently hardline positions seem almost cartoonish, leaving the reader to wonder whether those positions have more substance than what Sands presents here.  But if Sands is writing more as a lawyer than as a journalist or historian, this searing work nonetheless represents a clear victory in the court of public opinion for Madame Elysé and her fellow Chagossians – and for international justice.

Thomas H. Peebles

Bogotá, Colombia

June 1, 2023

Filed under British History, World History