
Taking Exception To American Foreign Policy

Andrew Bacevich, After the Apocalypse:

America’s Role in a World Transformed (Metropolitan Books 2020)

Andrew Bacevich is one of America’s most relentless and astute critics of United States foreign policy and the role the American military plays in the contemporary world.  Professor Emeritus of History and International Relations at Boston University and presently president of the Quincy Institute for Responsible Statecraft, Bacevich is a graduate of the United States Military Academy who served in the United States Army for over 20 years, including a year in Vietnam.  In his most recent book, After the Apocalypse: America’s Role in a World Transformed, which came out toward the end of 2020, Bacevich makes an impassioned plea for a smaller American military, a demilitarized and more humble US foreign policy, and more realistic assessments of US security and genuine threats to that security, along with greater attention to pressing domestic needs.  Linking these strands is Bacevich’s scathing critique of American exceptionalism, the idea that the United States has a special role to play in maintaining world order and promoting American democratic values beyond its shores.

In February 2022, as I was reading, then writing and thinking about After the Apocalypse, Vladimir Putin continued massing soldiers on Ukraine’s border and threatening war before invading the country on the 24th.  Throughout the month, I found my views of Bacevich’s latest book taking form through the prism of events in Ukraine.  Some of the book’s key points — particularly on NATO, the role of the United States in European defense, and yes, Ukraine — seemed out of sync with my understanding of the facts on the ground and in need of updating.  “Timely” did not appear to be the best adjective to apply to After the Apocalypse.

Bacevich is a difficult thinker to pigeonhole.  While he sometimes describes himself as a conservative,  in After the Apocalypse he speaks the language of those segments of the political left that border on isolationist and recoil at almost all uses of American military force (these are two distinct segments: I find myself dependably in the latter camp but have little affinity with the former).  But Bacevich’s against-the-grain perspective is one that needs to be heard and considered carefully, especially when war’s drumbeat can be heard.

* * *

Bacevich’s recommendations in After the Apocalypse for a decidedly smaller footprint for the United States in its relations with the world include a gradual US withdrawal from NATO, which he considers a Cold War relic, an “exercise in nostalgia, an excuse for pretending that the past is still present” (p.50).  Defending Europe is now “best left to Europeans” (p.50), he argues.  In any reasoned reevaluation of United States foreign policy priorities, moreover, Canada and Mexico should take precedence over European defense.  Threats to Canadian territorial sovereignty as the Arctic melts “matter more to the United States than any danger Russia may pose to Ukraine” (p.169).

I pondered that sentence throughout February 2022, wondering whether Bacevich was at that moment as unequivocal about the United States’ lack of any geopolitical interest in Ukraine as he had been when he wrote After the Apocalypse.  Did he still maintain that the Ukraine-Russia conflict should be left to the Europeans to address?  Was it still his view that the United States has no business defending beleaguered and threatened democracies far from its shores?  The answer to both questions appears to be yes.  Bacevich has had much to say about the conflict since mid-February of this year, but I have been unable to ascertain any movement or modification on these and related points.

In an article appearing in the February 16, 2022, edition of The Nation, thus prior to the invasion, Bacevich described the Ukrainian crisis as posing “minimal risk to the West,” given that Ukraine “possesses ample strength to defend itself against Russian aggression.”  Rather than flexing its muscles in faraway places, the United States should be “modeling liberty, democracy, and humane values here at home. The clear imperative of the moment is to get our own house in order” and avoid “[s]tumbling into yet another needless war.”   In a nutshell, this is After the Apocalypse’s broad vision for American foreign policy. 

Almost immediately after the Russian invasion, Bacevich wrote an op-ed for the Boston Globe characterizing the invasion as a “crime” deserving of “widespread condemnation,” but cautioning against a “rush to judgment.”  He argued that the United States had no vital interests in Ukraine, as evidenced by President Biden’s refusal to commit American military forces to the conflict.  But he argued more forcefully that the United States lacked clean hands to condemn the invasion, given its own war of choice in Iraq in 2003 in defiance of international opinion and the “rules-based international order” (Bacevich’s quotation marks).  “[C]oercive regime change undertaken in total disregard of international law has been central to the American playbook in recent decades,” he wrote.  “By casually meddling in Ukrainian politics in recent years,” he added, alluding most likely to the United States’ support for the 2013-14 “Euromaidan protests” which resulted in the ouster of pro-Russian Ukrainian president Viktor Yanukovych, it had “effectively incited Russia to undertake its reckless invasion.”

Bacevich’s article for The Nation also argued that the idea of American exceptionalism was alive and well in Ukraine, driving US policy.  Bacevich defined the idea hyperbolically as the “conviction that in some mystical way God or Providence or History has charged America with the task of guiding humankind to its intended destiny,” with these ramifications:

We Americans—not the Russians and certainly not the Chinese—are the Chosen People.  We—and only we—are called upon to bring about the triumph of liberty, democracy, and humane values (as we define them), while not so incidentally laying claim to more than our fair share of earthly privileges and prerogatives . . . American exceptionalism justifies American global primacy.

Much of Bacevich’s commentary about the Russian invasion of Ukraine reflects his impatience with short and selected historical memory.  Expansion of NATO into Eastern Europe in the 1990s, Bacevich told Democracy Now in mid-March of this year, “was done in the face of objections by the Russians and now we’re paying the consequences of those objections.”  Russia was then “weak” and “disorganized” and therefore it seemed to be a “low-risk proposition to exploit Russian weakness to advance our objectives.”  While the United States may have been advancing the interests of Eastern European countries who “saw the end of the Cold War as their chance to achieve freedom and prosperity,” American decision-makers after the fall of the Soviet Union nonetheless “acted impetuously and indeed recklessly and now we’re facing the consequences.”

* * *

“Short and selected historical memory” also captures Bacevich’s objections to the idea of American exceptionalism.  As he articulates throughout After the Apocalypse, the idea constitutes a whitewashed version of history, consisting “almost entirely of selectively remembered events” which come “nowhere near offering a complete and accurate record of the past” (p.13).  The recently deceased former US Secretary of State Madeleine Albright’s 1998 pronouncement that America resorts to military force because it is the “indispensable nation” which “stand[s] tall and see[s] further than other countries into the future” (p.6) may be the most familiar statement of American exceptionalism.  But versions of the idea that the United States has a special role to play in history and in the world have been entertained by foreign policy elites of both parties since at least World War II, with the effect if not intention of ignoring or minimizing the dark side of America’s global involvement.

The darkest in Bacevich’s view is the 2003 Iraq war, a war of choice for regime change, based on the false premise that Saddam Hussein possessed weapons of mass destruction.  After the Apocalypse returns repeatedly to the disastrous consequences of the Iraq war, but it is far from the only instance of intervention that fits uncomfortably with the notion of American exceptionalism.  Bacevich cites the CIA-led coup overthrowing the democratically elected government of Iran in 1953, the “epic miscalculation” (p.24) of the Bay of Pigs invasion in 1961, and US complicity in the assassination of South Vietnamese president Ngo Dinh Diem in 1963, not to mention the Vietnam war itself.  When commentators or politicians indulge in American exceptionalism, he notes, they invariably overlook these interventions.

A telling example is an early 2020 article in Foreign Affairs by then-presidential candidate Joe Biden.  Under the altogether conventional title “Why America Must Lead Again,” Biden contended that the United States had “created the free world” through victories in two World Wars and the fall of the Berlin Wall.  The “triumph of democracy and liberalism over fascism and autocracy,” Biden wrote, “does not just define our past.  It will define our future, as well” (p.16).  Not surprisingly, the article omitted any reference to Biden’s support as chairman of the Senate Foreign Relations Committee for the 2003 invasion of Iraq.

Biden had woven “past, present, and future into a single seamless garment” (p.16), Bacevich contends.  By depicting history as a “story of America rising up to thwart distant threats,” he had regurgitated a narrative to which establishment politicians “still instinctively revert in stump speeches or on patriotic occasions” (p.17) — a narrative that in Bacevich’s view “cannot withstand even minimally critical scrutiny” (p.16).  Redefining the United States’ “role in a world transformed,” to borrow from the book’s subtitle, will remain “all but impossible until Americans themselves abandon the conceit that the United States is history’s chosen agent and recognize that the officials who call the shots in Washington are no more able to gauge the destiny of humankind than their counterparts in Berlin or Baku or Beijing” (p.7).

Although history might well mark Putin’s invasion of Ukraine as an apocalyptic event and 2022 as an apocalyptic year, the “apocalypse” of Bacevich’s title refers to the year 2020, when several events brought into plain view the need to rethink American foreign policy.  The inept initial response to the Covid pandemic in the early months of that year highlighted the ever-increasing economic inequalities among Americans.  The killing of George Floyd demonstrated the persistence of stark racial divisions within the country.  And although the book appeared just after the presidential election of 2020, Bacevich would probably have included the assault on the US Capitol in the first week of 2021, rather than the usual transfer of presidential power, among the many policy failures that in his view made the year apocalyptic.  These failures, Bacevich intones:

 ought to have made it clear that a national security paradigm centered on military supremacy, global power projection, decades old formal alliances, and wars that never seemed to end was at best obsolete, if not itself a principal source of self-inflicted wounds.  The costs, approximately a trillion dollars annually, were too high.  The outcomes, ranging from disappointing to abysmal, have come nowhere near to making good on promises issued from the White House, the State Department, or the Pentagon and repeated in the echo chamber of the establishment media (p.3).

In addition to casting doubts on the continued viability of NATO and questioning any US interest in the fate of Ukraine, After the Apocalypse dismisses as a World War II era relic the idea that the United States belongs to a conglomeration of nations known as “the West,” and that it should lead this conglomerate.  Bacevich advocates putting aside “any residual nostalgia for a West that exists only in the imagination” (p.52).  The notion collapsed with the American intervention in Iraq, when the United States embraced an approach to statecraft that eschewed diplomacy and relied on the use of armed force, an approach to which Germany and France objected.  By disregarding their objections and invading Iraq, President George W. Bush “put the torch to the idea of transatlantic unity as a foundation of mutual security” (p.46).  Rather than indulging the notion that whoever leads “the West” leads the world, Bacevich contends that the United States would be better served by repositioning itself as a “nation that stands both apart from and alongside other members of a global community” (p.32).

After the apocalypse – that is, after the year 2020 – the repositioning that will redefine America’s role in a world transformed should be undertaken from what Bacevich terms a “posture of sustainable self-sufficiency” as an alternative to the present “failed strategy of military hegemony” (p.166).  Sustainable self-sufficiency, he is quick to point out, is not a “euphemism for isolationism” (p.170).  The government of the United States “can and should encourage global trade, investment, travel, scientific collaboration, educational exchanges, and sound environmental practices” (p.170).  In the 21st century, international politics “will – or at least should – center on reducing inequality, curbing the further spread of military fanaticism, and averting a total breakdown of the natural world” (p.51).  But before the United States can lead on these matters, it “should begin by amending its own failings” (p.51), starting with concerted efforts to bridge the racial divide within the United States.

A substantial portion of After the Apocalypse focuses on how racial bias has infected the formulation of United States foreign policy from its earliest years.  Race “subverts America’s self-assigned role of freedom,” Bacevich writes.  “It did so in 1776 and it does so still today” (p.104).  Those who traditionally presided over the formulation of American foreign policy have “understood it to be a white enterprise.”  Non-whites “might be called upon to wage war,” he emphasizes, but “white Americans always directed it” (p.119).  The New York Times’ 1619 Project, which seeks to show the centrality of slavery to the founding and subsequent history of the United States, plainly fascinates Bacevich.  The project in his view serves as an historically based corrective to another form of American exceptionalism, questioning the “very foundation of the nation’s political legitimacy” (p.155).

After the Apocalypse raises many salient points about how American foreign policy interacts with other priorities as varied as economic inequality, climate change, health care, and rebuilding American infrastructure.  But it leaves the impression that America’s relationships with the rest of the world have rested in recent decades almost exclusively on flexing American military muscle – the “failed strategy of militarized hegemony.”  Bacevich says little about what is commonly termed “soft power,” a fluid term that stands in contrast to military power (and in contrast to punitive sanctions of the type being imposed presently on Russia).  Soft power can include such forms of public diplomacy as cultural and student exchanges, along with technical assistance, all of which have a strong track record in quietly advancing US interests abroad.

* * *

To date, five full weeks into the Ukrainian crisis, the United States has conspicuously rejected the “failed strategy of militarized hegemony.”  Early in the crisis, well before the February 24th invasion, President Biden took the military option off the table in defending Ukraine.  Although Ukrainians would surely welcome direct military intervention on their behalf, as of this writing NATO and the Western powers are fighting back through stringent economic sanctions – diplomacy with a very hard edge – and provision of weaponry to the Ukrainians so they can fight their own battle, in no small measure to avoid a direct nuclear confrontation with the world’s other nuclear superpower.

The notion of “the West” may have seemed amorphous and NATO listless prior to the Russian invasion.  But both appear reinvigorated and uncharacteristically united in their determination to oppose Russian aggression.  The United States, moreover, appears to be leading both, without direct military involvement but far from heavy-handedly, collaborating closely with its European and NATO partners.  Yet, none of Bacevich’s writings on Ukraine hint that the United States might be on a more prudent course this time.

Of course, no one knows how or when the Ukraine crisis will terminate.  We can only speculate on the long-term impact of the crisis on Ukraine and Russia, and on NATO, “the West,” and the United States.  Ukraine 2022 may well figure as a future data point in American exceptionalism, another example of the “triumph of democracy and liberalism over fascism and autocracy,” to borrow from President Biden’s Foreign Affairs article.  But it could also be one of the data points that its proponents choose to overlook.

Thomas H. Peebles

La Châtaigneraie, France

March 30, 2022

Filed under American Politics, American Society, Eastern Europe, Politics

Breaking Away

J.H. Elliott, Scots and Catalans:

Union and Disunion (Yale University Press)

[NOTE: This review has also been posted to the Tocqueville 21 blog, maintained in connection with the American University of Paris’ Tocqueville Review and its Center for Critical Democracy Studies]

Are the United Kingdom and Scotland barreling toward a crisis over Scottish independence of the magnitude of that which rattled Spain in 2017, when Catalonia, the country’s northeast corner that includes Barcelona, unilaterally declared its independence? That possibility seems less far-fetched after early May’s parliamentary elections in Scotland, in which the Scottish National Party (SNP) fell just one seat shy of an absolute majority. In coalition with the Scottish Green Party, the SNP is now in a position to set the legislative agenda for Scotland. To no one’s surprise, Nicola Sturgeon, Scottish First Minister and SNP leader, announced after the recent elections that she would seek a second referendum on Scottish independence, presumably similar to the one that took place in 2014. For Sturgeon, a second independence referendum is now a matter of “when, not if.”  But British Prime Minister Boris Johnson reiterated his opposition to another referendum; that of 2014 was a “once in a generation” event, the Prime Minister explained.

Separatism also advanced appreciably in Catalan regional elections in February of this year, with pro-independence parties capturing a clear majority of seats in the regional parliament. But numerous parties with a range of views on separation seek to carry the independence banner in Catalonia. The movement has no single voice comparable to that of Sturgeon and the SNP.

While no one can say with certainty where Scotland and Catalonia are heading, J.H. Elliott, Regius Professor Emeritus at Oxford University, has produced an extraordinarily timely, in-depth guide to how separatism has come to dominate the 21st century politics of each: Scots and Catalans: Union and Disunion. From the mid-15th century up through the Catalan crisis of 2017, Elliott traces the relationship of Scotland and Catalonia to the larger entities we now call Great Britain and Spain, relationships in which genuine grievances mix with myths, resentments, and manipulations of history.

The Catalan crisis of 2017, the endpoint in Elliott’s narrative, ensued after regional authorities organized a non-binding independence referendum, conducted over the strong objection of the central government in Madrid.  Ninety percent of Catalans who voted approved the referendum, but several major Catalan parties boycotted it and only 43% of eligible voters actually voted. When the Catalan regional parliament adopted a resolution declaring the region an independent republic, the central government responded by invoking the 1978 Spanish constitution to remove regional authorities and impose direct rule from Madrid.  Carles Puigdemont, the Catalan regional president, was formally accused of rebellion and sedition and fled to Belgium with key members of his cabinet, where he remains to this day.

In sharp contrast to the 2017 Catalan initiative, the 2014 Scottish independence referendum had the approval of the central government in London, having been negotiated by then-Prime Minister David Cameron.  Scottish voters moreover soundly rejected independence, 55% to 45%, with 85% of eligible voters casting ballots. But one of the main issues in the campaign was the desire of many Scottish voters to maintain membership in the European Union as part of the United Kingdom, rather than secede and apply for EU membership as an independent nation. The Brexit referendum two years later, also a Cameron-approved measure, upended this understanding. While a far from united United Kingdom approved the initiative to leave the European Union, Scottish voters adhered to the “remain” position by an emphatic 62%-38% margin, with about two-thirds of eligible Scottish voters participating.

Elliott is scathing in his condemnation of the Catalonian secessionists’ decision to press ahead in 2017 with their unilateral declaration of independence, describing it as an “act of folly, unleashing consequences that never seemed to have crossed the proponents’ minds as they took the plunge” (p.263).  In more muted terms, he appears to endorse the outcome of the orderly 2014 referendum in Scotland: “Stability had triumphed over risk, pragmatism over utopianism, fear over hope” (p.246).  But Elliott treats the Brexit referendum two years later only in two non-judgmental paragraphs.  Many Scots who voted “No” in 2014 have felt compelled to reassess their position in light of Brexit. Elliott’s decision not to weigh in more forcefully on the impact of Brexit constitutes a missed opportunity in this otherwise painstakingly comprehensive work.

Although Elliott focuses almost exclusively on the Catalan and Scottish independence movements, easily the most visible in today’s Europe, they are hardly the only ones. Depending upon how one counts, there are presently about 20 active separatist movements in Europe, some of which seem to be mainly quests for more autonomy rather than secession.  Finding common denominators among them can be difficult – each is mostly a product of its own historical and cultural circumstances. But nationalism is usually considered one such denominator, often the only one, and what Elliott terms a “resurgent nationalism” (p.4) is at play in both Catalonia and Scotland.

These and other 21st century secessionist movements harken back to the classical 19th century European version of nationalism: the idea that a people with a common culture and history — and often a common language, as in Catalonia – have an inherent right to rule themselves. This idea, which buttressed Europe’s 1848 uprisings, produced the modern nation-state, a state with a nationalist creed binding it together — a common core of shared principles, traditions and values accepted by its disparate regions, and its major ethnic, religious, and cultural groups. But separatist movements in Scotland, Catalonia and elsewhere are predicated on a rejection, implicit if not explicit, of the nationalist creed and in this sense are the antipode of classical 19th century nationalism. Some separatist movements partake of xenophobic and authoritarian-leaning nationalist impulses. But neither the Scottish nor the Catalan independence movement can be described in these terms – if anything, both Scotland and Catalonia tilt leftward on 21st century Europe’s left-right pendulum.

Scots and Catalans consists of six chapters, each focused on a discrete historical period. It begins with “Dynastic Union, 1469-1625”: 1469, the year in which Ferdinand of Aragon married Isabella of Castile, marked the beginning of a composite, multi-regional monarchy on the Iberian Peninsula, with the Crown of Aragon including the principality of Catalonia. The last chapter, “Breaking Away? 1975-2017,” covers the time from the death of Spanish dictator General Francisco Franco and the beginnings of modern democracy in Spain in 1975, up through the Catalan constitutional crisis of 2017.  Unlike many comparative historians, Elliott does not rely on separate chapters for his two subjects. His narrative goes back and forth between Catalonia and Scotland, Spain and Britain, setting out the two histories side-by-side. Although not quite his intention, this technique highlights how different Catalonia’s relationship to Spain has been from that of Scotland to Great Britain from the early 18th century onward. Only in the late 20th and early 21st centuries does Elliott find significant convergences between the two independence movements.

* * *

Prior to its 1707 union with England, Scotland had been an independent kingdom, one shaken by the 17th century’s religious and civil wars that had upended its more powerful neighbor to the South.  Catalonia, by contrast, had never been a sovereign state in any modern sense of the term.  But as one of several rebellious provinces within Spain’s composite monarchy, Catalonia had a colorable claim to a set of ancient liberties and privileges that the Nueva Planta decrees of Philip of Anjou, the first Bourbon King of Spain, erased between 1707 and 1716.

Designed to impose the centralized French model on Spain’s unruly provinces, the Nueva Planta decrees abolished the Catalan legislature and imposed the Castilian language – today’s Spanish — on the region. While Scotland’s consensual association with England was the result of genuine negotiations between two sovereign kingdoms, Catalonia was “subjected to a settlement imposed by a victorious monarch, who stigmatized its peoples as rebels” (p.89).  Catalonia came to be seen, both by its citizens and the central government in Madrid, as a territory under military occupation.

Throughout the 18th and 19th centuries, the feeling in Spain that the Catalans were inherently intractable never disappeared. Catalans, constantly inveighing against “centralization,” responded to pressures from Madrid by emphasizing with “growing stridency” the “uniqueness of their own history and culture” (p.163), Elliott writes. By contrast, the Scots felt less need to be assertive about their distinctive heritage, and less obsessed about their potential loss of identity. Tensions between London and Edinburgh were “far fewer than those to be found in the Barcelona-Madrid relationship” (p.163).

Spain fell under the rule of two military dictatorships in the 20th century. That of Primo de Rivera, from 1923 to 1930, preceded the 1936-39 Spanish Civil War and the ensuing Franco regime, which lasted until the General’s death in 1975. Both Primo de Rivera and Franco pursued national unity by ruthlessly suppressing regionalist tendencies across Spain. But Franco probably distrusted Catalonia more than any other region during his long rule. Spain did not begin its transition to a modern democratic nation-state until after Franco’s death.

In 1979, following Spain’s first free elections since the 1930s, held in 1977, Catalan voters approved a statute of autonomy for the region that recognized Catalonia as a “nationality,” gave the Catalan language an official status equal to Castilian Spanish, and conceded extensive powers to Catalonia in education, culture and language. Catalonia henceforth became what Elliott describes as an “integral but largely self-governing part of what the bulk of its inhabitants had long wanted – a democratic, decentralized and modernizing Spain” (p.229).

1979 was also the year Margaret Thatcher and her Conservative Party were voted into office in Britain. Thatcher moved quickly to shut down all talk of “devolution,” which envisioned re-establishing the Scottish parliament and according more autonomy to Scotland. In Elliott’s view, Thatcher probably did more to spur the modern separatist movement in Scotland than any other single individual. Devolution came to Scotland in 1997, when Scottish voters approved creation of a new Scottish parliament, its first since the 1707 union with England. By 1997, Scotland enjoyed approximately the same degree of autonomy from the central government in Westminster that Catalonia had achieved in 1979.

Elliott further fits both independence movements into a broader 21st century framework, wherein pressures upon the traditional nation-state from above, driven by the European Union, economic inequalities, and what we often term globalization, have generated a “general sense in many parts of the western world that highly bureaucratized central governments [have] become too remote to understand the true needs and problems of the governed” (p.3).  Separatism for Scotland and Catalonia, as elsewhere, appears to offer an easy answer to those who feel they have lost control over their lives. “Independence [will] allow them once again to be masters in their own house,” he writes. But much of this, he adds tartly, referring more to Catalonia than Scotland, is “nostalgia for a world that never was” (p.267).

* * *

A second independence referendum for Scotland – and with it Scottish independence – now appears, if not inevitable, more probable than not, despite Boris Johnson’s opposition. As Scottish journalist Jamie Maxwell wrote in the New York Times after the May elections, a Johnson veto would be tantamount to “transforming Britain from a voluntary association based on consent into a compulsory one” – an ironic echo of the way Catalan secessionists view their relationship to Spain.

Continued political stalemate, rather than realistic prospects for independence, looks like the better bet for Catalonia. The region lacks a leader comparable to Sturgeon, who has ruled out a “wildcat referendum” and is generally cautious, steady and unusually adept at playing the long game – words rarely used to describe former Catalan regional president Carles Puigdemont.  Sturgeon seems confident that Johnson will “ultimately buckle under the weight of democratic pressure,” as Maxwell puts it.  Independence may nevertheless be in the cards in this decade for both Scotland and Catalonia.  But in demonstrating the deep historical dissimilarities between Scotland’s relationship to Great Britain and Catalonia’s to Spain, Elliott’s erudite history suggests that the two entities are likely to travel distinctly different paths to independence.

Thomas H. Peebles

Paris, France

July 8, 2021

Filed under British History, European History, Spanish History

Papa Franz’ Columbia Circle

Charles King, Gods of the Upper Air:

How A Circle of Renegade Anthropologists Reinvented Race, Sex, and Gender

In the Twentieth Century (Doubleday)

A book billed as an inside look at the anthropology department of Columbia University from the 1890s through the 1940s seems unlikely to send readers scurrying for a copy.  But readers might be inclined to scurry if they knew that in this timeframe, a small circle of anthropologists associated with Columbia essentially rewrote the books on anthropology and more generally on human nature, giving shape to modern ways in which we think about issues of race, sex and gender, along with what we mean by culture and how we might understand people living in societies very different from our own.  These epic transformations in thinking and the anthropologists behind them constitute the subject of Charles King’s engaging Gods of the Upper Air: How A Circle of Renegade Anthropologists Reinvented Race, Sex, and Gender In the Twentieth Century.

King’s work revolves around Franz Boas (1856-1942), who taught in Columbia’s anthropology department off and on from 1887 through the late 1930s, and three of his star students, all female: Margaret Mead (1901-1978), Ruth Benedict (1887-1948), and Zora Neale Hurston (1891-1960).   The cantankerous Papa Franz, as he was known, was a German immigrant who made a career of warning against jumping from one’s own “culture-bound schemas to pontificating about the Nature of Man” (p.247), as King puts it.  More than any other intellectual of his era, Boas attacked the pseudo-science that seemed to support society’s deepest prejudices, jousting frequently with late 19th and early 20th century racial theorists who “confidently pronounced that they had all of humanity figured out” (p.247).

Mead and Benedict are today better known than Boas, often thought of together as 20th century pioneers in anthropology and the social sciences. Mead gained fame for her studies of adolescent girls in far-flung places, and how they formed their attitudes toward sex and gender roles.  Benedict almost singlehandedly refined and redefined how we think about the word “culture,” coining the term “cultural relativity.”  But the two pioneering anthropologists also enjoyed an intimate personal relationship throughout much of their adult lives, even as Mead regularly ran through and disposed of husbands.  King provides probing detail on the Mead-Benedict relationship and the many men in Mead’s complex personal life.   Hurston, an African American, was a talented novelist, poet and essayist as well as anthropologist.  Although she lacked Mead’s or Benedict’s public profile in her lifetime, she has vaulted since her death into the upper echelon of 20th century African-American intellectuals, especially after being “rediscovered” by the poet and novelist Alice Walker in 1975.

King, a professor of international affairs and government at Georgetown University, ably captures how Papa Franz and his circle of renegade anthropologists used Columbia as a point of departure while traveling to the furthest reaches of the globe to develop their insights on human nature and human cultures.  While their insights varied, the four Columbia anthropologists all saw humanity as an indivisible whole.  They put into practice the notion that we can best understand other societies with a data-driven methodology, where conclusions are always subject to refinement and change.  Social categories, such as race and gender, they agreed, should be considered artificial, the products of “human artifice, residing in the mental frameworks and unconscious habits of a given society” (p.10).  For all four, the “most enduring prejudices” were the “comfortable ones, those hidden up close; seeing the world as it is requires some distance, a view from the upper air” (p.345).

To the personal stories and professional thinking of Columbia’s renegade anthropologists, King deftly adds rich detail on their cohorts and contemporaries and the times in which they all lived.  The resulting work, written in a mellifluous style, is at once riveting and surprisingly easy to understand – ample reason to scurry for a copy.

* * *

Franz Boas was born in 1856 into an assimilated Jewish family in Prussia, before Germany had become a unified country.  At age 28, he set out to study migration patterns of the Inuit, the indigenous people on Baffin Island in the Arctic.  Boas actually lived among the Inuit people, a novelty for his time. When he put together his conclusions from his time on the island, he began using the German word Herzensbildung, the “training of one’s heart to see the humanity of another” (p.30), a notion that would shape his overall approach to anthropology over the next sixty years.

Boas immigrated to the United States in 1884, primarily to pursue his courtship of his future wife, the Austrian American Marie Krackowizer.   Anthropology was then a term, King explains, that people were beginning to use for the combination of travel, artifact collection, language learning, and bone hunting.  But for Boas, anthropology was a data-driven discipline, a form of social science.  More than his peers, Boas emphasized the relationship between the data and the practitioner. “What counted as social scientific data – the specific observations that researchers jotted down in their field notes – was relative to the worldview, skill sets, and preexisting categories of the researchers themselves” (p.71).   A good anthropologist had to be committed to the critical refinement of his or her own experience in light of data gathered.  That was the “whole point of purposefully throwing yourself into the most foreign and remote of places. You had to gather things up before you refined them down” (p.247).

Boas’s penchant for following the data put him on what King describes as a “collision course with his adopted country’s most time-honored way of understanding itself, a cultural obsession that Europeans and Americans had learned to call race” (p.77).  In the late 19th and early 20th centuries, the concept of race was central to the field of anthropology, part of an “unshakable natural order” (p.79).  Humans had races in the same way that other animals had stocks or pedigrees.  A person’s lips, hair texture, nose or head shape, and skin tone all confirmed the multiplicity of human races, arranged in a sort of pyramid, with the white “races” of Northern and Western Europe and Protestant America at its apex.

Boas set forth his most comprehensive rejoinder to the early 20th century race theories purporting to be based on science in his 1911 book, The Mind of Primitive Man.   Physical traits were a “poor guide to distinguishing advanced peoples from more backward ones” (p.100), Boas contended.  Not only was there “no bright line dividing one race from another, but the immense variation within racial categories called into question the utility of the concept itself” (p.101).  European success in exploiting resources in Africa and American success in settling the North American continent were not due to some inherent superiority on the part of the people typically called “civilized.”  Chance and time could be “equally good explanations for disparities in achievement” (p.100), he suggested.

Our ideas about race are themselves products of history, Boas implied, a “rationalization for something a group of people desperately want to believe” (p.106).  The pseudo-scientific racial theories that abounded in early 20th century Europe and America helped convince people that they are “higher, better and more advanced than some other group.  Race was how Europeans [and Americans] explained to themselves their own sense of privilege and achievement” (p.106).  For Boas, the spread of Europeans overseas during the age of exploration and the establishment of empires across the lands they conquered may have “cut short whatever material and cultural development had been in process there” (p.100).

Boas died in 1942, a time when racial theories emanating from his native Germany, then in the throes of Nazi rule, were being applied to exterminate Europe’s Jewish population.  On the day he died, he purportedly told a refugee from Nazi-occupied Paris, “We should never stop repeating the idea that racism is a monstrous error and an impudent lie” (p.316).  Among Boas’ disciples, Margaret Mead was considered his closest intellectual heir.  Through Mead, Boas’ core ideas “lived on and spread to a broader audience than Papa Franz ever could have dreamed” (p.338), King writes.

Mead, who grew up in a highly educated Philadelphia family, graduated in 1923 from Barnard, Columbia’s “sister” school.  From there, she became one of the first women to enroll in Papa Franz’ fiefdom,  Columbia’s graduate program in anthropology.  Under Boas’ guidance, Mead charted a “new way of doing anthropology itself” (p.148).  She “wanted to know about peoples’ lives: how they thought about childhood and aging, what it meant to be an adult, what they thought of as sexual pleasure, whom they loved, when they felt the sting of public humiliation or the gnawing sickness of private shame” (p.148).  What set Mead apart from her peers was that she determined to do this with the “invisible mass of people whom anthropologists . . . always seemed to miss – women and girls” (p.148).

After Mead completed her dissertation, Boas suggested that she conduct first-hand field research, much as he had done as a young man on Baffin Island, and pointed her to American Samoa, a United States territory in the South Pacific.  Mead spent much of her time in three villages on the remote island of Ta’u.  The point of examining Samoa was to “see the schemes that people halfway around the world, in a very different environment, climate, and culture, had devised for rendering children into adults” (p.163).  To understand the lives, fears, passions, and worries of adolescent girls, Mead spent her time talking directly to them, the “true experts of the crisis of adolescence” (p.167).

The result of Mead’s study of adolescent girls in the Ta’u villages was Coming of Age in Samoa.  The book’s basic claim was that the Samoans of Ta’u “did not conceive of adolescence in precisely the same way that Americans tended to see it” (p.167).  Samoan girls knew as much about sex as their counterparts in New York, probably more, Mead found.  But she observed no real sense of romantic love, inextricably linked in Western societies with monogamy, exclusiveness, jealousy, and undeviating fidelity.

Growing up in New Guinea, Mead’s sequel to Coming of Age in Samoa, appeared in 1930, before she was 30.  Given her frank discussions of sex and her “refusal to acknowledge the self-evident superiority of Western Civilization,” Mead was already considered an “outspoken, even scandalous public scientist” (p.185).  Seemingly overnight, she had become “one of the country’s foremost experts on the relevance of the most remote parts of the globe for understanding what was happening back home” (p.185).  From that point until her death in 1978, Mead was the “face of her discipline, the epitome of an engaged scholar,” even though other academics considered her “somehow outside the mainstream” (p.340).  King summarizes Mead’s core idea as a full recognition of women as human beings, “with the power to choose whatever social roles they wanted – mothers and caretakers as well as anthropologists and poets” (p.339).

As a young woman, Mead had enrolled in Columbia’s graduate program in anthropology at the urging of Ruth Benedict, fourteen years Mead’s senior and already a respected anthropologist.  Benedict served initially as Mead’s teacher, mentor and intellectual anchor.  Thereafter, their relationship evolved into something more intimate and decidedly more complicated.  But it was never quite the relationship Benedict hoped for.

Before she arrived at Columbia, Mead had married Luther Cressman, then a theology student and later an Episcopalian minister.  By the time Boas suggested she travel to American Samoa, Mead was having an affair with a prominent Canadian anthropologist, Edward Sapir, a former student of Boas, even though she was then finding herself increasingly attracted to Benedict.  Another dashing male lover later replaced Sapir, with Benedict serving as what might be unceremoniously described as Mead’s “backup.”  At least two other men subsequently swept Mead away. The players may have been different for Mead in the often cruel game of love, King writes, but it was always the same script, with Mead returning to Benedict until the next Mr. Right Now came along.  Mead’s enduring but erratic love for Benedict, King suggests, underscores her life-long inability to “settle down to one kind of relationship, whether with one person or with one gender” (p.258).

Benedict was always disappointed when the object of her affection moved from one man to the next (there’s no indication of other women in Mead’s life).  But she was herself a formidable anthropologist who rose to be Boas’ chief assistant at Columbia and was primed to become the department chairman upon his retirement, only to have the position given to a man from outside the university.  Unlike Mead, who was most interested in how individuals function within the structures that a given society constructs, Benedict was a “big picture” theorist, fashioning some of anthropology’s most sweeping insights about those structures.

In her signature work, Patterns of Culture, published in 1934, which King describes as “arguably the most cited and most taught work of anthropological grand theory ever” (p.267), Benedict argued that real analysis of human societies starts with discarding prior assumptions that one’s own way of seeing the world is universal.  Paying attention to broad patterns enables one to grasp what makes a society “both different from all others and intrinsically meaningful to itself – its way of seeing social life, custom, and ritual, of defining the goals and pathways of life itself” (p.265).  All societies, each with its own coherence and sense of integration that “allows for individuals inside that society to find the way from childhood to adulthood,” Benedict argued, are “just snippets of a ‘great arc’ of possible ways of behaving” (p.264).

During World War II, while in Washington working at the Office of War Information, Benedict wrote her final book, The Chrysanthemum and the Sword.  Benedict was tasked with explaining Japan to America’s policy makers, part of an effort to understand the country’s enemy.  The standard view within the United States government was that the conflict in the Pacific, unlike that in Europe, was “nothing less than a struggle for racial dominance” (p.320).  The Japanese were considered inherently sneaky, treacherous, untrustworthy, and given to a fanatical allegiance to their country, whereas Germany was made up of essentially good people whose government had been hijacked by an evil clique.

Although she had no serious expertise in Japan, and no way to study Japanese culture first-hand in wartime, Benedict aimed to counter the prevailing US government view of the Japanese.  The point of her title, The Chrysanthemum and the Sword, was that a society that had “delicate, refined ideas of beauty and creative expression could also value militarism, honor, and subservience” (p.327).  The work was made available to the general public in 1946.  In the years that followed, King notes, The Chrysanthemum and the Sword earned a “good claim to being the most widely read piece of anthropology ever written” (p.330).

Benedict wanted to go to Japan with the American occupation after the war, but was turned down as too old and, likely, because she was a woman.  After an exhausting trip to Europe in 1948, Benedict, then age 61, died suddenly of a heart attack.  Over her long career,  King writes, Benedict provided a “clearer definition than anyone before her of how social science could be its own design for living.”  She distilled what she had seen, where she had been, and what she was into a “code that was at once analytically sharp and deeply moral” (p.266).

While Zora Neale Hurston did not come close in her lifetime to achieving the high profile of Benedict and Mead, King suggests that this was due at least in part to the same racism that impeded all African-Americans in her time.  The “chasm of race,” he writes, “separated Hurston from the other members of the Boas circle, even at a time when Boas’ students were assiduously denying that race was a fundamental division in human societies” (p.293).

Hurston was born in Alabama but grew up in Central Florida.   All four of her grandparents had been slaves.   Like Mead, she enrolled at Barnard and from there found her way to Columbia’s anthropology department.  Simultaneously, Hurston became part of the African-American intellectual and cultural movement known as the Harlem Renaissance, a “sweeping experiment in redefining blackness in a country that had been built on defining it for you” (p.193), as King puts it.  She became close to many of its leading luminaries, particularly the poet Langston Hughes.

Hurston returned to her native Central Florida at a time when Ku Klux Klan terror was widespread.  A “fully formed yet unappreciated recipe for living as a human being seemed to be lurking in the dense pinelands and lakeshores of northern and central Florida” (p.201-02), King writes.  More than Mead or Benedict, Hurston “found her calling in fieldwork” (p.201).  No member of the Boas circle could claim to have gone as deeply as Hurston into the “lived experience of the people she was trying to understand” (p.292).

The result of Hurston’s work in Florida was Mules and Men, published in 1935.  Mules and Men marked an unprecedented effort to send the reader “deep inside southern black towns and work camps – not as an observer but as a kind of participant” (p.212).  Boas wrote the book’s preface, describing it as the first attempt to understand the “true inner life of the Negro” (p.212).   Mules and Men confirmed the “basic humanity of people who were thought to have lost it, either because of some innate inferiority or because of the cultural spoilage produced by generations of enslavement” (p.214).

Mules and Men appeared the same year as another of Mead’s major studies, Sex and Temperament.  The critics did not view the two works in equal terms. “Volumes on Samoans or New Guineans were hailed as commentaries on the universal features of human society,” King observes, whereas one about African Americans in the American South was a “quaint bit of storytelling” (p.275).  Hurston subsequently spent time in Jamaica and Haiti, producing significant works on voodoo and folklore, while she also churned out essays, short stories and novels.  King derived his title from a deleted chapter in Hurston’s 1942 autobiography, Dust Tracks on a Road, where she wrote that the “gods of the upper air” had uncovered many new faces for her eyes.

Hurston died unheralded in 1960.  But in 1975, poet and novelist Alice Walker wrote an essay for Ms. magazine in which she recorded her efforts to retrace Hurston’s life journey.  Hurston, Walker wrote, was “one of the most significant unread authors in America” (p.336). Walker’s essay marked the start of a Hurston revival that would “elevate her into the pantheon of great American writers, with an almost cult-like following” (p.337). Today, King suggests, Hurston’s reputation arguably exceeds that of Langston Hughes and her other contemporaries of the Harlem Renaissance.

* * *

Boas and the Columbia anthropologists in his circle steered human knowledge in a remarkable direction, King concludes, “toward giving up the belief that all history leads inexorably to us” (p.343).  They deserve credit for expanding  the range of people who should be “treated as full, purposive, and dignified human beings” (p.343).  But Boas would be the first to admit that expansion of  that range remains a work in progress.


Thomas H. Peebles

La Châtaigneraie, France

December 1, 2020



Filed under American Society, European History, Gender Issues, Intellectual History, Science

Father and Son and Nazi Art


Mary Lane, Hitler’s Last Hostages:

Looted Art and the Soul of the Third Reich

(PublicAffairs)

In November 2013, Mary Lane, chief European art correspondent for the Wall Street Journal and all of 26 years old, was in New York to attend an art auction at Christie’s when her editor called and asked her to fly to Berlin immediately to cover a breaking story: a German magazine, Focus, had just revealed that nearly a year earlier a trove of approximately 1,200 artworks reportedly stolen by Adolf Hitler’s Nazi regime, including works by Pierre-Auguste Renoir, Pablo Picasso, Edgar Degas, and Henri Matisse, had been discovered by German authorities in the Munich apartment of Cornelius Gurlitt, a reclusive octogenarian, in the course of a tax investigation.  If authentic, the works were clearly worth many millions of dollars.

Lane got her story out that November, then spent the next several years looking into the story behind the story.  The result is Hitler’s Last Hostages: Looted Art and the Soul of the Third Reich, which lays out how Cornelius’ father, Hildebrand Gurlitt, had amassed these and other paintings (along with some sculptures, woodcuts and etchings) while working on Adolf Hitler’s obsessional dream of the Führermuseum, a museum to be built in his boyhood city of Linz, Austria, to showcase the art which the Nazis had stolen from museums, galleries, and private collections across Europe.  The Gurlitt case is intriguing, as Lane amply demonstrates, but hardly singular.  The Nazis stole a staggering amount of artwork during their murderous twelve years in power.

What Lane terms the “largest art heist in history” (p.122) includes approximately 600,000 paintings stolen from Jews alone, at least 100,000 of which are still missing, according to Stuart Eizenstat, United States  State Department expert advisor for Holocaust issues.  Eizenstat characterized the looting as “not only designed to enrich the Third Reich, but also an integral part of the Nazi goal of eliminating all vestiges of Jewish identity and culture.”  Eizenstat was the primary negotiator of the “Washington Principles,” a set of terms agreed upon in December 1998 by 44 countries, including Germany, Switzerland and Austria, to facilitate the return of Nazi-confiscated artworks to their lawful owners or compensate them.  The principles were more moral commitments than legal constraints, to be implemented within each country’s legal framework.   Since the principles were adopted, efforts to restore confiscated artworks to their rightful owners or their families have intensified.  Yet one of Lane’s most startling discoveries was that in the Gurlitt case Germany demonstrated a surprisingly tepid commitment to the Washington Principles.

Lane seeks to place the father-and-son Gurlitt case within the broader context of how art figured into the racist ideology of the Nazi regime.  She provides much biographical information on Hitler’s youth and especially his artistic pretensions prior to World War I — her first full chapter, for example, is entitled “Portrait of the Dictator as a Young Man.”   Hitler was “genuinely obsessed with art” (p.7), she observes at the outset, considering himself an artist first and a politician second.

In elaborating upon how integral art was to the overall Nazi project, Lane emphasizes the role that Hitler’s sycophantic propagandist Joseph Goebbels played in prioritizing Hitler’s vision of what he termed “Aryan art” and ridding Europe of its opposite, “degenerate art.”  These terms were never satisfactorily defined, but in the Nazis’ binary world, Aryan art tended toward romantic landscapes, classical nudes and depictions of the heroic endeavors of the German people, whereas “degenerate art” usually referred to contemporary works, works that contained unpatriotic or overtly sexual themes, or were produced by Jewish artists – and often a mixture of these factors.   Lane adds specificity to her story by tracing the fate of two confiscated paintings that were discovered in Cornelius’ possession in 2012 and the effort thereafter to return them to their rightful owners: German Jewish impressionist Max Liebermann’s 1921 Two Riders on the Beach, inspired by the equestrian paintings of Edgar Degas; and Henri Matisse’s Woman With a Fan, a 1901 portrait of a “creamy-skinned brunette with a flowered blouse waving a fan to ward off the summer heat” (p.159).

Lane also takes an unusually long look at George Grosz, a contemporary of Hitler who like the future Führer served in World War I and gained prominence – or notoriety – through his brutal depictions of the war’s realities.  After the war, Grosz was identified with the Dada art movement, which portrayed the follies of war in satirical and often nonsensical images.  He further burnished his reputation with his graphic sexual representations.  Grosz became an outspoken and highly visible opponent of Hitler and his party.  To the Nazis, he represented degenerate art at its most degenerate.  After Grosz fled to the United States in 1933, some of his paintings wound up in the Gurlitt trove.

At times, Grosz seems to be the main protagonist of Lane’s story.  She devotes extensive portions of her book to him, presumably to demonstrate what principled artistic opposition to Hitler entailed.  But the Grosz sections are not an easy fit with the rest of her narrative.  The Gurlitt case takes up only about half of this volume, but it is easily the more compelling half.

* * *

Hildebrand Gurlitt was born in 1895 in Dresden, and grew up in an artistic milieu. His father was a respected art historian whose tastes favored contemporary artists rather than old masters.  Hildebrand’s  maternal grandmother was Jewish, making him vulnerable when the Nazis came to power in 1933.  In the 1920s, Gurlitt became the director of a small-town museum where he promoted contemporary art and numerous Jewish artists, while engaging simultaneously in the ethically dubious practice of brokering sales.  He then moved to head the Hamburg Art Association, but was fired from the position shortly after the Nazis came to power, both because his preference for avant-garde art clashed with the Nazis’ artistic tastes and because he refused to fly the Nazi flag outside the Association’s building.

As the Nazis’ virulent anti-Semitism increased, Gurlitt realized that as a one-quarter Jew who was no fan of the Nazis, he had to “leave the country, join the resistance, retreat into obscurity, or collaborate with the Nazis” (p.127).  Gurlitt chose the last option, becoming in 1938 one of four officially designated art dealers authorized to help liquidate Nazi-confiscated artworks to support the Führermuseum project.  Hitler and Goebbels envisioned financing the project by seizing paintings and other artworks from galleries and museums across the country — and, later, in countries they planned to conquer — and destroying most “degenerate” pieces but selectively selling others across the continent to increase their foreign currency reserves to finance their war efforts.

Gurlitt used his extensive international connections to put together deals for the acquisition of works for the Führermuseum, many of which took place in France and the Netherlands after the Nazis occupied those countries.  Gurlitt generally returned to the government much of what he realized from his sales, but was allowed to keep a commission.   He also retained a portion of the works on the side for his personal “collection.”  With few exceptions, Gurlitt destroyed the paperwork.  As the Nazis faltered on the battlefield after their defeat at Stalingrad in early 1943, Hitler remained obsessed with the Führermuseum and Gurlitt forged ahead with acquisitions for the museum – and for himself.

Toward the end of 1943 or in early 1944, Gurlitt personally retained several stunning paintings by respected old masters, including a luminous work from the 1630s by Jan Brueghel the Younger of Dutch villagers welcoming home sailors.  He also consummated a huge art deal in Paris just before it was liberated in August 1944, acquiring works by many of the most significant names in modern French art, among them Degas, Manet, Pissarro, Renoir, and Courbet.  The deal included paintings and sculptures, but also woodcuts, lithographs and etchings.  The latter were easier to transport and “particularly difficult to trace as artists usually produced them in limited editions” (p.167).

If Gurlitt paid something for these and other artworks, it was a fraction of their true value, and the money probably did not reach the genuine owners.  Overall, Gurlitt acquired approximately 3,800 pieces for the Führermuseum project, making a small fortune in commissions for himself in the process, all the while acquiring works for his own collection.  It is “inconceivable,” Lane observes, that “on his salary Gurlitt could have acquired the more than 1,000 artworks he obtained during the war were it not for the dirty money he took in exchange for working as a high-ranking member of Hitler’s Führermuseum Project” (p.163).

In 1945, the year of the Nazi capitulation, Gurlitt moved most of his works to a private collection outside Dresden, his home city, and later to a manor 250 miles away in southwest Germany.  From there, he began a five-year cat-and-mouse game with the “Monuments Men,” a group of about 400 art experts from Allied nations, formed in 1943 to protect art and other culturally significant artifacts in the event of an Allied victory.  In the post-war period, the Monuments Men were charged with finding and recovering artworks stolen by the Nazis (part of what was officially known as the “Monuments, Fine Arts and Archives Program,” the Monuments Men were celebrated in an eponymous 2014 film that starred George Clooney and Matt Damon).   Coming from many countries, the Monuments Men often did not speak a common language and never had the resources needed to accomplish their objectives.  Gurlitt bet his future and his art trove on telling them “calculated lies” for which they would have “insufficient resources to fact-check or rebut” (p.183).

Gurlitt won the bet.  The Monuments Men focused more on Gurlitt’s boss on the Führermuseum project, Hermann Voss, but eventually turned to him.  They questioned him seriously enough that he ended up giving up approximately 7% of his stock, falsely claiming that it represented his entire collection.  In late 1950, the Monuments Men returned the 7% to Gurlitt, which included Liebermann’s Two Riders on the Beach.   At some point in the post-war period, Gurlitt also acquired Matisse’s Woman With a Fan, which the Nazis had looted from the renowned Parisian gallery of Paul Rosenberg, a personal friend of Pablo Picasso.   After Rosenberg fled Paris for the United States in 1940, the Nazis turned the gallery into the “Institute for the Study of Jewish Questions.”

* * *

Hildebrand Gurlitt died in an automobile crash on the Autobahn in November 1956, the point at which Lane’s focus turns to son Cornelius, 24 at the time of his father’s death.   Hildebrand’s estate provided Cornelius with a comfortable inheritance, and from that point onward he determined that he would not work.  But he discreetly sold some of the works his father had retained on the grey market, dealing most frequently with Galerie Kornfeld in Bern, Switzerland.  In 1960, Cornelius moved into a huge house in Salzburg, and took with him 250 of his father’s most precious items, including works by Picasso, Munch, and Kandinsky.  His mother died in 1968 and he and his younger sister Benita had a falling out, after which the increasingly isolated Cornelius began to manifest symptoms of severe paranoia.

By September 2011, German tax authorities suspected that Cornelius had been selling art without meeting reporting requirements.  In February 2012, the authorities obtained a warrant to enter Cornelius’s Munich apartment and ended up seizing all that he had hoarded there, approximately 1,200 artworks.  German authorities did not disclose the confiscation to the international community, as the Washington Principles prescribed.  The German Government did commission a task force to evaluate the works, but only for tax purposes, not to determine whether they might constitute confiscated art.  Chancellor Angela Merkel refused to make any public statement on the matter, not even an acknowledgement of the need for Germany to increase its efforts to restitute Nazi-confiscated art.  To Lane, it looked like the German government simply wanted to hide this discovery from world attention.

Cornelius, for his part, remained defiant. He gave an interview to Der Spiegel in which he defended his father, denying that he had been complicit in the crimes of the Nazi regime, and further denying that either he or his father had dealt in confiscated art.  His father had been a hero for saving art from destruction, Cornelius contended.   Protected by a statute of limitations that had run in 1970, he went on to say that even if clear proof of prior ownership were presented, he had no intention of returning the works.  With the war 70 years in the past, it was time for families with claims to such works to “simply move on” (p.226).   And he chastised the government for invading his property and privacy without charging him with a crime.

When German art experts suggested that he donate the works to a museum, Cornelius, then gravely ill, came up with a more cunning idea.  While hospitalized in January 2014, he signed a secret will that bequeathed his entire collection to the Kunstmuseum Bern in the Swiss capital.  But later that year, as he literally lay dying, he had a change of heart, in Lane’s view the result of contemplating the adverse effect which publicity about his case had had on his family name.  Cornelius signed an agreement in which the government dropped its tax investigation and stipulated to a one-year research period during which the state would have access to all paintings in his collection.  Shortly thereafter, in May 2014, Cornelius died at age 82.

After Cornelius’s death, his lawyers, the Kunstmuseum Bern and the German government formalized a deal whereby the government would conduct research into the provenance of each work and return any looted pieces to the rightful heirs, if they could be located.  The remainder would belong exclusively to the museum.  The families of the original owners of Max Liebermann’s Two Riders on the Beach and Matisse’s Woman with a Fan were easily identified.  Both families were by then Jewish-American, living in New York City, and each presented unimpeachable documentation of lawful ownership.

Marianne Rosenberg, the granddaughter of Paul Rosenberg, had actively pursued the Matisse painting with her father, Paul’s son Alexandre, who died in 1987.  The Rosenbergs elected to keep the painting, one of the most valuable in the Gurlitt trove.  Liebermann’s Two Riders on the Beach belonged to the family of Holocaust survivor David Toren, then approaching age 90.  Less wealthy than the Rosenberg family, the Torens sold Liebermann’s work at auction.  Their long pursuit of the painting was by then well-publicized, and the family was more than a little surprised that the final price came to nearly five times its conservative initial estimate.  Recovery of the painting for the Toren family constituted a “further step in the long process of coping with the pain that Hitler had inflicted on millions of people,” Lane writes, and provided the family with a “certain sense of emotional closure regarding their fraught past” (p.256).  Lane does not indicate whether any additional works in the Gurlitt trove were returned to rightful owners.

* * *

In an Epilogue, Lane discusses an October 2018 exhibition in Berlin that featured 200 works from the Gurlitt trove, most by artists whom Hitler had labeled degenerate, including several Grosz street scenes.  German Culture Minister Monika Grütters made the opening remarks at the exhibition, noting how Germany had made progress in establishing institutions to deal with looted Nazi art.  But she never acknowledged that Germany had made any errors in how it had handled the Gurlitt case.  Nor did Minister Grütters address why the German government, by hiding the existence of the trove for more than a year, had “obstructed the very investigation into the art works that she now claimed to advocate” (p.61).

By that time, moreover, Lane goes on to note, no high-level German official had publicly backed the enactment of legislation, such as amending the statute of limitations, that would prevent a “future Gurlitt” from admitting to hiding Nazi-looted artworks while flaunting how the law protected him over the victims from whom the works had been stolen.  Lane’s answer to the question whether Germany had learned enough in the case she so thoroughly investigated to prevent future Gurlitts is a “resounding ‘no’” (p.266).

Thomas H. Peebles

La Châtaigneraie, France

November 5, 2020

 

10 Comments

Filed under Art, European History, German History, History

Deciphering Buber’s Judaism

 

 

Paul Mendes-Flohr, Martin Buber:

A Life of Faith and Dissent

(Yale University Press)

From the late 1890s through the mid-1960s, Martin Buber seemed to be in the middle of every public debate over what it meant to be Jewish and how one could be a good Jew in the modern world.  Although he resisted being labeled either a “theologian” or a “philosopher of religion,” Buber fashioned his own idiosyncratic version of Judaism, a version that rejected most traditional Jewish ritual.  He rarely observed Yom Kippur, and in general disdained the liturgical practices associated with the Jewish faith.  Rather, Buber spent his adult life searching for what he termed the “primal spirituality” of Judaism, all the while encouraging Jews to embrace people of other faiths.  Buber’s version of Judaism, sometimes referred to as “Jewish humanism,” sometimes more lightheartedly as “religious anarchism,” seemed to some of his critics geared to appeal more to Christians than to his fellow Jews.

I first encountered Buber in an undergraduate comparative religion course, where we were assigned his best-known work, I and Thou, generally thought to be the foundational text for what has come to be known as Buber’s “philosophy of dialogue.”  I remember thinking I was in way over my head in trying to decipher what seemed like a deeply serious but altogether inscrutable work.  Now, several decades later, Paul Mendes-Flohr, professor emeritus at the University of Chicago Divinity School and the Hebrew University of Jerusalem, has provided me with another chance to get a handle on Buber.  In his recent biography, Martin Buber: A Life of Faith and Dissent, Mendes-Flohr charts Buber’s multifaceted intellectual journey, emphasizing how Buber’s thinking and writing evolved over the years.

Buber, a quintessential product of what the Germans call Mitteleuropa and its vibrant late 19th century Jewish culture, was born in Vienna in 1878.  He spent most of his youth in the city then known as Lemberg (today Lviv, part of Ukraine), at the time the capital of Galicia, a province within the Austro-Hungarian Empire with a substantial Polish-speaking population.  But it was in Germany where Buber made his professional mark.

Buber lived through Germany’s defeat in World War I and its post-war experiment in democracy, the Weimar Republic.  He survived the early years of Adolf Hitler’s Nazi regime after it assumed power in Germany in 1933.  Although required as a Jew to relinquish his university position, Buber continued to write and speak in Germany as a highly visible spokesman for Judaism until 1938, when he fled with his family for Jerusalem, in what was then termed Palestine.  Jerusalem was his home base for the remainder of his life, but he traveled extensively in the post-World War II era, including numerous trips to the United States, up to his death in 1965.  What was arguably the single most consequential event in Buber’s long life occurred in Vienna at age three.

* * *

Buber’s parents separated and his mother eloped with a Russian military officer when he was three years old.  The young Buber witnessed his mother leaving, but she did not bid him farewell and he did not see her again.  Buber never recovered entirely from this early childhood trauma. Images of motherhood appeared in his writings and speeches throughout his adult life, indicating that he was still feeling the “enduring impact” of yearning to be reunited with his “inaccessibly remote mother” (p.3), as Mendes-Flohr puts it.  After his parents’ breakup, young Martin moved to Lemberg, where he lived with his grandparents until his teenage years.

Solomon Buber, Martin’s grandfather, was a successful businessman who was also a recognized Jewish scholar and interpreter of Jewish texts. Solomon taught his grandson Hebrew and the panoply of rules and customs required in an observant Jewish household.  Buber’s subsequent rejection of much of formalized Judaism probably had its roots in a rebellion against his grandfather’s pedagogy.  At age 14, Buber moved back with his father, who by then had remarried and moved to Lemberg.  While the young Buber as an adolescent and young adult remained largely estranged from his grandfather, the two reconciled prior to Solomon’s death in 1906.

Although the emotional scars left from his mother’s early departure never left him, while a university student in Zurich in 1899 Buber fortuitously found the woman who would always be there for him, fellow student Paula Winkler.  One of the few women at the university, Winkler was from a Catholic family and considerably taller than the diminutive Buber.  Their romantic attachment quickly produced two children, a son and a daughter, born in July 1900 and July 1901.  Buber did not inform his father or grandparents of Paula or the children until after the couple married in April 1907 and Paula had converted to Judaism.  By then, grandfather Solomon had died.

Despite its unconventional beginnings, Buber’s marriage to Paula endured until her death in 1958, at age 81.  Throughout their years together, Paula served as her husband’s confidante, editor and general sounding board for much of the thinking that he put to paper or delivered to audiences, while doing much writing of her own.  Buber found in Paula, Mendes-Flohr writes, “not only the mother figure he longed for, but also a soul mate; they were bonded by both romantic love and their enduring intellectual and spiritual compatibility” (p.13-14).

When Buber first met Paula, he was already active in Zionism, the movement to create a Jewish state in Palestine, the Biblical homeland of the Jewish people.  His attraction to Zionism was due in no small part to his relationship with Theodor Herzl (1860-1904), often considered the founder of the modern Zionist movement. Buber initially saw Zionism as a way to maintain solidarity with his fellow Jews even as he rejected most communal Jewish religious practices.  The young Buber was fascinated with the idea of a Jewish renaissance and saw in Zionism a means to revitalize the spiritual and cultural life of the Jewish people.   Zionism provided Buber and many young Jews of his generation with a “revolutionary, secular alternative for maintaining a Jewish national consciousness and solidarity” (p.22).

But Buber and Herzl had a personal falling out, and by 1905 Buber had ceased to be involved in Zionist activities.  He signed onto a letter that denounced the conventional Zionist vision of a future Jewish state arising in the ancient homeland as “aping Euro-Christian culture” while “utterly bereft of Jewish content” (p.38).  For Buber, the movement had come to be based on what he termed the “bonds of blood alone” (p.88).  Yet he still saw in Zionism a potential to “reintroduce contemporary Jews to the ‘Jewish spirit’” and to Judaism’s “spiritual and cultural resources” (p.32), becoming what Mendes-Flohr terms a “cultural” rather than “political” Zionist.

The German experience in World War I further shaped Buber’s approach to Zionism. Like many Jews of his generation, Buber saw no conflict between his allegiance to Judaism and his allegiance to Germany.  That young Jews were joining the war ranks on equal terms with other Germans was initially a positive feature of the war effort for Buber, presenting an opportunity to bring about a higher degree of national unity.  But as the conflict endured, Buber came to oppose not only the war itself, but all forms of chauvinistic nationalism.

These views crystallized when the British government issued the Balfour Declaration in 1917, asserting its support for the establishment of a national home for the Jewish people in Palestine.  Buber’s opposition to the Declaration placed him at odds even with fellow cultural Zionists.  If a Jewish state were to materialize in Palestine, he contended, it should be “for humankind . . . for the realization of Judaism” (p.115-16).  For the remainder of his life, Buber continued to criticize Zionism – and the State of Israel when it came into existence in 1948 – for what he considered its “self-enclosed, parochial nationalism” (p.199).

Buber’s festering doubts over the Zionist project prompted him in 1919 to begin work on a manuscript that aimed to establish what he termed the “general foundations of a philosophical (communal and religio-philosophical) system to which I intend to devote the next several years” (p.131).  He was alluding to I and Thou (Ich und Du in the original German), the work with which Buber would be identified for the rest of his life and thereafter; it first appeared in 1923 but was not translated and published in English until 1937.

I and Thou probed the ramifications of the German word Begegnung, meeting, which for Buber meant, in Mendes-Flohr’s words, an “interpersonal encounter between individuals that occurs in an atmosphere of mutual trust” (p.3).  Buber himself once wrote that “[a]ll real life is meeting” (p.3).  His call to engage the world in dialogue, our life with others, “also recognized the painful truth of how difficult it is to achieve, how often life’s journey is filled with mismeetings and the failure of I-Thou encounters to take place” (p.3-4).

Buber’s notion of “I-Thou” and his “philosophy of dialogue” can be understood only in relationship to “I-It,” the opposite of “I-Thou.”  These are Buber’s “two fundamental and dichotomous modes of relating to the world” (p.141), Mendes-Flohr explains.  The human person “achieves the fullness of being by experiencing both modes of existence” (p.262-63).  I-It entails the “physical, historical, and sociological factors that structure objective reality,” in other words the “labyrinthine world we often call ‘reality’” (p.262-63).  To attain the fullness of life, our relationships with other human beings cannot be based on an object — It — but on Thou, as an “autonomous subject with a distinctive inner reality” (p.263), as Mendes-Flohr puts it.

As if to show the I-Thou principle in action, much of Mendes-Flohr’s narrative involves Buber’s exchanges with thinkers, colleagues and friends over the course of his long life, among them such luminaries as early Zionist visionary Herzl, Indian independence leader Mahatma Gandhi, and David Ben-Gurion, modern Israel’s founding father who served as its first prime minister.  But his most influential exchanges were with two men whom he also considered friends, Gustav Landauer (1870-1919) and Franz Rosenzweig (1886-1929).  Landauer led Buber away from Judaism and into mysticism, while Rosenzweig took Buber out of mysticism and helped him reach what Mendes-Flohr considers his most mature understanding of the Jewish faith.

A leading early 20th century German anarchist, Landauer knew Buber from 1900 onward, although their friendship deepened as World War I broke out.  Landauer, who had by then withdrawn entirely from formal Judaism, helped steer Buber away from the nationalist sentiments he had entertained at the outbreak of the war.  Landauer imparted to Buber his interest in Christian mysticism and Buddhism, aiding Buber’s search for the “essential spiritual unity of all beings” (p.53).  Landauer was active in Kurt Eisner’s revolutionary coup d’état in Bavaria in 1919 and was murdered by counter-revolutionaries in that conflict.  Buber was “deeply shaken by the tragic death of his friend; he viewed Landauer as a martyred idealist, a gentle anarchist who had sacrificed his life in a doomed effort to herald an era of politics without violence” (p.127).  Landauer was, in Mendes-Flohr’s view, Buber’s “intellectual and political alter ego” (p.51); he was also the maternal grandfather of American film director Mike Nichols, born several years after Landauer’s death.

Rosenzweig, eight years Buber’s junior, was already making his mark as an iconoclastic German philosopher when he first met Buber in Berlin in 1914.  Their friendship blossomed after 1920, not coincidentally after Landauer’s death (Rosenzweig had by then famously backed out at the last minute of a conversion to Christianity, discussed in Mark Lilla’s The Shipwrecked Mind, reviewed here in 2017).  Mendes-Flohr credits Rosenzweig with helping Buber get past his infatuation with mysticism.  The pair undertook to translate the Hebrew Bible into German, a project that was both linguistic and theological.  Their task was complicated when Rosenzweig contracted amyotrophic lateral sclerosis (ALS, known to Americans as “Lou Gehrig’s disease”), which killed him in 1929.  Buber’s friendship with Rosenzweig led to what Mendes-Flohr considers the maturation of Buber’s understanding of Judaism, in which genuine spiritual renewal lies neither in “culture” nor “religion” but rather in the “lived everyday” (p.164).  The sensibility we sometimes call “faith,” Buber wrote, “cannot be constituted by the inwardness of one’s soul: it must manifest itself in the entire fullness of personal and communal life, in which the individual participates” (p.209).

Buber also met several times after World War II with Martin Heidegger (1889-1976).  Although Heidegger was one of Germany’s most original and complex 20th century philosophers, his professional reputation was permanently tainted by his affinity for Hitler’s Third Reich in the 1930s.  Buber was aware that Heidegger had supported the Nazi regime and studiously avoided the “difficult questions attendant to Heidegger’s Nazi past” (p.281).  Heidegger for his part eagerly engaged with Buber, motivated by his desire to receive at least an implicit exculpation for his Nazi past.  Buber’s meetings with Heidegger tested his vision of reconciliation, “undoubtedly shaped by a Jewish theological sensibility that there can be no divine pardon for offenses against others until one has turned to one’s fellow human beings whom one has offended, and not only asked their forgiveness, but also adequately repented for the wrongs done to them” (p.286).

The Heidegger meetings failed to rise to the level of what Buber considered genuine dialogue, making reconciliation unattainable.  Buber and Heidegger entertained “divergent horizons of expectations” that reflected “very different conceptions of grace and atonement” (p.285).  In a subsequent lecture, delivered in 1960, Buber argued that through his uncritical embrace of Nazism, Heidegger had neglected the interpersonal responsibility of one individual to the other, “even to a stranger who bears no name, allowing for the excessive celebration of ‘superpersonal’ social and political institutions in our ‘disintegrating human world’” (p.290; Heidegger’s post-World War II attempts at reconciliation with his former girlfriend Hannah Arendt are analyzed in a work by Daniel Maier-Katkin, reviewed here in 2013).

In 1938, when he was 60 years old, Buber immigrated with his family to Palestine, where he began a professorship at the Hebrew University in Jerusalem.  Although Jerusalem was Buber’s home until his death in 1965, he was never fully at ease there.  His appointment at the Hebrew University was in “Philosophy of Society,” in which he was to draw upon the “principles and methods” of sociology.  With no formal training in sociology, Buber used his university position to stress sociology’s ethical dimensions, very much at odds with its general character as a value-free discipline.

When the State of Israel was created ten years later, in May 1948, a civil war broke out between Arabs and Jews, leaving Buber aghast.  With his World War I era objections to the Balfour Declaration and political Zionism resurfacing, Buber wrote that when he had first joined the Zionist movement 50 years previously:

[M]y heart was whole. Today it is torn. The war being waged for a political structure might become a war of national survival at any moment. Thus against my will I participate in it with my own being, and my heart trembles like any other Israeli.  I cannot, however, even be joyful in anticipating victory, for I fear that the significance of Jewish victory will be the downfall of Zionism (p.250).

Buber’s words were directed at least in part to Israeli leader David Ben-Gurion (1886-1973), with whom he had developed an odd friendship.  The pair disagreed upon almost every issue that Israel faced in its early days, starting with the fate of displaced Arabs, yet they were bound to one another by a deep reservoir of mutual respect.  Buber lobbied Ben-Gurion in 1961 to spare the life of Adolf Eichmann after his capture in Argentina and trial in Jerusalem, to no avail (Deborah Lipstadt’s account of the Eichmann trial was the subject of a review here in 2013).

Buber subjected himself to searing criticism in Israel in 1951 when he accepted, in absentia, the Goethe Prize from the University of Hamburg for his “promotion of supranational thinking and humanitarian endeavors in the spirit of Goethe” (p.270).  Many in Israel saw Buber’s acceptance of the prize as exonerating Germany for its extermination of six million Jews, contending that he should have ostentatiously refused it.  Buber responded that by rejecting the prize, he would “undercut the commendable efforts of those Germans ‘fighting for humanism’ and thereby play into the hands of their enemies, even to those guilty of mass murder” (p.271-72).  As a Jew and an Israeli citizen, Buber considered it his duty to “acknowledge (and thus encourage) the German advocates of a rededication to the humanistic tradition associated with Goethe” (p.272).

Then, in 1953, Buber came under further fire when he accepted a prize and gave a lecture in what had once been St. Paul’s Church in Frankfurt, destroyed in the war and the city’s first public building to be rebuilt afterwards.  The building had not been used for religious purposes since 1848, a fact that was “ignored by or unknown to some of Buber’s Israeli critics, who excoriated him for speaking in a church” (p.276).  In his lecture, which was attended by the President of the Federal Republic of Germany, Theodor Heuss, Buber acknowledged the vast pain and immeasurable suffering which the Nazi regime had inflicted upon his people.  Yet he recognized that not all Germans had acquiesced in the Nazi horrors.  Some had resisted; others had assisted and protected endangered Jews.  “Reverence and love for these Germans now fills my heart” (p.278), Buber told the audience.

Buber’s wife Paula, who never felt accepted in Jerusalem as a Jew despite her conversion, died unexpectedly in 1958 in Venice, where the couple had stopped en route back to Jerusalem after a tour in the United States that had included a stint for Buber at Princeton’s Institute for Advanced Study.  Paula was buried in the 13th century Jewish cemetery on the Lido.  Buber had a difficult time resuming his work in the aftermath of his wife’s death.  In his twilight years, he “increasingly cherished friendships and visits, particularly by youth from abroad and Israel” (p.304).  On the occasion of his 85th birthday in 1963, Buber indicated to well-wishers that he wanted to be remembered as a “naturally studying person,” someone for whom “learning and study are an expression of human freedom” (p.320).  Two years later, at age 87, Buber died in his sleep in his Jerusalem home.

* * *

Throughout his long career, Martin Buber’s idiosyncratic version of Judaism sought to sharpen the spiritual sensibilities of his fellow Jews while urging their expanded commitment to the larger family of humankind.  But with his complex and often portentous thinking, Buber’s writings and lectures were never easy to grasp.  Paul Mendes-Flohr is therefore to be lauded for ably distilling Buber’s thought in this penetrating biography, a work that should appeal even to readers not schooled in Jewish history and culture.

Thomas H. Peebles

La Châtaigneraie, France

October 20, 2020

 

6 Comments

Filed under Biography, Eastern Europe, European History, German History, Intellectual History, Religion

Conservatives, Where Are They Coming From?

Roger Scruton, Conservatism:

An Invitation to the Great Tradition

(St. Martin’s Press)

Roger Scruton’s Conservatism: An Invitation to the Great Tradition should be read in tandem with Helena Rosenblatt’s The Lost History of Liberalism: From Ancient Rome to the Twenty-First Century, reviewed here earlier this month.  Scruton, a fellow of the British Academy and the Royal Society of Literature who currently teaches at the University of Buckingham, has produced a work much like that of Rosenblatt, an erudite yet eminently readable piece of intellectual history.  Whereas Rosenblatt’s work centers on the etymology of the word “liberal,” Scruton focuses on what he terms the “tradition” of conservatism — but that may be a distinction without a difference.

The journey that Scruton takes his readers on overlaps at a surprising number of junctures along the way with people and places highlighted in Rosenblatt’s work, including a focus on the same core countries: France, Germany, Great Britain and the United States.  Scruton’s work accords more attention to Great Britain than to the other three and might be considered first and foremost a portrayal of the British conservative tradition.  But Scruton locates the origins of that tradition in the 18th century Enlightenment and the French Revolution, Rosenblatt’s starting points for modern liberalism.

Modern conservatism, Scruton writes, began more as a “hesitation within liberalism than as a doctrine and philosophy in its own right” (p.33).  The relationship between liberalism and conservatism, he emphasizes, should not be thought of as one of “absolute antagonism” but rather of “symbiosis” (p.55).  In the aftermath of the French Revolution, liberals and conservatives sparred in various contexts over the implications and limitations of the revolution’s ideals of liberté and égalité and the management of change.  Conservative hesitations “began to crystallize as theories and policies” (p.33) as a necessary counter to what Scruton terms the “liberal individualism” that the French Revolution seemed to prioritize.

Liberal individualism leads to a belief in the “right of individuals and communities to define their identity for themselves, regardless of existing norms and customs” (p.6), Scruton writes.  In the eyes of conservatives, liberal individualism fails to ground liberty in a “shared culture, based on tacit conventions” (p.6).  This perception runs counter to the liberalism that Rosenblatt depicts, in which liberals at least until World War II consistently grounded individual rights in the needs of the larger community.  But liberalism makes sense, Scruton contends, “only in the social context that conservatism defends” (p.55), a proposition Rosenblatt would likely endorse.

In Scruton’s account, conservatism in the mid-19th century found its natural antithesis not in liberalism but rather in the cluster of movements known as “socialism,” movements that spoke for an emerging working class as the industrial revolution was changing the face of Europe.  For the remainder of the century and into the 20th, conservatives opposed socialist schemes to reform society from top to bottom, whether utopian, evolutionary, revolutionary or dictatorial.  Scruton’s conservative tradition might therefore be thought of as a flashing yellow light for liberalism – slow down! – and a stark red light for socialism – stop!

With conservatism and socialism at odds from the start, one strand of conservatism aligned with what was termed “classical liberalism,” which favored free markets and generally unfettered industrial capitalism.  But another strand, termed “cultural conservatism,” found itself largely in agreement with much of the socialist analysis of the deleterious effects of capitalism.  This strand, which has proved surprisingly enduring, proposed culture as “both the remedy to the loneliness and alienation of industrial society, and the thing most under threat from the new advocates of social reform” (p.82).

Scruton, again like Rosenblatt, is at his best when he describes the conservative tradition during the 19th century.  He too seems to run low on fuel when moving into the 20th century, especially the post-World War II era.  Readers may be disappointed to find, for example, no analysis of Margaret Thatcher’s contributions to modern conservatism, or of the implications of Brexit and the “populism” which purportedly fueled Britain’s decision to leave the European Union, a term Scruton scrupulously avoids.

But these voids underscore what I suspect may be Scruton’s main if implicit point: that the key to understanding the conservative tradition lies more in an appreciation of conservative attitudes and dispositions than in comprehending discrete principles or the evolution of thinking over the nearly 2 ½ centuries since the French Revolution.  Scruton acknowledges that conservatives have not always been good in defining or explaining their goals and notes wryly that they “suffer under a burden of disapproval, which they believe comes from their habit of telling the truth, but which their opponents ascribe either to ‘nostalgia’ for an old and misremembered way of life or a failure of compassion toward the new ways of life that are emerging to replace it” (p.154-55).

* * *

Scruton begins by emphasizing the debt that modern conservatism owes to Aristotle, to the English “Glorious Revolution” of 1688, and to the philosophies of such key 17th century thinkers as Thomas Hobbes (1588-1679) and John Locke (1632-1704).  But modern conservatism received its first extended articulation in Edmund Burke’s Reflections on the Revolution in France, first published in November 1790, more than a year after the fall of the Bastille but prior to the execution of King Louis XVI and the advent of the Reign of Terror.  Burke (1729-1797), the Irish-born Whig Parliamentarian whom Scruton considers the “greatest of British conservative thinkers” (p.26), demonstrated in Reflections an “astonishing” ability to “see to the heart of things and to predict the way in which they are bound to go” (p.44).

Burke questioned the revolutionaries’ abstract faith in reason.  He favored a more particularized form of reasoning that emerges “through custom, free exchange and ‘prejudice’” (p.51). To Burke, the revolutionaries in France had failed to take account of the passions and sentiments that govern human character at least as much as reason.  The past to Burke was not something to be discarded and overcome, as the most radical of the revolutionaries seemed to maintain, but rather something to be built upon (among the radicals Burke had in mind was the American Thomas Paine, whose debates with Burke are ably captured in Yuval Levin’s work reviewed here in 2015).

Burke and his Reflections provided modern conservatism – or at least the British version – with a blueprint that defined its distinctive character throughout the 19th century and into the 20th century: a “defence of inheritance against radical innovation, an insistence that the liberation of the individual could not be achieved without the maintenance of customs and institutions that were threatened by the single-minded emphasis on freedom and equality” (p.104).  To be sure, human societies must change over time, but only in the name of “continuity, in order to conserve what we are and what we have” (p.3).  Burkean conservatism should not therefore be mistaken for political reaction.

The most articulate of the reactionaries, the diehard French lawyer and philosopher Joseph de Maistre (1753-1821), defended the divine right of kings, advocated for restoration of the Bourbon monarchy, and saw the Enlightenment as an “insurrection against God” (p.69).  De Maistre spoke for a wide range of ultra-royalists, disaffected nobles and backward-looking Catholics who sought in essence to undo the whole Enlightenment project and restore all that had been swept away by the French Revolution.  Scruton sees in de Maistre’s thinking a “certain remorseless extremism” (p.69) which does not fit comfortably within the conservative tradition he depicts.  Since de Maistre’s time, Scruton argues, conservatism in France has “almost invariably” been connected with a “reverence for the Catholic faith and for France as bearing witness to that faith” (p.71).

In German-speaking lands in the early 19th century, the differences between liberalism and conservatism were placed in sharp focus by debates between the two greatest German-speaking political philosophers, Immanuel Kant (1724-1804) and Georg Wilhelm Friedrich Hegel (1770-1831).  Kant in many ways epitomized the liberal individualism of the Enlightenment, placing the “freely choosing individual into the very center of his world view” and judging “all institutions and procedures in terms of that one idea” (p.56; in a work on the 18th century Enlightenment reviewed here in 2015, Anthony Pagden argued that Kant was the Enlightenment’s single most important thinker).

Hegel by contrast regarded Kant’s freely choosing self as an “empty abstraction. The self does not exist prior to society, but is created in society, through . . . custom, morality and civil association” (p.59).  Hegel found the “roots of legitimate order” (p.70) not only in custom but also in continuity and free association.  In Scruton’s phrase, Hegel “rescued the human individual from the philosophy of individualism” (p.66).

But as conservatives and liberals in the middle decades of the 19th century ruminated over the limitations to the French Revolution’s ideal of liberté, it fell to the aristocrat Alexis de Tocqueville, one of France’s leading 19th century liberals, to spell out conservative hesitations over the revolutionary ideal of égalité.  Tocqueville’s views were shaped by his tour of the United States in the 1830s, as expressed in his classic work, Democracy in America.  Tocqueville considered equality among citizens to be the hallmark of American democracy, although he was aware that the institution of slavery undermined the country’s claims of equality.

Tocqueville wrestled with how equality might be reconciled with liberty in the “increasing absence of the diversity of power that had characterized traditional aristocratic regimes” (p.75).  For Tocqueville, unchecked pursuit of equality breeds loss of individuality that tends, as Scruton puts it, “towards uniformity, and begins to see the eccentric as a threat” (p.76).  Tocqueville was one of the first to warn against what he called “democratic despotism,” where majority sentiment is in a position to override the rights of minorities.

Tocqueville was among those mid-19th century liberals who shared conservative anxieties over the rise of the diverse working class movements known as “socialist.”  Conservatives recoiled at what they perceived to be socialism’s “gargantuan schemes for a ‘just’ society, to be promoted by the new kind of managerial state” (p.104).  Socialism for conservatives seemed altogether indifferent if not hostile to the very traditions they revered, and was bent upon undermining the bonds among citizens that they regarded as the glue holding societies together.  Conservative opposition to socialism in all its forms hardened in the 20th century after Vladimir Lenin and his band of Bolsheviks seized power in Russia, leading to a “tyranny yet more murderous than that of the Jacobins in revolutionary France” (p.104).

One conservative response was to align with so-called “classical liberalism,” that strand within liberalism that championed free trade, market capitalism and economic laissez faire.  But not all conservatives found the answer to socialism in laissez faire economics.  Many saw free markets as altogether amoral, exalting individualism and financial profit above the needs of the community.  The “cultural conservatism” that emerged in the mid-19th century included a strong anti-capitalist strain, addressing concerns that the demographic changes brought about by industrialization had detached people from their religious and social roots.

Scruton finds a nascent cultural conservatism in Germany with the thinking of Johann Gottried von Herder (1744-1803), once a student of Immanuel Kant.  Herder posited culture, consisting of “language, custom, folk tales and folk religion,” as the element that “unites human beings in mutual attachment” (p.96).  Herder’s cultural conservatism, Scruton notes, became a “kind of political radicalism, influencing the revolutions of 1848,” in which German speakers “laid claim to a shared identity within boundaries that would bring them together as a single nation state” (p.97).  In Britain, the romantic poet Samuel Taylor Coleridge (1772-1834) was among the earliest cultural conservatives.

Coleridge sought to infuse religion back into society, but was also a strong proponent of increased government assistance for the poor, thereby setting the agenda for “subsequent cultural conservatives who opposed unbridled free market economics” (p.83).  After Coleridge, the cultural conservative banner was carried by the poet and essayist John Ruskin (1819-1900), the essayist Matthew Arnold (1822-1888), and, in the 20th century, by the poems, plays and essays of T.S. Eliot (1888-1965) and the religious reflections of G.K. Chesterton (1874-1936) and C.S. Lewis (1898-1963).  But Scruton’s analysis of the conservative tradition in 20th century Britain revolves primarily around the thinking of three key theorists: lawyer and legal historian Frederic William Maitland (1850-1906), a transition figure from 19th to 20th century conservatism; the eminent Austrian economist Friedrich von Hayek (1899-1993), who almost single-handedly kept the argument for free market capitalism alive in the mid-20th century; and the complex and often enigmatic political philosopher Michael Oakeshott (1901-1990) who — also almost single-handedly — was able to maintain the academic respectability of conservatism in post-World War II Britain.

* * *

In a series of posthumously published lectures, The Constitutional History of England (1908), Maitland contended that the foundations for liberty in Britain lay not in the abstract theorizing of the Enlightenment and the French Revolution but in the English common law and the tradition of parliamentary representation.   Limited government, he maintained, had been the rule rather than the exception in England from medieval times onward.  The rights claimed by Britain’s 17th and 18th century theorists in Maitland’s view had always been implied in the English common law.

Half a century later, Hayek linked Maitland’s insights into the English common law with his case for unfettered free market capitalism – for “classical” liberalism — as a further argument against centralized government planning.  In a work published in 1960, The Constitution of Liberty, his second best known work after his 1944 best seller, The Road to Serfdom, Hayek portrayed the English common law as the “heart of English society,” living proof that justice resides in the “transactions between freely associating people and not in the plans of sovereign power” (p.110).  Just as the free market is an example of a “spontaneous order, which arises by an invisible hand from free association,” generating solutions to economic problems “of its own accord,” the common law also generates a “spontaneous legal order, which, because it grows from particular solutions to particular conflicts, inherently tends to restore society to a state of equilibrium” (p.107-08).

Oakeshott attacked the murderous collectivist ideologies of the 20th century — communism, fascism and Nazism — but a part of his argument also applied to Britain and democracies generally: the damage done when politics is directed from above.  Oakeshott mounted an assault on what Scruton terms the “dirigisme” that entered British politics after World War II, in which the state would “manage” not only the economy, but also education, poverty relief, housing, employment, “just about anything on which the well-being and security of the people might seem to depend” (p.114).  Scruton goes on to note that Oakeshott utilized his position as a professor of political philosophy at the London School of Economics (where Hayek also taught) to “build up a network of sympathetic students and colleagues.”  For a while,  the LSE politics department “became a center of conservative resistance to the prevailing socialist consensus” (p.115).

This passage hit me with a thud.  In the late 1960s, I was fortunate to participate in this Oakeshott-led program in political philosophy, which I considered at the time to be a stimulating but relatively obscure academic enterprise.  Scruton even mentions the contributions to conservative thought of my advisor that year – termed “tutor” at LSE – Elie Kedourie, and those of Professor Kenneth Minogue, who was my instructor for an in-depth course on Thomas Hobbes.  In Scruton’s view, Oakeshott’s program in political thought at the LSE bore some resemblance to that of Leo Strauss at the University of Chicago in the same time period – although it is easier to say “Straussian” than “Oakeshottian” (Strauss and the influence of the Straussians were the subject of a review here in 2015).  None of this even remotely registered with me during an otherwise memorable year at LSE.

But overall, British conservatism since World War II for Scruton has been at best a “fragmentary force on the edge of intellectual life, with little or no connection to politics” (p.127).   Conservatism as the antithesis of socialism and Bolshevism more or less fell with the Berlin Wall, and it has had difficulty establishing new moorings.  Today, British conservatism’s main enemies in Scruton’s view are religious extremism, especially an “armed and doctrinaire enemy, in the form of radical Islam” (p.148), the emerging orthodoxy of multi-culturalism, and “political correctness,” that “humorless and relentless policing of language, so as to prevent heretical thoughts from arising” (p.128).  Not by accident, recent intellectual conservatism in Britain has been buttressed by many immigrant voices.  It is the “privilege of the immigré,” Scruton writes, to “speak without irony of the British Empire and of the unique culture, institutions and laws that have made Britain the safe place of refuge for so many in a smoldering world” (p.131).

* * *

The hesitations that are baked into the conservative tradition that Scruton depicts have doubtless served as useful checks on liberal enthusiasm over the past two centuries.  But readers may leave Scruton’s work wondering how these hesitations fit into today’s cantankerous political debates.

Thomas H. Peebles

La Châtaigneraie, France

September 19, 2020

 


Filed under British History, European History, French History, German History, History, Intellectual History

Liberals, Where Are They Coming From?

 

Helena Rosenblatt, The Lost History of Liberalism: From Ancient Rome

to the Twenty-First Century

(Princeton University Press) 

             If you spent any time watching or listening to the political conventions of the two major American parties last month, you probably did not hear the word “liberal” much, if at all, during the Democratic National Convention.  But you may have heard the word frequently at the Republican National Convention, with liberalism perhaps described as something akin to a “disease or a poison,” or a danger to American “moral values.”  These, however, are not the words of Donald Trump Jr. or Rudy Giuliani, but rather of Helena Rosenblatt, a professor at the Graduate Center, City University of New York, in The Lost History of Liberalism: From Ancient Rome to the Twenty-First Century (p.265).  American Democrats, Rosenblatt further notes, avoid using the word “liberal” to describe themselves “for fear that it will render them unelectable” (p.265). What the heck is wrong with being a “liberal”? What is “liberalism” after all?

Rosenblatt argues that we are “muddled” about what we mean by “liberalism”:

People use the term in all sorts of different ways, often unwittingly, sometime intentionally. They talk past each other, precluding any possibility of reasonable debate. It would be good to know what we are speaking about when we speak about liberalism (p.1).

Clarifying the meaning of the terms “liberal” and “liberalism” is the lofty goal Rosenblatt sets for herself in this ambitious work, a work that at its heart is an etymological study — a “word history of liberalism” (p.3) — in which she explores how these two terms have evolved in political and social discourse over the centuries, from Roman to present times.

The word “liberal,” Rosenblatt argues, took on an overtly political connotation only in the early 19th century, in the aftermath of the French Revolution. Up until that time, beginning with the Roman authors Cicero and Seneca, through the medieval and Renaissance periods in Europe, “liberal” was a word referring to one’s character.  Being “liberal” meant demonstrating the “virtues of a citizen, showing devotion to the common good, and respecting the importance of mutual connectedness” (p.8-9).  During the 18th century Enlightenment, the educated public began for the first time to speak not only of liberal individuals but also of liberal sentiments, ideas, ways of thinking, even constitutions.

Liberal political principles emerged as part of an effort to safeguard the achievements of the French Revolution and to protect them from the forces of extremism — from the revolution’s most radical proponents on one side to its most reactionary opponents on the other.  These principles included support for the broad ideals of the French Revolution, “liberté, égalité, fraternité;” opposition to absolute monarchy and aristocratic and ecclesiastical privilege; and such auxiliary concepts as popular sovereignty, constitutional and representative government, the rule of law and individual rights, particularly freedom of the press and freedom of religion.  Beyond that, what could be considered a liberal principle was “somewhat vague and debatable” (p.52).

Rosenblatt is strongest on how 19th century liberalism evolved, particularly in France and Germany, but also in Great Britain and the United States.  France and French thinkers were the center points in the history of 19th century liberalism, she contends, while Germany’s contributions are “usually underplayed, if not completely ignored” (p.3).  More cursory is her treatment of liberalism in the 20th century, packed into the last two of eight chapters and an epilogue.  The 20th century in her interpretation saw the United States and Great Britain become centers of liberal thinking, eclipsing France and Germany.  But since World War II, she argues, liberalism as defined in America has limited itself narrowly to the protection of individual rights and interests, without the moralism or dedication to the common good that were at the heart of 19th and early 20th century liberalism.

From the early 19th century through World War II, Rosenblatt insists, liberalism had “nothing to do with the atomistic individualism we hear of today.”  For a century and a half, most liberals were “moralists” who “never spoke about rights without stressing duties” (p.4).  People have rights because they have duties.  Liberals rejected the idea that a viable community could be “constructed on the basis of self-interestedness alone” (p.4).  Being a liberal meant “being a giving and a civic-minded citizen; it meant understanding one’s connectedness to other citizens and acting in ways conducive to the common good” (p.3-4).  The moral content to the political liberalism that emerged after the French Revolution constitutes the “lost” aspect of the history that Rosenblatt seeks to bring to light.

Throughout much of the 19th century, however, being a liberal did not mean being a democrat in the modern sense of the term.  Endorsing popular sovereignty, as did most early liberals, did not mean endorsing universal suffrage.  Voting was a trust, not a right.  Extending suffrage beyond property-holding males was an invitation to mob rule.  Only toward the end of the century did most liberals accept expansion of the franchise, as liberalism gradually became synonymous with democracy, paving the way for the 20th century term “liberal democracy.”

While 19th century liberalism was often criticized as opposed to religion, Rosenblatt suggests that it would be more accurate to say that it opposed the privileged position of the Catholic Church and aligned more easily with Protestantism, especially some forms emerging in Germany (although a small number of 19th century Catholic thinkers could also claim the term liberal).  But by the middle decades of the 19th century, liberalism’s challenges included not only the opposition of monarchists and the Catholic Church, but also what came to be known as “socialism” — the political movements representing a working class that was “self-conscious, politicized and angry” (p.101) as the Industrial Revolution was changing the face of Europe.

Liberalism’s response to socialism gave rise in the second half of the 19th century to the defining debate over its nature: was liberalism compatible with socialist demands for government intervention in the economy and direct government assistance to the working class and the destitute?  Or were the broad objectives of liberalism better advanced by the policies of economic laissez faire, in which the government avoided intervention in the economy and, as many liberals advocated, rejected what was termed “public charity” in favor of concentrating upon the moral improvement of the working classes and the poor so that they might lift themselves out of poverty?  This debate carried over into the 20th century and, Rosenblatt indicates, is still with us.

* * *

With surprising specificity, Rosenblatt attributes the origins of modern political liberalism to the work of the Swiss couple Benjamin Constant and his partner Madame de Staël, born Anne-Louise Germaine Necker, the daughter of Jacques Necker, a Swiss banker who served as finance minister to French King Louis XVI (Rosenblatt is also the author of a biography of Constant).  The couple arrived in Paris from Geneva in 1795, a year after the so-called Reign of Terror had ended with the execution of its most prominent advocate, Maximilien Robespierre.  As they reacted to the pressing circumstances brought about by the revolution, Rosenblatt contends, Constant and de Staël formulated the cluster of ideas that collectively came to be known as “liberalism,” although neither ever termed their ideas “liberal.”  Constant, the “first theorist of liberalism” (p.66), argued that it was not the “form of government that mattered,” but rather the amount. “Monarchies and republics could be equally oppressive. It was not to whom you granted political authority that counted, but how much authority you granted.  Political power is dangerously corrupting” (p.66).

Influenced in particular by several German theologians, Constant spoke eloquently about the need for a new and more enlightened version of Protestantism in the liberal state.  Religion was an “essential moralizing force” that “inspired selflessness, high-minded principles, and moral values, all crucial in a liberal society. But it mattered which religion, and it mattered what its relationship was to the state” (p.66).  A liberal government needed to be based upon religious toleration, that is, the removal of all legal disabilities attached to the faith one professed.  Liberalism envisioned strict separation of church and state and what we would today call “secularism,” ideas that placed it in direct conflict with the Catholic Church throughout the 19th century.

Constant and Madame de Staël initially supported Napoleon Bonaparte’s 1799 coup d’état.  They hoped Napoleon would thwart the counterrevolution and consolidate and protect the core liberal principles of the revolution. But as Napoleon placed the authority of the state in his own hands, pursued wars of conquest abroad, and allied himself with the Catholic Church, Constant and Madame de Staël became fervent critics of his increasingly authoritarian rule.

After Napoleon fell from power in 1815, an aggressive counter-attack on liberalism took place in France, led by the Catholic Church, in which liberals were accused of trying to “destroy religion, monarchy, and the family.  They were not just misguided but wicked and sinful.  Peddlers of heresy, they had no belief in duty, no respect for tradition or community.  In the writings of counter-revolutionaries, liberalism became a virtual symbol for atheism, violence, and anarchy” (p.68).  English conservative commentators frequently equated liberalism with Jacobinism.  For these commentators, liberals were “proud, selfish and licentious,” primarily interested in the “unbounded gratification of their passions” while refusing “restraints of any kind” (p.76).

Liberals’ hopes were buoyed, however, when the bloodless three-day 1830 Revolution in France deposed the ultra-royalist and strongly pro-Catholic Charles X in favor of the less reactionary Louis Philippe.  Among those initially supporting the 1830 Revolution was Alexis de Tocqueville, 19th century France’s most consequential liberal thinker after Constant and Madame de Staël.  Tocqueville famously toured the United States in the 1830s and offered his perspective on the country’s direction in Democracy in America, published in two volumes in 1835 and 1840, followed by his analysis in 1856 of the implications of the French Revolution, The Old Regime and the Revolution.

Tocqueville shared many of the widespread concerns of his age about democracy, especially its tendency to foster egoism and individualism.  He worried about the masses’ lack of “capacity.” He was one of the first to warn against what he called “democratic despotism,” where majority sentiment would be in a position to override the rights and liberties of minorities.  But Tocqueville also foresaw the forward march of democracy and the movement toward equality of all citizens as unstoppable, based primarily upon what he had observed in the United States (although he was aware of how the institution of slavery undermined American claims to be a society of equals).  Tocqueville counseled liberals in France not to try to stop democracy, but, as Rosenblatt puts it, to “instruct and tame” democracy, so that it “did not threaten liberty and devolve into the new kind of despotism France had seen under Napoleon” (p.95).

Tocqueville’s concerns about democracy and “excessive” equality were related to anxieties about how to accommodate the diverse movements that termed themselves socialist.  Initially, Rosenblatt stresses, the term socialist described “anyone who sympathized with the plight of the working poor . . . [T]here was no necessary contradiction between being liberal and being socialist” (p.103).   The great majority of mid-19th century liberals, she notes, whether British, French, or German, believed in free circulation of goods, ideas and persons but were “not all that adverse to government intervention” and did not advocate “absolute property rights” (p.114).

In the last quarter of the 19th century, a growing number of British liberals began to favor a “new type of liberalism” that advocated “more government intervention on behalf of the poor.  They called for the state to take action to eliminate poverty, ignorance and disease, and the excessive inequality in the distribution of wealth.  They began to say that people should be accorded not just freedom, but the conditions of freedom” (p.226).   French commentators in the same time period began to urge that a middle way be forged between laissez-faire and socialism, termed “liberal socialism,” where the state became an “instrument of civilization” (p.147).

But it was in 1870s Germany that the debate crystallized between what came to be known as “classical” laissez faire liberalism and the “progressive” version, thanks in large part to the unlikely figure of Otto von Bismarck.   Although no liberal, Bismarck, who masterminded German unification in 1871 and served as the first Chancellor of the newly united nation, instituted a host of sweeping social welfare reforms for workers, including full and comprehensive insurance against sickness, industrial accidents, and disability.  Most historians attribute his social welfare measures to a desire to coopt and destroy the German socialist movement (a point Jonathan Steinberg makes in his masterful Bismarck biography, reviewed here in 2013).

Bismarck’s social welfare measures coincided with an academic assault on economic laissez faire led by a school of “ethical economists,” a small band of German university professors who attacked laissez faire with arguments that were empirical but also moral, based on a view of man as not a “solitary, self-interested individual” but a “social being with ethical obligations” (p.222).  Laissez-faire “allowed for the exploitation of workers and did nothing to remedy endemic poverty,” they contended, “making life worse, not better, for the majority of the inhabitants of industrializing countries” (p.222).  Industrial conditions would “only deteriorate and spread if governments took no action” (p.222).

In the late 19th and early 20th centuries, many young Americans studied in Germany under the ethical economists and their progeny.  They returned to the United States “increasingly certain that laissez-faire was simply wrong, both morally and empirically,” and “began to advocate more government intervention in the economy” (p.226).  On both sides of the Atlantic, liberalism and socialism were drawing closer together, but the debate between laissez faire liberalism and the interventionist version played out primarily on the American side.

* * *

During World War I, Rosenblatt argues, liberalism, democracy and Western civilization became “virtually synonymous,” with America, because of its rising strength, “cast as their principal defender” (p.258).  Germany’s contribution to liberalism was progressively forgotten or pushed aside and the French contribution minimized.  Two key World War I era American thinkers, Herbert Croly and John Dewey, contended that only the interventionist, or progressive, version of liberalism could claim to be truly liberal.

Croly, cofounder of the flagship progressive magazine The New Republic, delivered a stinging indictment of laissez-faire economics and a strong argument for government intervention in his 1909 work, The Promise of American Life.  By 1914, Croly had begun to call his own ideas liberal, and by mid-1916 the term was in common use in The New Republic as “another way to describe progressive legislation” (p.246).

The philosopher John Dewey acknowledged that there were “two streams” of liberalism.  But one was more humanitarian and therefore open to government intervention and social legislation, while the other was “beholden to big industry, banking, and commerce, and was therefore committed to laissez-faire” (p.261).  American liberalism, Dewey contended, had nothing to do with laissez-faire, and never had.  Nor did it have anything to do with what was called the “gospel of individualism.”  American liberalism stood for “‘liberality and generosity, especially of mind and character.’ Its aim was to promote greater equality and to combat plutocracy with the aid of government” (p.261).

Rosenblatt credits President Franklin D. Roosevelt’s New Deal with demonstrating how progressive liberalism could work in the political arena. Roosevelt, 20th century America’s most talented liberal practitioner, consistently claimed the moral high ground for liberalism.  He argued that liberals believed in “generosity and social mindedness and were willing to sacrifice for the public good” (p.261).  For Roosevelt, the core of the liberal faith was a belief in the “effectiveness of people helping each other” (p.261). But despite his high-minded advocacy for progressive liberalism – buttressed by his leadership of the country during the Great Depression and in World War II – Roosevelt did not vanquish the argument that economic laissez faire constituted the “true” liberalism.

In 1944, with America at war with Nazi Germany and Roosevelt within months of an unprecedented fourth term, the eminent Austrian economist Friedrich Hayek, then teaching at the London School of Economics, published The Road to Serfdom, the 20th century’s most concerted intellectual challenge to the interventionist strand of liberalism.  Any sort of state intervention or “collectivist experiment” threatened individual liberty and put countries on a slippery slope to fascism, Hayek argued in his surprise best seller.  Hayek grounded his arguments in English and American notions of individual freedom.  “Progressive liberalism,” which he considered a contradiction in terms, had its roots in Bismarck’s Germany, he argued, and leads ineluctably to totalitarianism.  “[I]t is Germany whose fate we are in some danger of repeating” (p.268), Hayek warned his British and American readers in 1944.

Although Hayek always insisted that he was a liberal, his ideas became part of the American post-World War II conservative argument against both fascism and communism (meanwhile, in France laissez faire economics became synonymous with liberalism; “liberal” is a political epithet in today’s France, but means a free market advocate, diametrically opposed to its American meaning).  During the anti-Communist fervor of the Cold War that followed World War II, the interventionist liberalism that Croly and Dewey had preached and Roosevelt had put into practice was labeled “socialist” and even “communist.”  To American conservatives, those who accepted the interventionist version of liberalism were not really liberal; they were “totalitarian.”

* * *

The intellectual climate of the Cold War bred defensiveness in American liberals, Rosenblatt argues, provoking a need to “clarify and accentuate what made their liberalism not totalitarianism. It was in so doing that they toned down their plans for social reconstruction and emphasized, rather, their commitment to defending the rights of individuals” (p.271).  Post-World War II American liberalism thus lost “much of its moral core and centuries-long dedication to the public good.  Individualism replaced it as liberals lowered their sights and moderated their goals” (p.271).  In bowing to Cold War realities, American liberals in the second half of the 20th century “willingly adopted the argument traditionally used to malign them . . . that liberalism was, at its core, an individualist, if not selfish, philosophy” (p.273).   Today, Rosenblatt finds, liberals “overwhelmingly stress a commitment to individual rights and choices; they rarely mention duties, patriotism, self-sacrifice, or generosity to others” (p.265-66).

Unfortunately, Rosenblatt provides scant elaboration for these provocative propositions, rendering her work incomplete.  A valuable follow up to this enlightening and erudite volume could concentrate on how the term “liberalism” has evolved over the past three quarters of a century, further helping us out of the muddle that surrounds the term.

Thomas H. Peebles

La Châtaigneraie, France

September 7, 2020

 


Filed under American Politics, English History, European History, France, French History, German History, History, Intellectual History, Political Theory

A Time for New Thinking

 

Arthur Haberman, 1930: Europe in the Shadow of the Beast

(Wilfrid Laurier University Press) 

 

            Anxiety reigned in Europe in 1930.  The Wall Street stock market crash of the previous October and the ensuing economic crisis that was spreading across the globe threatened to undo much of the progress that had been made in Europe after recovering from the self-inflicted catastrophe of World War I.  A new form of government termed fascism was firmly in place in Italy, based on xenophobic nationalism, irrationality, and an all-powerful state.  Fascism seemed antithetical in just about every way to the universal, secular and cosmopolitan values of the 18th century Enlightenment.  In what was by then known as the Soviet Union, moreover, the Bolsheviks who had seized control during World War I were firmly in power in 1930 and were still threatening, as they had in the immediate post-war years, to spread anti-capitalist revolution westward across Europe.  And in Germany, Adolf Hitler and his unruly Nazi party realized previously unimaginable success in legislative elections in 1930, as they challenged the fragile Weimar democracy.  But if anti-democratic political movements and economic upheavals made average citizens across Europe anxious in 1930, few foresaw the extent of the carnage and destruction that the next 15 years would bring. Things were about to get worse — much worse.

In 1930: Europe in the Shadow of the Beast, Arthur Haberman, professor of history and humanities at York University, seeks to capture the intellectual and cultural zeitgeist of 1930. “What makes 1930 such a watershed is that rarely have so many important minds worked independently on issues so closely related,” Haberman writes. “All argued that something was seriously amiss and asked that people become aware of the dilemma” (p.1).  Haberman focuses on how a handful of familiar thinkers and artists expressed the anxiety that their fellow citizens felt; and how, in different ways, these figures foreshadowed the calamities that lay ahead for Europe.  There are separate chapters on Thomas Mann, Virginia Woolf, Aldous Huxley, Ortega y Gasset, Bertolt Brecht, and Sigmund Freud, each the subject of a short biographical sketch.  But each either published a major work or had one in progress in the 1929-31 time frame, and Haberman’s sketches revolve around these works.  He also includes two lesser-known figures, the sisters Paulette and Jane Nardal, Frenchwomen of African descent who promoted writing that expressed identity and solidarity among blacks in Europe, the Americas and Africa.  Another chapter treats the visual arts in 1930, with a dissection of the various schools and tendencies of the time, among them surrealism, cubism, and fauvism.

But before getting to these figures and their works, Haberman starts with a description of an unnamed, composite European middle class couple living in a major but unidentified city in one of the World War I belligerents.  With all the maimed young men walking the streets using canes and crutches, the “metaphor of sickness and a need to be healed was part of everyday life” (p.7) for the couple.  The couple’s unease was “mirrored by the intellectuals they admired, as they all grappled with what Europe had become and where it was heading” (p.15).

In an extensive final chapter, “Yesterday and Today,” and an Epilogue, “Europeans Today” — together about one quarter of the book — Haberman assigns himself the herculean task of demonstrating the continued relevance of his figures in contemporary Europe.   Here, he seeks to summarize European anxiety today and the much-discussed European crisis of confidence, especially in the aftermath of the 2008 economic downturn.  It’s an overly ambitious undertaking and the least successful portion of the book.

The key figures Haberman portrays in the book’s first portions were a diverse lot, and it would be an uphill task to tie them together into a neat conceptual package. But if there is a common denominator linking them, it is the specter of World War I, the “Great War,” and the reassessment of Western civilization that it prompted.  The Great War ended the illusion that Europe was at the forefront of civilization and introduced “deep cultural malaise” (p.6).  The “so-called most civilized people on earth committed an unprecedented carnage on themselves” (p.36).  It was thus necessary to think in new ways.

Haberman identifies a cluster of related subjects that both represented this new thinking and heightened the anxiety that average Europeans were sensing about themselves and their future in 1930. They include: the viability of secular Enlightenment values; coming to terms with a darker view of human nature; the rise of the politics of irrationality; mass culture and its dangers; fascism as a norm or aberration; identity and the Other in the midst of Western Civilization; finding ways to represent the post war world visually; and dystopian trends of thought.  The new thinking thus focused squarely on what it meant to be European and human in 1930.

* * *

            None of the figures in Haberman’s study addressed more of these subjects in a single work than the Spanish thinker Ortega y Gasset, whose Revolt of the Masses appeared in 1930.  Here, Ortega confronted the question of the viability of liberal democracy and the durability of the Enlightenment’s core values.  Ortega emphasized liberal democracy’s vulnerability to irrationality and emotion overriding reason in determining public choices.  He described a new “mass man” who behaved through “instinct and desire,” could be “violent and brutal” (p.55), and “will accept, even forward, both political and social tyranny” (p.53).  Ortega referred to Bolshevism and Fascism as “retrograde and a new kind of primitivism” (p.54).  The two ideologies, he concluded, gave legitimacy to the brutality he saw cropping up across Europe.

Although Ortega posited a dark view of human nature, it was not far from what had been apparent in the works of Sigmund Freud for decades prior to 1930.  Freud, whom Haberman ranks on par with Einstein as the most famous and influential intellect of his time, was 74 years old in 1930.  Although ill with throat cancer that year, Freud used an extended essay, Civilization and its Discontents, to reflect upon the conscious and unconscious, on sanity, insanity, and madness, and on the contradictions we live with.  His reflections became “central to how humans understood themselves as individuals and social beings” (p.143).

Culture and civilization are more fragile than we had thought, Freud contended. We must constantly reinforce those things that keep civilization going: “the limitations on our sexual life, the rule of law, the restrictions on our aggressive nature, and the hopeless commandment to love our neighbors, even if we don’t like them” (p.150).  The insights from Civilization and its Discontents and Freud’s other works were taken up in literature, art and the study of religion, along with philosophy, politics and history.  These Freudian insights opened for discussion “matters that had been sealed” (p.162), changing the way we think about ourselves and our nature.  Freud “tried to be a healer in a difficult time,” Haberman writes, one who “changed the discourse about humans and society forever” (p.162).

Virginia Woolf claimed she had not read Freud when she worked on The Waves, an experimental novel, throughout 1930.  The Waves nonetheless seemed to echo Freud, especially in its idea that the unconscious is a “layer of our personality, perhaps the main layer.  All of her characters attempt to deal with their inner lives, their perceptions” (p.44). In The Waves, Woolf adopted the idea that human nature is “very complex, that we are sometimes defined by our consciousness of things, events, people and ourselves, and that there are layers of personality” (p.43).  The novel has six different narrative voices, and its characters sometimes seem to meld into one another.

Woolf had already distinguished herself as a writer heavily invested in the women’s suffrage movement and had addressed in earlier writings how women can achieve freedom independently of men.  Haberman sees Woolf as part of a group of thinkers who “set the stage for the more formal introduction of existentialism after the Second World War . . . She belongs not only to literature but to modern philosophy” (p.46).

With Mario and the Magician, completed in 1930, novelist Thomas Mann made his first explicit foray into political matters.  Mann, as famous in Germany as Woolf was in Britain, suggested in his novel that culture and politics were intertwined in 1930 as never before.  By that year, Mann had become an outspoken opponent of the Nazi party, which he described as a “wave of anomalous barbarism, of primitive popular vulgarity” (p.29).  Mario and the Magician, involving a German family visiting Italy, addressed the implications of fascism for Italy and Europe generally.

Like Ortega, Mann in his novel examined the “abandonment of personality and individual responsibility on the part of the person who joins the crowd” (p.24).  Like Freud, Mann saw humanity as far more irrational and complicated than liberal democracy assumed.  The deified fascist leader in Mann’s view goes beyond simply offering policy solutions to “appeal to feelings deep in our unconscious and [tries] to give them an outlet” (p.24).  Mann was in Switzerland when the Nazis assumed power in 1933.  His children advised him not to return to Germany, and he did not do so until 1949.  He was stripped of his German citizenship in 1936 as a traitor to the Reich.

Still another consequential novel, Aldous Huxley’s Brave New World (in progress during this period, though not published until 1932), was one of the 20th century’s first overtly dystopian works of fiction, along with Yevgeny Zamyatin’s We (both influenced George Orwell’s 1984, as detailed in Dorian Lynskey’s study of Orwell’s novel, reviewed here last month).   Brave New World used “both science and psychology to create a future world where all are happy, there is stability, and conflict is ended” (p.132).  The dystopian novel opened the question of the ethics of genetic engineering.   In 1930, eugenics was considered a legitimate branch of science, a way governments sought to deal with the undesirables in their population, especially those they regarded as unfit.  Although bioethics was not yet a field in 1930, Huxley’s Brave New World made a contribution to its founding.  Huxley’s dystopian work is a “cautionary tale that asks what might happen next.  It is science fiction, political philosophy, ethics, and a reflection on human nature all at once” (p.132).

Haberman’s least familiar figures, and for that reason perhaps the most intriguing, are the Nardal sisters, Paulette and Jane, French citizens of African descent, born in Martinique and living in 1930 in Paris.  The sisters published no major works equivalent to Civilization and Its Discontents or Revolt of the Masses.  But they founded the highly consequential La Revue du Monde Noir, a bilingual French and English publication that featured contributions from African-American writers associated with the Harlem Renaissance, along with French-language intellectuals.   Writings in La Revue challenged head-on the notions underlying French colonialism.

Although France in 1930 was far more welcoming to blacks than the United States, the French vision of what it meant to be black was, as Haberman puts it, a “colonialist construction seen through the eyes of mainly white, wealthy elites” (p.89) that failed to acknowledge the richness and variety of black cultures around the world.  Educated blacks in France were perceived as being “in the process of becoming cosmopolitan, cultured people in the French tradition, a process they [the French] called their mission civilisatrice” (p.89).  Like many blacks in France, Paulette and Jane Nardal “refused to accept this formulation and decided that their identity was more varied and complex than anything the French understood” (p.89).

The Nardal sisters advanced the notion of multiple identities, arguing that the black spirit could be “informed and aided by the association with the West, without losing its own core” (p.92).   Blacks have an “alternative history from that of anyone who was white and born in France. Hence, they needed to attempt to get to a far more complex concept of self, one deeper and richer than those in the majority and the mainstream” (p.100).   The Nardals also came to understand the connection between black culture in Europe and gender.  Black women, “like many females, are a double Other, and this makes them different not only from whites but from Black men as well” (p.101; but conspicuously missing in this work is any sustained discussion of the Jew as the Other, even though anti-Semitism was rising alarmingly in Germany and elsewhere in Europe in 1930).

Between 1927 and 1933, Bertolt Brecht collaborated with Kurt Weill to rethink theatre and opera.  Brecht, alone among the thinkers Haberman portrays, brought an explicit Marxist perspective to his work.  Brecht supplied both the lyrics and dialogue to the pair’s plays, while Weill composed the music.   The Threepenny Opera, their joint work first performed in Berlin in 1928, was a decidedly non-traditional opera that proved to be a spectacular success in Weimar Germany.

In 1930, Brecht and Weill produced The Rise and Fall of the City of Mahagonny, an even less traditional production.  Brecht termed Mahagonny “epic theatre,” whose purpose was “not to entertain or provide the audience with an imitation of their lives” (p.70), but rather to engage the audience in issues of social justice.  Epic theatre was designed to “force the spectator to be active, to query his own assumptions” (p.78).

Haberman describes Mahagonny as an angry anti-capitalist production, a strange sort of “utopia of desire,” where money rules.  Its lesson: in a capitalist society, all is “commoditized, no relationship is authentic . . . [M]oney cannot satisfy human needs” (p.81-82).  The Nazis, who enjoyed increased popular support throughout 1930, regularly demonstrated against Mahagonny performances. Both Brecht and Weill fled Germany when the Nazis came to power in early 1933.  Neither The Threepenny Opera nor Mahagonny was performed again in Germany until after World War II.

Haberman sees Brecht and Weill as stage and musical companions to surrealist painters such as René Magritte and Salvador Dali, who were also juxtaposing traditional elements to force audiences to ask what was really going on.  Magritte’s The Key to Dreams, a name that is a direct reference to Freud, was a painting about painting and how we construct reality.  Words are not the objects themselves, Magritte seemed to be saying.  Paintings can refer to an object but are not the object itself.   Salvador Dali was the rising star of surrealism in 1930.  His paintings were at once “provocative, mythic, and phallic, while also using juxtaposition to great effect” (p.115).  As with Magritte, the code of understanding in Dali paintings is “closer to Freudian psychology than it is to ‘reason’” (p.115).

The most transformative shift in the visual arts by 1930 was the abandonment of mimesis, the idea that a work of art should represent external reality.  Artists from the many varying schools regarded external reality as “just appearance, not reality at all.  Now it was necessary to go through or beyond appearance to investigate what was real” (p.107).  Artists like Pablo Picasso, Georges Braque and Henri Matisse “wanted a painting to be seen holistically before being analyzed in its parts” (p.118). Like Woolf in literature, these artists by 1930 were depicting “multiple realities,” with the “whole, deep world of the unconscious guiding us” (p.108).

In the end, Haberman concludes, the perspective of the major artists of 1930 was in line with that of the writers he portrays. All in their own way:

feared where humanity was headed, in some cases they feared what they discovered about human nature. They wrote and created art. They did so in order to both help us know about ourselves and offer some redemption for a hard time. They did so because, in spite of their fears, and in spite of their pessimism, they had hope that our better nature would triumph.   Their works are as relevant today as they were in 1930 (p.212).

* * *

                        Articulating their contemporary relevance is the purpose of Haberman’s extensive final chapter and epilogue, where he also seeks to summarize contemporary Europe’s zeitgeist.  The Enlightenment faith in the idea and inevitability of progress has now “more or less ended,” he argues, and the world “no longer seems as automatically better as time moves on” (p.171) – the core insight which World War I provided to the generation of 1930.  The politics of irrationality of the type that so worried Ortega seems again resurgent in today’s Europe.  Nationalism – in Haberman’s view, the most influential of the modern ideologies born in the 19th century – “persists and appears to be growing in Europe in a more frightening manner, in the rise of racist neo-fascist and quasi-fascist parties in many countries. What was once thought impossible after the defeat of Hitlerian Germany is now coming into being” (p.168).

Despite the rise of European social democracy in the aftermath of World War II, there is a trend toward the concentration of wealth in fewer and fewer hands, with the gap between the rich and poor widening.   Traditional religion has less hold on Europeans today than it did in 1930 — although it had no apparent hold on any of the writers and artists Haberman features. The question of the place for the Other – marginalized groups like the blacks of the Nardal sisters’ project – has come to the fore in today’s Europe.  Haberman frames the question as whether today’s Europe, theoretically open, liberal, tolerant and egalitarian, is so “only for those who conform to the norm – who are white, indigenous to whatever place they live, nominally or deeply Christian, and identifying strongly with the nation.”  Or is there something “built into European culture as it is taught and practiced that automatically marginalizes women, Blacks, Jews, Roma, and Muslims?” (p.185).

After posing this unanswerable question, Haberman finishes by returning to his composite couple, explaining how their lives were changed by events between 1930 and 1945.  They lost a son in battle in World War II and some civilian relatives were also killed.  Haberman then fast-forwards to the couple’s granddaughter, born in 1982, who married at age 30 and is now pregnant.   She and her husband are ambivalent about their future.  Peace is taken for granted in the way it was not in 1930.  But there is pessimism in the economic sphere.  The couple sees the tacit social contract between generations fraying. The issues that move the couple most deeply are the environment and concerns about climate change.

* * *

               Through his individual portraits, Haberman provides a creative elaboration upon the ideas which leading thinkers and artists wrestled with in the anxious year of 1930.  Describing contemporary applications of these ideas, as he attempts to do in the latter portion of his work, would be a notable accomplishment for an entire book; his attempt to do so here falls flat.

 

 

Thomas H. Peebles

La Châtaigneraie, France

March 15, 2020

 

 

9 Comments

Filed under European History, Intellectual History

A Defense of Truth

 

Dorian Lynskey, The Ministry of Truth:

The Biography of George Orwell’s 1984 

                           George Orwell’s name, like that of William Shakespeare, Charles Dickens and Franz Kafka, has given rise to an adjective.  “Orwellian” connotes official deception, secret surveillance, misleading terminology, and the manipulation of history.   Several terms used in Orwell’s best known novel, Nineteen Eighty-Four, have entered into common usage, including “doublethink,” “thought crime,” “newspeak,” “memory hole,” and “Big Brother.”  First published in June 1949, a little over a half year prior to Orwell’s death in January 1950, Nineteen Eighty-Four is consistently described as a “dystopian” novel – a genre of fiction which, according to Merriam-Webster, pictures “an imagined world or society in which people lead wretched, dehumanized, fearful lives.”

This definition fits neatly the world that Orwell depicted in Nineteen Eighty-Four, a world divided between three inter-continental super states perpetually at war, Oceania, Eurasia and Eastasia, with Britain reduced to a province of Oceania bearing the sardonic name “Airstrip One.”  Airstrip One is ruled by The Party under the ideology Ingsoc, a shortening of “English socialism.”  The Party’s leader, Big Brother, is the object of an intense cult of personality — even though there is no hard proof he actually exists.  Surveillance through two-way telescreens and propaganda are omnipresent.  The protagonist, Winston Smith, is a diligent lower-level Party member who works at the Ministry of Truth, where he rewrites historical records to conform to the state’s ever-changing version of history.  Smith enters into a forbidden relationship with his co-worker, Julia, a relationship that terminates in mutual betrayal.

In his intriguing study, The Ministry of Truth: The Biography of George Orwell’s 1984, British journalist and music critic Dorian Lynskey seeks to explain what Nineteen Eighty-Four “actually is, how it came to be written, and how it has shaped the world, in its author’s absence, over the past seventy years” (p.xiv). Although there are biographies of Orwell and academic studies of Nineteen Eighty-Four’s intellectual context, Lynskey contends that his is the first to “merge the two streams into one narrative, while also exploring the book’s afterlife” (p.xv; I reviewed Thomas Ricks’ book on Orwell and Winston Churchill here in November 2017).   Lynskey’s work is organized in a “Before/After” format.  Part I, about two-thirds of the book, looks at the works and thinkers who influenced Orwell and his novel, juxtaposed with basic Orwell biographical background.  Part II, roughly the last third, examines the novel’s afterlife.

But Lynskey begins in a surprising place, Washington, D.C., in January 2017, where a spokesman for President Donald Trump told the White House press corps that the recently-elected president had taken his oath of office before the “largest audience to ever witness an inauguration – period – both in person and around the globe.”  A presidential adviser subsequently justified this “preposterous lie” by characterizing the statement as “alternative facts” (p.xiii).   Sales of Orwell’s book shot up immediately thereafter.  The incident constitutes a reminder, Lynskey contends, of the “painful lessons that the world appears to have unlearned since Orwell’s lifetime, especially those concerning the fragility of truth in the face of power” (p.xix).

How Orwell came to see the consequences of mutilating truth and gave them expression in Nineteen Eighty-Four is the focus of Part I.  Orwell’s brief participation in the Spanish Civil War, from December 1936 through mid-1937, was paramount among his personal experiences in shaping the novel’s worldview. Spain was the “great rupture in his life; his zero hour” (p.4), the experience that led Orwell to the conclusion that Soviet communism was as antithetical as fascism and Nazism to the values he held dear (Lynskey’s list of Orwell’s values: “honesty, decency, fairness, memory, history, clarity, privacy, common sense, sanity, England, and love” (p.xv)).  While no single work provided an intellectual foundation for Nineteen Eighty-Four in the way that the Spanish Civil War provided the personal and practical foundation, Lynskey discusses numerous writers whose works contributed to the worldview on display in Orwell’s novel.

Lynskey dives deeply into the novels and writings of Edward Bellamy, H.G. Wells and the Russian writer Yevgeny Zamyatin.  Orwell’s friend Arthur Koestler set out what Lynskey terms the “mental landscape” for Nineteen Eighty-Four in his 1940 classic Darkness at Noon, while the American conservative James Burnham provided the novel’s “geo-political superstructure” (p.126).  Lynskey discusses a host of other writers whose works in one way or another contributed to Nineteen Eighty-Four’s world view, among them Jack London, Aldous Huxley, Friedrich Hayek, and the late 17th and early 18th century satirist Jonathan Swift.

In Part II, Lynskey treats some of the dystopian novels and novelists that have appeared since Nineteen Eighty-Four.  He provides surprising detail on David Bowie, who alluded to Orwell in his songs and wrote material that reflected the outlook of Nineteen Eighty-Four.  He notes that Margaret Atwood termed her celebrated The Handmaid’s Tale a “speculative fiction of the George Orwell variety” (p.241).  But the crux of Part II lies in Lynskey’s discussion of the evolving interpretations of the novel since its publication, and why it still matters today.  He argues that Nineteen Eighty-Four has become both a “vessel into which anyone could pour their own version of the future” (p.228), and an “all-purpose shorthand” for an “uncertain present” (p.213).

In the immediate aftermath of its publication, when the Cold War was at its height, the novel was seen by many as a lesson on totalitarianism and the dangers that the Soviet Union and Communist China posed to the West (Eurasia, Eastasia and Oceania in the novel correspond roughly to the Soviet Union, China and the West, respectively).  When the Cold War ended with the fall of the Soviet Union in 1991, the novel morphed into a warning about the invasive technologies spawned by the Internet and their potential for surveillance of individual lives.  In the Age of Trump and Brexit, the novel has become “most of all a defense of truth . . . Orwell’s fear that ‘the very concept of objective truth is fading out of the world’ is the dark heart of Nineteen Eighty-Four. It gripped him long before he came up with Big Brother, Oceania, Newspeak or the telescreen, and it’s more important than any of them” (p.265-66).

* * *

                            Orwell was born Eric Blair in 1903 in India, where his father was a mid-level civil servant. His mother was half-French and a committed suffragette.  In 1933, prior to publication of his first major book, Down and Out in Paris and London, which recounts his life in voluntary poverty in the two cities, the fledgling author took the pen name Orwell from a river in Suffolk.  He changed names purportedly to save his parents from the embarrassment which he assumed his forthcoming work would cause.  He was at best a mid-level journalist and writer when he went to Spain in late 1936, with a handful of novels and lengthy essays to his credit – “barely George Orwell” (p.4), as Lynskey puts it.

The Spanish Civil War erupted after Spain’s Republican government, known as the Popular Front, a coalition of liberal democrats, socialists and communists, narrowly won a parliamentary majority in 1936, only to face a rebellion from the Nationalist forces of General Francisco Franco, representing Spain’s military, business elites, large landowners and the Catholic Church.  Nazi Germany and Fascist Italy furnished arms and other assistance for the Nationalists’ assault on Spain’s democratic institutions, while the Soviet Union assisted the Republicans (the leading democracies of the period, Great Britain, France and the United States, remained officially neutral; I reviewed Adam Hochschild’s work on the Spanish Civil War here in August 2017).   Spain provided Orwell with his first and only personal exposure to the “nightmare atmosphere” (p.17) that would envelop the novel he wrote a decade later.

Fighting with the Workers’ Party of Marxist Unification (Spanish acronym: POUM), a renegade working class party that opposed Stalin, Orwell quickly found himself in the middle of what amounted to a mini-civil war among the disparate left-wing factions on the Republican side, all within the larger civil war with the Nationalists.  Orwell saw first-hand the dogmatism and authoritarianism of the Stalinist left at work in Spain, nurtured by a level of deliberate deceit that appalled him.  He read newspaper accounts that did not even purport to bear any relationship to what had actually happened. For Orwell previously, Lynskey writes:

people were guilty of deliberate deceit or unconscious bias, but at least they believed in the existence of facts and the distinction between true and false. Totalitarian regimes, however, lied on such a grand scale that they made Orwell feel that ‘the very concept of objective truth is fading out of the world’ (p.99).

Orwell saw totalitarianism in all its manifestations as dangerous not primarily because of secret police or constant surveillance but because “there is no solid ground from which to mount a rebellion –no corner of the mind that has not been infected and warped by the state.  It is power that removes the possibility of challenging power” (p.99).

Orwell narrowly escaped death when he was hit by a bullet in the spring of 1937.  He was hospitalized in Barcelona for three weeks, after which he and his wife Eileen escaped across the border to France.  Driven to Spain by his hatred of fascism, Orwell left with a “second enemy. The fascists had behaved just as appallingly as he had expected they would, but the ruthlessness and dishonesty of the communists had shocked him” (p.18).  From that point onward, Orwell criticized communism more energetically than fascism because he had seen communism “up close, and because its appeal was more treacherous. Both ideologies reached the same totalitarian destination but communism began with nobler aims and therefore required more lies to sustain it” (p.22).   After his time in Spain, Orwell knew that he stood against totalitarianism of all stripes, and for democratic socialism as its counterpoint.

The term “dystopia” was not used frequently in Orwell’s time, and Orwell distinguished between “favorable” and “pessimistic” utopias.   Orwell developed what he termed a “pitying fondness” (p.38) for nineteenth-century visions of a better world, particularly the American Edward Bellamy’s 1888 novel Looking Backward.  This highly popular novel contained a “seductive political argument” (p.33) for the nationalization of all industry, and the use of an “industrial army” to organize production and distribution.  Bellamy had what Lynskey terms a “thoroughly pre-totalitarian mind,” with an “unwavering faith in human nature and common sense” that failed to see the “dystopian implications of unanimous obedience to a one-party state that will last forever” (p.38).

Bellamy was a direct inspiration for the works of H.G. Wells, one of the most prolific writers of his age. Wells exerted enormous influence on the young Eric Blair, looming over the boy’s childhood “like a planet – awe inspiring, oppressive, impossible to ignore – and Orwell never got over it” (p.60).  Often called the English Jules Verne, Wells foresaw space travel, tanks, electric trains, wind and water power, identity cards, poison gas, the Channel tunnel and atom bombs.  His fiction imagined time travel, Martian invasions, invisibility and genetic engineering.  The word Wellsian came to mean “belief in an orderly scientific utopia,” but his early works are “cautionary tales of progress thwarted, science abused and complacency punished” (p.63).

Wells was himself a direct influence upon Yevgeny Zamyatin’s We which, in Lynskey’s interpretation, constitutes the most direct antecedent to Nineteen Eighty-Four.  Finished in 1920 at the height of the civil war that followed the 1917 Bolshevik Revolution (but not published in the Soviet Union until 1988), We is set in the undefined future, a time when people are referred to only by numbers. The protagonist, D-503, a spacecraft engineer, lives in the One State, where mass surveillance is omnipresent and all aspects of life are scientifically managed.  It is an open question whether We was intended to satirize the Bolshevik regime, in 1920 already a one-party state with extensive secret police.

Zamyatin died in exile in Paris in 1937, at age 53.   Orwell did not read We until sometime after its author’s death.  Whether Orwell “took ideas straight from Zamyatin or was simply thinking along similar lines” is “difficult to say” (p.108), Lynskey writes.  Nonetheless, it is “impossible to read Zamyatin’s bizarre and visionary novel without being strongly reminded of stories that were written afterwards, Orwell’s included” (p.102).

Koestler’s Darkness at Noon offered a solution to the central riddle of the Moscow show trials of the 1930s: “why did so many Communist party members sign confessions of crimes against the state, and thus their death warrants?” Koestler argued that their “years of unbending loyalty had dissolved their belief in objective truth: if the Party required them to be guilty, then guilty they must be” (p.127).  To Orwell this meant that one is punished in totalitarian states not for “what one does but for what one is, or more exactly, for what one is suspected of being” (p.128).

The ideas contained in James Burnham’s 1941 book, The Managerial Revolution, “seized Orwell’s imagination even as his intellect rejected them” (p.122).  A Trotskyite in his youth who in the 1950s helped William F. Buckley found the conservative magazine National Review, Burnham saw the future as belonging to a huge, centralized bureaucratic state run by a class of managers and technocrats.  Orwell made a “crucial connection between Burnham’s super-state hypothesis and his own long-standing obsession with organized lying” (p.121-22).

Orwell’s chronic lung problems precluded him from serving in the military during World War II.  From August 1941 to November 1943, he worked for the Indian Section of the BBC’s Eastern Service, where he found himself “reluctantly writing for the state . . . Day to day, the job introduced him to the mechanics of propaganda, bureaucracy, censorship and mass media, informing Winston Smith’s job at the Ministry of Truth” (p.83; Orwell’s boss at the BBC was notorious Cambridge spy Guy Burgess, whose biography I reviewed here in December 2017).   Orwell left the BBC in 1943 to become literary editor of the Tribune, an anti-Stalinist weekly.

While at the Tribune, Orwell found time to produce Animal Farm, a “scrupulous allegory of Russian history from the revolution to the Tehran conference” (p.138), with each animal representing an individual: Stalin, Trotsky, Hitler, and so on.  Animal Farm shared with Nineteen Eighty-Four an “obsession with the erosion and corruption of memory” (p.139).  Memories in the two works are gradually erased, first, by the falsification of evidence; second, by the infallibility of the leader; third, by language; and fourth, by time.  Published in August 1945, Animal Farm quickly became a best seller.  The fable’s unmistakable anti-Soviet message forced Orwell to remind readers that he remained a socialist.  “I belong to the Left and must work inside it,” he wrote, “much as I hate Russian totalitarianism and its poisonous influence in this country” (p.141).

Earlier in 1945, Orwell’s wife Eileen died suddenly after being hospitalized for a hysterectomy, less than a year after the couple had adopted a son, whom they named Richard Horatio Blair.  Orwell grieved the loss of his wife by burying himself in the work that culminated in Nineteen Eighty-Four.  But Orwell became ever sicker with tuberculosis as he worked over the next four years on the novel, which was titled The Last Man in Europe until almost immediately prior to publication (Lynskey gives no credence to the theory that Orwell selected 1984 as an inversion of the last two digits of 1948).

Yet, Lynskey rejects the notion that Nineteen Eighty-Four was the “anguished last testament of a dying man” (p.160).  Orwell “never really believed he was dying, or at least no more than usual. He had suffered from lung problems since childhood and had been ill, off and on, for so long that he had no reason to think that this time would be the last” (p.160).  The novel was published in June 1949.  Orwell died 227 days later, in January 1950, when a blood vessel in his lung ruptured.

* * *

                                    Nineteen Eighty-Four had an immediate positive reception. The book was variously compared to an earthquake, a bundle of dynamite, and the label on a bottle of poison.  It was made into a movie, a play, and a BBC television series.  Yet, Lynskey writes, “people seemed determined to misunderstand it” (p.170).  During the Cold War of the early 1950s, conservatives and hard-line leftists both saw the book as a condemnation of socialism in all its forms.  The more astute critics, Lynskey argues, were those who “understood Orwell’s message that the germs of totalitarianism existed in Us as well as Them” (p.182).  The Soviet invasion of Hungary in 1956 constituted a turning point in interpretations of Nineteen Eighty-Four.  After the invasion, many of Orwell’s critics on the left “had to accept that they had been wrong about the nature of Soviet communism and that he [Orwell] had been infuriatingly right” (p.210).

The hoopla that accompanied the actual year 1984, Lynskey notes wryly, came about only because “one man decided, late in the day, to change the title of his novel” (p.234).   By that time, the book was being read less as an anti-communist tract and more as a reminder of the abuses exposed in the Watergate affair of the previous decade, the excesses of the FBI and CIA, and the potential for mischief that personal computers, then in their infancy, posed.  With the fall of the Berlin wall and the end of communism between 1989 and 1991, focus on the power of technology intensified.

But today the focus is on Orwell’s depiction of the demise of objective truth in Nineteen Eighty-Four, and appropriately so, Lynskey argues, noting how President Trump masterfully “creates his own reality and measures his power by the number of people who subscribe to it: the cruder the lie, the more power its success demonstrates” (p.264).  It is truly Orwellian, Lynskey contends, that the phrase “fake news” has been “turned on its head by Trump and his fellow authoritarians to describe real news that is not to their liking, while flagrant lies become ‘alternative facts’” (p.264).

* * *

                                 While resisting the temptation to term Nineteen Eighty-Four more relevant now than ever, Lynskey asserts that the novel today is nonetheless  “a damn sight more relevant than it should be” (p.xix).   An era “plagued by far-right populism, authoritarian nationalism, rampant disinformation and waning faith in liberal democracy,” he concludes, is “not one in which the message of Nineteen Eighty-Four can be easily dismissed” (p.265).

Thomas H. Peebles

La Châtaigneraie, France

February 25, 2020

2 Comments

Filed under Biography, British History, European History, Language, Literature, Political Theory, Politics, Soviet Union

Stirring Rise and Crushing Fall of a Renaissance Man

Jeff Sparrow, No Way But This:

In Search of Paul Robeson (Scribe)

            If you are among those who think the term “Renaissance Man” seems fuzzy and even frivolous when applied to anyone born after roughly 1600, consider the case of Paul Robeson (1898-1976), a man whose talents and genius extended across an impossibly wide range of activities.  In the 1920s and 1930s, Robeson, the son of a former slave, thrilled audiences worldwide with both his singing and his acting.  In a mellifluous baritone voice, Robeson gave new vitality to African-American songs that dated to slave plantations.  On the stage, his lead role as Othello in the play of that name gave a distinctly 20th century cast to one of Shakespeare’s most enigmatic characters.  He also appeared in a handful of films in the 1930s.  Before becoming a singing and acting superstar, Robeson had been one of the outstanding athletes of his generation, on par with the legendary Jim Thorpe.  Robeson further earned a degree from Columbia Law School and reportedly was conversant in upwards of 15 languages.

Robeson put his multiple talents to use as an advocate for racial and economic justice internationally.  He was among the minority of Americans in the 1930s who linked European Fascism and Nazism to the omnipresent racism he had confronted in America since childhood.  But Robeson’s political activism during the Cold War that followed World War II ensnared the world class Shakespearean actor in a tragedy of Shakespearean dimension, providing a painful denouement to his uplifting life story.

Although Robeson never joined a communist party, he perceived a commitment to full equality in the Soviet Union that was missing in the West.  While many Westerners later saw that their admiration for the Soviet experiment had been misplaced, Robeson never publicly criticized the Soviet Union and paid an unconscionably heavy price for his stubborn consistency during the Cold War.  The State Department refused to renew his passport, precluding him from traveling abroad for eight years.  He was hounded by the FBI and shunned professionally.  Robeson had suffered from depression throughout his adult life.  But his mental health issues intensified in the Cold War era and included a handful of suicide attempts.  Robeson spent his final years in limbo, silenced, isolated and increasingly despairing, up to his death in 1976.

In No Way But This: In Search of Paul Robeson, Jeff Sparrow, an Australian journalist, seeks to capture Robeson’s stirring rise and crushing fall.  The book’s subtitle – “In Search of Paul Robeson” — may sound like any number of biographical works, but in this case encapsulates precisely the book’s unique quality.  In nearly equal doses, Sparrow’s work consists of the major elements of Robeson’s life and Sparrow’s account of how he set about to learn the details of that life — an example of biography and memoir melding together.  Sparrow visited many of the places where Robeson lived, including Princeton, New Jersey, where he was born in 1898; Harlem in New York City; London and Wales in Great Britain; and Moscow and other locations in today’s Russia.

In each location, Sparrow was able to find knowledgeable people, such as archivists and local historians, who knew about Robeson and were able to provide helpful insights into the man’s relationship to the particular location.  We learn for instance from Sparrow’s guides how the Harlem that Robeson knew is rapidly gentrifying today and how the economy of contemporary Wales functions long after closure of the mines which Robeson once visited.  Sparrow’s travels to the former Soviet Union take him to several locations where Robeson never set foot, including Siberia, all in an effort to understand the legacy of Soviet terror which Robeson refused to acknowledge.  Sparrow’s account of his travels to these diverse places and his interactions with his guides reads at times like a travelogue.  Readers looking to plunge into the vicissitudes of Robeson’s life may find these portions of the book distracting.  The more compelling portions are those that treat Robeson’s extraordinary life itself.

* * *

            That life began in Princeton, New Jersey, world-famous for its university of that name.  The Robeson family lived in a small African-American community rarely visited by those whose businesses and lives depended upon the university.  Princeton was then considered, as Sparrow puts it, a “northern outpost of the white supremacist South: a place ‘spiritually located in Dixie’” (p.29).  William Robeson, Paul’s father, was a runaway former slave who earned a degree from Lincoln University and became an ordained Presbyterian minister.  His mother Maria, who came from an abolitionist Quaker family and was of mixed ancestry, died in a house fire when Paul was six years old.  Thereafter, William raised Paul and his three older brothers and one older sister on his own.  William played a formidable role in shaping young Paul, who later described his father as the “glory of my boyhood years . . . I loved him like no one in all the world” (p.19).

William abandoned Presbyterianism for the African Methodist Episcopal Zion Church, one of the oldest black denominations in the country, and took on a much larger congregation in Somerville, New Jersey, where Paul attended high school.  One of a handful of African-American students in a sea of whites, Robeson excelled academically and played baseball, basketball and football.  He also edited the school paper, acted with the drama group, sang with the glee club, and participated in the debating society.  When his father was ill or absent, he sometimes preached at his father’s church.  Robeson’s high school accomplishments earned him a scholarship to nearby Rutgers University.

At Rutgers, Robeson again excelled academically.  He became a member of the Phi Beta Kappa honor society and was selected as class valedictorian.  As in high school, he was also an outstanding athlete, earning varsity letters in football, basketball and track.  A standout in football, Robeson was “one of the greatest American footballers of a generation,” so much so that his coach “designed Rutgers’ game-plan tactics specifically to exploit his star’s manifold talents” (p.49).  Playing in the backfield, Robeson could both run and throw. His hefty weight and size made him almost impossible to stop.  On defense, his tackling “took down opponents with emphatic finality” (p.49).  Twice named to the All-American Football Team, Robeson was not inducted into the College Football Hall of Fame until 1995, 19 years after his death.

After graduation from Rutgers in 1919, Robeson spent the next several years in New York City.  He enrolled in New York University Law School, then transferred to Columbia and moved to Harlem.  There, Robeson absorbed the heady atmosphere of the Harlem Renaissance, a flourishing of African-American culture, thinking and resistance in the 1920s.  While at Columbia, Robeson met chemistry student Eslanda Goode, known as “Essie.”  The couple married in 1921.

Robeson received his law degree from Columbia in 1923 and worked for a short time in a New York law firm.  But he left the firm abruptly when a secretary told him that she would not take dictation from an African-American.  Given his talents, one wonders what Robeson could have achieved had he continued in the legal profession.  It is not difficult to imagine Robeson the lawyer becoming the black Clarence Darrow of his age, the “attorney for the damned,” or a colleague of future Supreme Court Justice Thurgood Marshall in the 20th century’s legal battles for full African-American rights.  But Robeson gravitated instead toward singing and acting after leaving the legal profession, while briefly playing semi-pro football and basketball.

Robeson made his mark as a singer by bringing respectability to African-American songs such as “Sometimes I Feel Like a Motherless Child” and “Swing Low Sweet Chariot” that had originated on the plantations — “sorrow songs” that “voiced the anguish of slavery” (p.81), as Sparrow puts it.  After acting in amateur plays, Robeson won the lead role in Eugene O’Neill’s All God’s Chillun Got Wings, a play about inter-racial sexual attraction that established Robeson as an “actor to watch” (p.69).  Many of the leading lights of the Harlem Renaissance criticized Robeson’s role in the play as reinforcing racial stereotypes, while white reviewers “blasted the play as an insult to the white race” (p.70).  An opportunity to star in O’Neill’s Emperor Jones on the London stage led the Robesons to Britain in 1925, where they lived for several years.  The couple’s only child, Paul Jr., whom they called “Pauli,” was born in London in 1927.

Robeson delighted London audiences with his role in the musical Show Boat, which proved to be as big a hit in Drury Lane as it had been on Broadway.  He famously changed the lines to “Old Man River” from the meek “I’m tired of livin’” and “feared of dyin'” to a declaration of resistance: “I must keep fightin’/Until I’m dyin'”.  His rendition of “Old Man River,” Sparrow writes, transported the audience “beyond the silly narrative to an almost visceral experience of oppression and pain.”  Robeson used his huge frame, “bent and twisted as he staggered beneath a bale, to convey the agony of black history while revealing the tremendous strength forged by centuries of resistance” (p.103).

The Robesons in their London years prospered financially and moved easily in the inner circles of respectable society.  The man who couldn’t rent a room in many American cities lived as an English gentleman in London, Sparrow notes.  But by the early 1930s, Robeson had learned to see respectable England as “disconcertingly similar” to the United States, “albeit with its prejudices expressed through nicely graduated hierarchies of social class.  To friends, he spoke of his dismay at how the British upper orders related to those below them” (p.131).

In London, as in New York, the “limited roles that playwrights offered to black actors left Paul with precious few opportunities to display any range. He was invariably cast as the same kind of character, and as a result even his admirers ascribed his success to instinct rather than intellect, as a demonstration not so much of theatrical mastery but of an innate African talent for make-believe, within certain narrow parameters” (p.107). Then, in 1930, Robeson received a fateful invitation to play Othello in a London production, a role that usually went to an actor of Arab background.

Robeson’s portrayal of Othello proved triumphant, with the initial performance receiving an amazing 20 curtain calls.  In that production, which ran for six weeks, Robeson transformed Shakespeare’s tragedy into an “affirmation of black achievement, while hinting at the rage that racism might yet engender” (p.113).  Thereafter, Othello “became central to Paul’s public persona” (p.114), providing a role that seemed ideal for Robeson: a “valiant high-ranking figure of color, an African neither to be pitied nor ridiculed” (p.109).

While in London, Robeson developed sensitivity to the realities of colonial Africa through friendships with men such as Nnamdi Azikiwe, Jomo Kenyatta, and Kwame Nkrumah, future leaders of independence movements in Nigeria, Kenya and Ghana, respectively.  Robeson retained a keen interest in African history and politics for the remainder of his life.  But  Robeson’s commitment to political activism seems to have crystallized through his frequent visits to Wales, where he befriended striking miners and sang for them.

Robeson supported the Welsh labor movement because of the “collectivity it represented. In Wales, in the pit villages and union lodges and little chapels, he’d found solidarity” (p.149).  Robeson compared Welsh churches to the African-American churches he knew in the United States, places where a “weary and oppressed people drew succor from prayer and song” (p.133).  More than anywhere else, Robeson’s experiences in Wales made him aware of the injustices which capitalism can inflict upon those at the bottom of the economic ladder, regardless of color.  Heightened class-consciousness proved to be a powerful complement to Robeson’s acute sense of racial injustice developed through the endless humiliations encountered in his lifetime in the United States.

Robeson’s sensitivity to economic and racial injustice led him to the Soviet Union in the 1930s, which he visited many times and where he and his family lived for a short time.  But a stopover in Berlin on his initial trip to Moscow in 1934 opened Robeson’s eyes to the Nazis’ undisguised racism.  Nazism to Robeson was a “close cousin of the white supremacy prevailing in the United States,” representing a “lethal menace” to black people.  For Robeson, the suffering of African Americans in their own country was no justification for staying aloof from international politics, but rather a “reason to oppose fascism everywhere” (p.153).

With the outbreak of the Spanish Civil War in 1936, Spain became the key battleground to oppose fascism, the place where “revolution and reaction contested openly” and “Europe’s fate would be settled” (p.160).  After speaking and raising money on behalf of the Spanish Republican cause in the United States and Britain, Robeson traveled to Barcelona, where he sang frequently.  Robeson’s brief experience in Spain transformed him into a “fervent anti-fascist, committed to an international Popular Front: a global movement uniting democrats and radicals against Hitler, Mussolini, and their allies” that would also extend democracy within the United States, end colonialism abroad, and “abolish racism everywhere” (p.196-97).

Along with many progressives of the 1930s, Robeson looked to the Soviet Union to lead the global fight against racism and fascism.  Robeson once said in Moscow, “I feel like a human being for the first time since I grew up.  Here I am not a Negro but a human being” (p.198).  Robeson’s conviction that the Soviet Union was a place where a non-racist society was possible “sustained him for the rest of his political life” (p.202).   Although he never joined a communist party, from the 1930s onward Robeson accepted most of the party’s ideas and “loyally followed its doctrinal twists and turns” (p.215).  It is easy, Sparrow indicates, to see Robeson’s enthusiasm for the Soviet Union as the “drearily familiar tale of a gullible celebrity flattered by the attentions of a dictatorship” (p.199).

Sparrow wrestles with the question of the extent to which Robeson was aware of the Stalinist terror campaigns that by the late 1930s were taking the lives of millions of innocent Soviet citizens.  He provides no definitive answer to this question, but Robeson never wavered publicly in his support for the Soviet Union.  Had he acknowledged Soviet atrocities, Sparrow writes, he would have besmirched the “vision that had inspired him and all the people like him – the conviction that a better society was an immediate possibility” (p.264).

Robeson devoted himself to the Allied cause when the United States and the Soviet Union found themselves on the same side fighting Nazi aggression during World War II, “doing whatever he could to help the American government win what he considered an anti-fascist crusade” (p.190).  His passion for Soviet Russia “suddenly seemed patriotic rather than subversive” (p.196-97).  But that quickly changed during the intense anti-Soviet Cold War that followed the defeat of Nazi Germany.  Almost overnight in the United States, communist party members and their sympathizers became associated “not only with a radical political agenda but also with a hostile state.  An accusation of communist sympathies thus implied disloyalty – and possibly treason and espionage” (p.215).

The FBI, which had been monitoring Robeson for years, intensified its scrutiny in 1948.   It warned concert organizers and venue owners not to allow Robeson to perform “communist songs.”  If a planned tour went ahead, Sparrow writes, proprietors were told that they would be:

judged Red sympathizers themselves. The same operation was conducted in all the art forms in which Paul excelled.  All at once, Paul could no longer record music, and the radio would not play his songs.  Cinemas would not screen his movies. The film industry had already recognized that Paul was too dangerous; major theatres arrived at the same conclusion. The mere rumor that an opera company was thinking about casting him led to cries for a boycott.  With remarkable speed, Paul’s career within the country of his birth came to an end (p.216).

In 1950, the US State Department revoked Robeson’s passport after he declined to sign an affidavit denying membership in the Communist Party.  When Robeson testified before the House Un-American Activities Committee (HUAC) in 1956, a Committee member asked Robeson why he didn’t go back to the Soviet Union if he liked it so much.  Robeson replied: “Because my father was a slave . . . and my people died to build this country, and I am going to stay here, and have a part of it just like you.  And no fascist-minded people will drive me from it. Is that clear?” (p.228). Needless to say, this was not what Committee members wanted to hear, and Robeson’s remarks “brought the moral weight of the African-American struggle crashing down upon the session” (p.228-29).

Robeson was forced to stay on the sidelines in early 1956 when the leadership of the fledgling Montgomery bus boycott movement (which included a young Dr. Martin Luther King, Jr.) concluded that his presence would undermine the movement’s fragile political credibility.  On the other side of the Cold War divide, Soviet leader Nikita Khrushchev delivered a not-so-secret speech that winter to party loyalists in which he denounced Stalinist purges.   Sparrow hints but doesn’t quite say that Robeson’s exclusion from the bus boycott and Khrushchev’s acknowledgment of the crimes committed in the name of the USSR had a deleterious effect on Robeson’s inner well-being.   He had suffered from bouts of depression throughout his adult life, most notably when a love affair with an English actress in the 1930s ended badly (one of several Robeson extra-marital affairs). But his mental health deteriorated during the 1950s, with “periods of mania alternating with debilitating lassitude” (p.225).

Even after Robeson’s passport was restored in 1958 as a result of a Supreme Court decision, he never fully regained his former zest.  A broken man, he spent his final decade nearly invisible, living in his sister’s care before dying of a stroke in 1976.

* * *

                     Sparrow describes his book as something other than a conventional biography, more of a “ghost story” in which particular associations in the places he visited form an “eerie bridge” (p.5) between Robeson’s time and our own.  But his travels to the places where Robeson once lived and his interactions with his local guides have the effect of obscuring the full majesty and tragedy of Robeson’s life.  With too much attention given to Sparrow’s search for what remains of Robeson’s legacy on our side of the bridge, Sparrow’s part biography, part travel memoir comes up short in helping readers discover Robeson himself on the other side.

Thomas H. Peebles

Paris, France

October 21, 2019

6 Comments

Filed under American Society, Biography, European History, History, Politics, United States History