Reparations [4]:  The Essential Doubt

And so you see I have come to doubt
All that I once held as true

I stand alone without beliefs
The only truth I know is you.

Kathy’s Song[1]
Paul Simon

We saw last time that the U.S. government could waive its legal defense of sovereign immunity to pave the way for slavery reparations. It would take more than a legal reckoning for that to happen. Law lies on the surface of society, readily visible, but it has deep roots in history and ideology, national identity and mission, values and beliefs, ways of looking at the world and how life works.[2] These ancient root systems invoke fierce allegiances deeply embedded in the human psyche and culture. Because the legal doctrine of sovereign immunity is grounded in Biblical doctrine,[3] laying it aside requires doubt and dissent of the highest order – national treason and religious apostasy in a single act.

Doubt of that magnitude is rare beyond description but not without precedent. Consider, for example, Germany’s reparations for World War II, which required not only the international banishment of Nazism, but also the German people’s moral renunciation of Nazism’s philosophical and political roots stretching back to the 19th Century.[4] In comparison, the USA’s roots of slavery (and hence racism) extend back to the earliest New World settlements, which imported English common law, including the divine right of kings and its nationalistic version, sovereign immunity. Renouncing the latter to pave the way for slavery reparations would require a comparable American moral renunciation of centuries of related social, economic, and political ideology, and would set new terms for a post-racism American state.

That, in turn, would require a reckoning with the “first cause” roots of the divine right of kings and sovereign immunity.

The First Cause Roots of Sovereign Immunity

A “first cause” satisfies the human desire for life to make sense by assigning a cause to every effect. Trouble is, as you trace the cause and effect chain to its remotest origins, you eventually run out of causes, leaving you with only effects. That’s when a first cause comes to the rescue. A first cause has no prior cause – it is so primary that nothing came before it but everything came after it. Since knowledge can’t reach that far back, a first cause is a matter of belief:  you take it on faith, declare the beginning into existence, and go from there.

Western civilization’s worldview historically identified God as the ultimate first cause.

“The classic Christian formulation of this argument came from the medieval theologian St. Thomas Aquinas, who was influenced by the thought of the ancient Greek philosopher Aristotle. Aquinas argued that the observable order of causation is not self-explanatory. It can only be accounted for by the existence of a first cause; this first cause, however, must not be considered simply as the first in a series of continuing causes, but rather as first cause in the sense of being the cause for the whole series of observable causes.

“The 18th-century German philosopher Immanuel Kant rejected the argument from causality because, according to one of his central theses, causality cannot legitimately be applied beyond the realm of possible experience to a transcendent cause.

“Protestantism generally has rejected the validity of the first-cause argument; nevertheless, for most Christians it remains an article of faith that God is the first cause of all that exists. The person who conceives of God in this way is apt to look upon the observable world as contingent—i.e., as something that could not exist by itself.”[5]

God is the ultimate Sovereign from which all lesser sovereigns – the king, the national government — derive their existence and legitimacy. God’s first cause Sovereignty justifies God’s right to rule as God sees fit. The king and the state, having been set into place by God, derive a comparable right of domination from God. The king and the national government are to the people what God is to them.

The Divine Right of Kings

When kings ruled countries, their divine line of authority took legal form as the Divine Right of Kings.

“The divine right of kings, divine right, or God’s mandate is a political and religious doctrine of royal and political legitimacy. It stems from a specific metaphysical framework in which the king (or queen) is pre-selected as an heir prior to their birth. By pre-selecting the king’s physical manifestation, the governed populace actively (rather than merely passively) hands the metaphysical selection of the king’s soul – which will inhabit the body and thereby rule them – over to God. In this way, the ‘divine right’ originates as a metaphysical act of humility or submission towards the Godhead.

“Consequentially, it asserts that a monarch (e.g. a king) is subject to no earthly authority, deriving the right to rule directly from divine authority, like the monotheist will of God. The monarch is thus not subject to the will of his people, of the aristocracy, or of any other estate of the realm. It implies that only divine authority can judge an unjust monarch and that any attempt to depose, dethrone or restrict their powers runs contrary to God’s will and may constitute a sacrilegious act.”[6]

The Divine Right of Kings was a favorite doctrine of the first King James of England, who commissioned what would become the King James Version of the Bible partly in response to Puritan challenges to the Church of England’s doctrine of an ordained clergy that could trace its lineage to the original Apostles.

“Divine right of kings, in European history, a political doctrine in defense of monarchical ‘absolutism,’ which asserted that kings derived their authority from God and could not therefore be held accountable for their actions by any earthly authority such as a parliament. Originating in Europe, the divine-right theory can be traced to the medieval conception of God’s award of temporal power to the political ruler, paralleling the award of spiritual power to the church. By the 16th and 17th centuries, however, the new national monarchs were asserting their authority in matters of both church and state. King James I of England (reigned 1603–25) was the foremost exponent of the divine right of kings….”[7]

“While throughout much of world history, deified potentates have been the rule, in England, absolute monarchy never got a solid foothold, but there certainly was the attempt. Elements of British political theory and practice encouraged absolutism—the idea and practice that the king is the absolute law and that there is no appeal beyond him. Several movements and ideas hurried along the idea of absolute monarchy in England. One of those ideas was the divine right of kings.

“In England, the idea of the divine right of kings will enter England with James VI of Scotland who will come and rule over both England and Scotland as James I in 1603 and will commence the line of several ‘Stuart’ monarchs. James had definite ideas about his role as monarch, and those ideas included the divine right of kings. Here are just a few of James’ statements that reflect his view that he ruled by divine right:

      • Kings are like gods— “…kings are not only God’s lieutenants upon earth, and sit upon God’s throne, but even by God himself are called gods.”
      • Kings are not to be disputed— “… That as to dispute what God may do is blasphemy….so is it sedition in subjects to dispute what a king may do in the height of his power.”
      • Governing is the business of the king, not the business of the subjects— “you do not meddle with the main points of government; that is my craft . . . to meddle with that were to lesson me . . . I must not be taught my office.”
      • Kings govern by ancient rights that are his to claim— “I would not have you meddle with such ancient rights of mine as I have received from my predecessors . . . .”
      • Kings should not be bothered with requests to change settled law— “…I pray you beware to exhibit for grievance anything that is established by a settled law…”
      • Don’t make a request of a king if you are confident he will say “no.”— “… for it is an undutiful part in subjects to press their king, wherein they know beforehand he will refuse them.”

“James’ views sound egotistical to us today, but he was not the only one that held them. These views were held by others, even some philosophers. For example, the English philosopher Thomas Hobbes wrote a work called Leviathan in 1651 in which he said that men must surrender their rights to a sovereign in exchange for protection. While Hobbes was not promoting the divine right of kings per se, he was providing a philosophy to justify a very strong absolute ruler, the kind that the divine right of kings prescribes. Sir Robert Filmer was a facilitator of the divine right of kings and wrote a book about it called Patriarcha (1660) in which he said that the state is like a family and that the king is a father to his people. Filmer also says that the first king was Adam and that Adam’s sons rule the nations of the world today. So, the King of England would be considered the eldest son of Adam in England or the King of France would be Adam’s eldest son in France.”[8]

King James, Witch Hunter

King James had no impartial academic interest in a Bible translation that supported his divine right:  during his reign, the “Cradle King” accumulated a long list of covered offenses that included mass murder, torture, injustice, treachery, cruelty, and misogyny.

“The witch-hunts that swept across Europe from 1450 to 1750 were among the most controversial and terrifying phenomena in history – holocausts of their times. Historians have long attempted to explain why and how they took such rapid and enduring hold in communities as disparate and distant from one another as Navarre and Copenhagen. They resulted in the trial of around 100,000 people (most of them women), a little under half of whom were put to death.

“One of the most active centres of witch-hunting was Scotland, where perhaps 4,000 people were consigned to the flames – a striking number for such a small country, and more than double the execution rate in England. The ferocity of these persecutions can be attributed to the most notorious royal witch-hunter: King James VI of Scotland, who in 1603 became James I of England.

“Most of the suspects soon confessed – under torture – to concocting a host of bizarre and gruesome spells and rituals in order to whip up the storm.… James was so appalled when he heard such tales that he decided to personally superintend the interrogations… while the king looked on with ‘great delight’.

“James’s beliefs had a dangerously misogynistic core. He grew up to scorn – even revile – women. Though he was by no means alone in his view of the natural weakness and inferiority of women, his aversion towards them was unusually intense. He took every opportunity to propound the view that they were far more likely than men to succumb to witchcraft…. He would later commission a new version of the Bible in which all references to witches were rewritten in the female gender.

“Most witchcraft trials constituted grave miscarriages of justice…. If the actual facts of a case were unsatisfactory, or did not teach a clear enough moral lesson, then they were enhanced, added to or simply changed.”[9]

When the new King James Bible substantiated the King’s divine right to carry on these activities, and when the USA imported the king’s divine right into its legal system as sovereign immunity, both acknowledged God as the first cause of these legal doctrines. Like the King, the U.S. government also has a long list of covered offenses:  the treatment of slaves during the reign of legal slavery mirrors King James’ obsession with brutalizing, lynching, and murdering witches.

In the U.S., where a 2019 Gallup Poll found that 64% – 87% of Americans believe in God (depending on how the question was asked), there remain many “Christians [for whom] it remains an article of faith that God is the first cause of all that exists.”[10] As a result, we see in the USA’s current social and political climate both explicit and implicit affirmation of the following Bible passages (which the online source appropriately expresses in the King James version) to substantiate the ability of national leaders to avoid accountability for acts of governance that sponsor this kind of horrifying treatment of citizens:[11]

“Let every soul be subject unto the higher powers. For there is no power but of God: the powers that be are ordained of God. Whosoever therefore resisteth the power, resisteth the ordinance of God: and they that resist shall receive to themselves damnation. For rulers are not a terror to good works, but to the evil. Wilt thou then not be afraid of the power? do that which is good, and thou shalt have praise of the same: For he is the minister of God to thee for good. But if thou do that which is evil, be afraid; for he beareth not the sword in vain: for he is the minister of God, a revenger to execute wrath upon him that doeth evil. Wherefore ye must needs be subject, not only for wrath, but also for conscience sake.” Romans 13:1-5, KJV

“Lift not up your horn on high: speak not with a stiff neck. For promotion cometh neither from the east, nor from the west, nor from the south. But God is the judge: he putteth down one, and setteth up another.” Psalms 75:5-7, KJV

“Daniel answered and said, Blessed be the name of God for ever and ever: for wisdom and might are his: And he changeth the times and the seasons: he removeth kings, and setteth up kings: he giveth wisdom unto the wise, and knowledge to them that know understanding:” Daniel 2:20-21, KJV

“This matter is by the decree of the watchers, and the demand by the word of the holy ones: to the intent that the living may know that the most High ruleth in the kingdom of men, and giveth it to whomsoever he will, and setteth up over it the basest of men.” Daniel 4:17, KJV

“I have made the earth, the man and the beast that are upon the ground, by my great power and by my outstretched arm, and have given it unto whom it seemed meet unto me.” Jeremiah 27:5, KJV

“The king’s heart is in the hand of the LORD, as the rivers of water: he turneth it whithersoever he will.” Proverbs 21:1, KJV

“For rebellion is as the sin of witchcraft, and stubbornness is as iniquity and idolatry. Because thou hast rejected the word of the LORD, he hath also rejected thee from being king. And Saul said unto Samuel, I have sinned: for I have transgressed the commandment of the LORD, and thy words: because I feared the people, and obeyed their voice. Now therefore, I pray thee, pardon my sin, and turn again with me, that I may worship the LORD. And Samuel said unto Saul, I will not return with thee: for thou hast rejected the word of the LORD, and the LORD hath rejected thee from being king over Israel.” 1 Samuel 15:23-26, KJV

“And upon a set day Herod, arrayed in royal apparel, sat upon his throne, and made an oration unto them. And the people gave a shout, saying, It is the voice of a god, and not of a man. And immediately the angel of the Lord smote him, because he gave not God the glory: and he was eaten of worms, and gave up the ghost.” Acts 12:21-23, KJV

The Ultimate Focus of Doubt:  God

In “Abrahamic” cultures — Jewish, Muslim, and Christian – the Biblical God is the first cause of the divine right of kings and sovereign immunity. The full force of patriotic nationalism and religious zeal therefore originates with God – which explains why a surprising number of European nations had blasphemy laws on the books until not that long ago, and why some nations still do.[12]

“Blasphemy is the act of insulting or showing contempt or lack of reverence to a deity, or sacred objects, or toward something considered sacred or inviolable.”[13]

God, it seems, like kings and sovereign nations, has much to be excused from. Aside from the Biblical God’s sponsorship of war, genocide, mass murder, rape, torture, and brutality to humans and animals, a list of modern labels would include misogynist, homophobe, and xenophobe. But of course you don’t think that way if you’re a believer, because that would be blasphemy, often punishable by death, often after the infliction of the kind of cruel and unusual punishment reserved for the faithful and unfaithful alike. As for the latter, the Bible makes it a badge of honor for the faithful to suffer in the name of God:

“Some were tortured, refusing to accept release, so that they might rise again to a better life. Others suffered mocking and flogging, and even chains and imprisonment. They were stoned, they were sawn in two, they were killed with the sword. They went about in skins of sheep and goats, destitute, afflicted, mistreated—of whom the world was not worthy—wandering about in deserts and mountains, and in dens and caves of the earth. And all these, though commended through their faith, did not receive what was promised.” Hebrews 11:35-39, ESV

Transformation Made Possible by Doubt

Nonbelievers not vexed with these kinds of rights of the sovereign and duties of the governed are free to doubt God’s first cause status and its derivative doctrines, laws, and policies. In the USA, doubt embraced on that level would open the door to any number of contrary beliefs – for example:

    • The state does not enjoy superior status — historically, legally, morally, or otherwise – that gives it a right to act without consequence.
    • The people governed are therefore not bound – theologically, morally, or otherwise – to submit to government that is not responsible for its actions.

Once you’re no longer worried about breaking faith with God as the first cause of your national institutional structure, a whole new “social contract” (also discussed last time) between government and the people becomes possible – a contract that would, in effect, not be satisfied with paying only descendants of slaves “damages” for past harm, but would look to establish a fresh national vision of the duties of those who govern and the rights and freedoms of the governed. The result, it would seem, is the possibility of ending the USA’s institutionalized racism for good.

[1] Who was Paul Simon’s Kathy? And whatever happened to her? See this article from The Guardian.

[2] See the Belief Systems and Culture category of posts in my Iconoclast.blog.

[3] The Founding Myth: Why Christian Nationalism Is Un-American, Andrew L. Seidel (2019). Although the USA was not founded as a Christian nation, its core values and beliefs, like those of other Western countries, are Classical and Biblical in origin.

[4]  See Alpha History and The Mises Institute on the historical origins of Nazism.

[5]  Encyclopedia Britannica. See also New World Encyclopedia and the Stanford Dictionary of Philosophy.

[6] Wikipedia – The Divine Right of Kings.

[7] Encyclopedia Britannica and Wikipedia. See also the New World Encyclopedia.

[8] Owlcation

[9] Borman, Tracy, “James VI and I: The King Who Hunted Witches,” History Extra (BBC History Magazine) (March 27, 2019)

[10]  Encyclopedia Britannica. See also New World Encyclopedia and the Stanford Dictionary of Philosophy.

[11] Bill’s Bible Basics.

[12]  Wikipedia – Blasphemy law.

[13]  Wikipedia – Blasphemy.

America’s National Character, Revealed in its COVID-19 Response

“The entire man is… to be seen in the cradle of the child. The growth of nations presents something analogous to this; they all bear some marks of their origin. If we were able to go back… we should discover… the primal cause of the prejudices, the habits, the ruling passions, and, in short, all that constitutes what is called the national character.”

Alexis de Tocqueville, Democracy in America (1835)

“Begin as you would continue,” my new mother-in-law told my bride and me. Her advice was good beyond gold – a standard we return to in every new beginning, of which there’ve been many in 40+ years.

Alexis de Tocqueville didn’t offer the principle as advice, he recognized its operation in the America he famously toured and wrote about – a nation shaping itself around its founding principles – its “primal cause.” A country’s “national character,” he said, is revealed in the “prejudices,” “habits,” and “ruling passions” of the government and the people. The specifics may shift over time as certain founding values prevail over others due to political tradeoffs and changing circumstances, but in the long haul the country stays true to its origins. Countries, like marriages, continue as they began.

The same dynamics that apply to individuals and nations also apply to institutions, for example societal institutions of law, economics, academics, and commercial enterprise. And for all of them, there’s no such thing as a single beginning to be sustained forever. Personal, national, and institutional histories are shaped around many beginnings and endings. With every new beginning comes an invitation to return to “primal causes” and accept the transformation of historical into contemporary; i.e., each path forward requires a fresh look at how the past’s wisdom can help navigate today’s unprecedented challenges. Trouble is, transformation is perhaps the most difficult thing asked of a person, relationship, institution, nation. The opportunity to transform is therefore rarely recognized, much less embraced, but without it there will be hardening into what was but no longer is, and soon the person or entity under stress will fray under the strain of forcing the fluidity of today into the memory of yesterday.

The Covid-19 Policy-Making Triumvirate

Covid-19 has brought the entire world to an inescapable threshold of new beginning, with its commensurate invitation to transformation. America’s response reveals no embrace of the invitation, but rather a doubling down on the pre-pandemic version of a currently predominant ideological triumvirate of values.[1] Other “prejudices,” “habits,” and “ruling passions” of the “national character” are clearly evident in the nation’s response as well, but I chose to write about this triumvirate because I’ve previously done so here and in my other blog.[2] The three prongs of the triumvirate we’ll look at today are as follows:

  1. Freemarketism: a hyper-competitive and hyper-privatized version of capitalism that enthrones individual and corporate agency over the centralized promotion of the public good.

Freemarketism is grounded in a belief that marketplace competition will not only prosper capitalists but also promote individual and communal welfare in all social and economic strata. Its essential prejudices and practices are rooted in the transmutation of the western, mostly Biblical worldview into the Protestant work ethic, which judges individual good character and communal virtue by individual initiative and success in “working for a living” and the ability to climb the upward mobility ladder. The state’s highest good is to sponsor a competitive market in which capitalists, freed from governmental regulation and taxation, will build vibrant businesses, generate wealth for themselves as a reward, and activate corollary ”trickle down” benefits to all. Granting the public good an independent seat at the policy-making table is considered detrimental to the market’s freedom.

Freemarketism skews Covid-19 relief toward business and charges the state with a duty to restore “business as usual” as quickly as possible. Direct benefit to citizens is considered only grudgingly, since it would encourage bad character and bad behavior among the masses. Particularly, it would destroy their incentive and willingness to work for a living. The employable populace must be kept hungry, on-edge, primed to get back to work in service to the capitalist engine that fuels the greater good of all.

  2. Beliefism: The denigration of science and intellect in favor of a form of secular post-truth fundamentalism.

Freemarketism is a belief system that emerged in the 1980’s, after the first three decades of post-WWII economic recovery played out in the 1970’s. Freemarketism addressed the economic malaise with its utopian promise of universal benefit, and its founders promoted it with religious zeal as a new economic science – the rationale being that it had been “proven” in ingenious, complex mathematical models. But math is not science, and however elegant its proofs of Freemarketism theory might have been, they were not the same as empirical testing. Freemarketism was therefore a new economic belief system — something you either believed or didn’t.

To gain widespread political and social acceptance, Freemarketism would need to displace the Keynesian economics that had pulled the U.S. out of the Great Depression of the 1930’s through massive federal investment in infrastructure, the creation of new social safety nets, and the regulation of securities markets. During the post-WWII recovery, Keynesian economic policy had struck its own balance between private enterprise and government intervention, creating both new commercial monoliths and a vibrant middle class. Freemarketism would eventually swing this balance entirely to the side of private enterprise. It did so thanks in part to auspicious timing. At the dawn of the 1980’s, after a decade of Watergate, the oil embargo and energy crisis, runaway inflation, and the Iran hostage crisis, America was ripe for something to believe in. Its morale was suddenly boosted by the USA’s stunning Olympic hockey gold medal. Then, at the end of the decade, came the equally stunning collapse of the Soviet Union, heralded by Chernobyl and the fall of the Berlin Wall. These two bookend events ensured that Freemarketism had made a beginning that politicians and the populace wished to continue.

By then, Soviet-style Communism had been fully exposed as a horrific, dystopian, failed system. It had begun with Karl Marx’s angry empathy for the plight of the working stiff, but a century and a half later had morphed into a tyranny of fear, mind control, and brutality that turned its nominal beneficiaries into its victims, administered by a privileged, unthinking, corrupt, emotionally and morally paralyzed class of party bosses. When the failed system met its just deserts, the West’s storyline trumpeted that capitalism had won the Cold War. Freemarketism stepped up to receive the accolades, and its political devotees set about dismantling the social structures Keynesian economics had built before WWII.

From that point, as Freemarketism gained acceptance, it stomped the throttle toward fundamentalism, which is where every belief system, whether religious or secular, must inevitably end up. Belief by its very nature demands its own purification – the rooting out of doubt. To endure, belief must become irrefutable, must become certain to the point where doubt and discourse are demonized, conformity becomes the greatest social good, and ideological myths become determinants of patriotic duty and moral status. Accordingly, as Freemarketism evangelists increasingly installed their privatized solutions, any system of government based on state-sponsored promotion of the common good was quickly characterized as a threat of a resurgence of Communism. In the minds of Freemarketers – both priests and proles – the European social democracies were thrown into the same toxic waste dump as Communism, because the state could never again be trusted to know what is good for its citizens, or be given the power to carry out its agenda.

Freemarketism’s blind spot is now obvious: for all its demonization of government policy, it relied on precisely that to create the conditions it needed to operate. Politicians from the 1990’s forward were happy to comply. Thus empowered, in the four decades since its inception, Freemarketism has ironically failed in the same manner as Soviet Communism, gutting the public good of the working masses and protectively sequestering the wealthy capitalist classes. Along the way, Beliefism as the cultural norm has displaced scientific rationalism with moment-by-moment inanity, expressed in the Covid-19 crisis by everything from drinking bleach to mask and supply shortages, lockdown protests and defiance of mask-wearing, terminating support of the World Health Organization, confusion and skepticism about statistics of infection rates and the value of mass testing, the public undercutting of medical authorities, and much more.

The post-truth flourishing of Beliefism is in turn held in place by the third prong of the triumvirate:

  3. Militarism: The American infatuation with military might and private armaments, and a proclivity towards resolving disputes and achieving policy outcomes through bullying, violence, and warfare.

Militarism is the enforcer for the other two prongs of the triumvirate. Its status as a pillar of the national character is on the one hand entirely understandable, given that the USA was formed because the colonists won their war, but on the other hand perhaps the most ideologically inexplicable when measured against the Founders’ rejection of a standing military in favor of a right to mobilize an armed militia as needed. The displacement of the latter by the former was fully complete only after WWII, grudgingly acknowledged by the General who masterminded the D-Day invasion: “In the councils of government,” President Eisenhower said on the eve of leaving office, “we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military industrial complex.” He further warned that, “Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty may prosper together.”

The extent to which General Eisenhower’s warnings fell on deaf ears is by now obvious. Meanwhile, the Founders’ concept of the right to bear arms has metastasized into an absolute right to private armaments. The American national character now rests secure in its confidence that it has a big enough stick to forever defend its libertarian version of individual freedoms – including the freedoms of the marketplace – against all opposing beliefs, Communist or otherwise.

Militarism is evident in developments both expressly directed at the pandemic and coinciding with it, spanning macro and micro responses: saber-rattling against Iran (against whom we apparently still feel we have a score to settle), blame-shifting against China accompanied by rhetoric that has quickly escalated to the level of a new Cold War, Congress’s self-congratulatory passage of another record-setting defense budget, and armed militias rallying against the lockdown and supporting protestors in their belligerent non-compliance.

In its Covid-19 response, America put its money where its mouth (ideology) is.

This ideological triumvirate is evident in the spending priorities of the USA’s legislative allocations during the lockdown, as indicated in the following two graphs, which reveal that:

  1. The amount directed to business – mostly big business – was twice as much as the defense budget;
  2. The amount directed to healthcare – during a pandemic – was least of all, half the amount directed to individuals;
  3. The 2020 defense budget approved during the lockdown was twice the size of the amount directed to individual citizens under the CARES relief act; and
  4. Meanwhile, defense spending dwarfs that of our seven nearest national “competitors.”

The Anatomy of the $2 Trillion COVID-19 Stimulus Bill[3]


U.S. Defense Spending Compared to Other Countries[4]


Character Over Time

“True character is revealed in the choices a human being makes under pressure,” screenwriting guru Robert McKee wrote, “the greater the pressure, the deeper the revelation, the truer the choice to the character’s essential nature.”[5]

Pressure of the magnitude brought on by the pandemic catches national response off guard. It freezes time and forces instant responses to unprecedented demands. Pretense falls off; values and priorities leap from foundation to forefront. There is no time for analysis or spin, only the unguarded release of words and actions in the pressing moment. The result is national character, fully revealed.

The way out of this dizzying spiral is to embrace the invitation to character transformation, which begins in the awareness that something essential to maintaining the status quo has been lost, life has irreversibly changed, an ending has been reached. Every ending requires a new beginning, every new beginning requires a vision for how to continue, and every vision for continuing requires the perspective of newly-transformed character. If there is going to be systemic change, character must be the one to make concessions. The nation’s policy-makers made no such concession in their Covid-19 response.

Response Without Transformation

We’ve spent a few years in this forum discovering the triumvirate’s development and contemporary dominance of government policy-making, which in turn has been supported by enough of the electorate to keep the system in place. Now, the pandemic has put our “more perfect union” under extraordinary stress.

Given the recent racial issues now dominating the headlines, it isn’t far-fetched to compare the pandemic’s moral and legal challenges to those of the Civil War. Today’s post won’t try to do that topic justice, but it’s worth noting that slavery was a dominant economic force from before America became the United States, especially buttressing the capitalist/entrepreneurial wealth generated in tobacco and cotton, and was both expressly and implicitly adopted as a social, economic, and national norm – for example, in the U.S. Constitution’s denial of the vote to slaves and its provision that each slave count as three-fifths of a person for purposes of apportioning seats in the House of Representatives. These “primary causes” remained intact for the nation’s first several decades, until a variety of pressures forced a reconsideration and transformation. Those pressures included, for example, a bubble in the pre-Civil War slave market that made slaves themselves into a valuable equity holding to be bought and sold for profit — a practice particularly outrageous to Northerners.[6]

The Covid-19 triumvirate is not Constitutionally recognized as slavery was, but clearly it is based on the current emphasis of certain aspects of the USA’s foundations to the exclusion of others. Many economists argue, for example, that the way out of the deepening pandemic economic depression is a return to a Keynesian-style massive governmental investment in public works and welfare – a strategy that even then was hugely controversial for the way it aggressively rebalanced the national character. The Covid-19 response, along with the military budget, makes no attempt at such a rebalancing – which, among other things, would require policy-makers to retreat from the common assumption that government support of the public good is Communism.

It took a Civil War and three Constitutional Amendments to remove nationalized slavery from the Constitution and begin the transformation of the nation’s character on the topic of race – a transformation which current events reveal is still sadly incomplete.

What would it take to similarly realign the national character in response to the pandemic?

[1] Since we’ve been discovering and examining these for several years in this forum, in this post I’m going to depart from my usual practice of quoting and citing sources. To do otherwise would have made this post far too redundant and far too long.

[2] My two blogs are The New Economy and the Future of Work and Iconoclast.blog. Each has its counterpart on Medium – The Econoclast and Iconoclast.blog (recent articles only).

[3] Visualcapitalist.com

[4] Peter G. Peterson Foundation.

[5] McKee, Robert, Story: Substance, Structure, Style, and the Principles of Screenwriting (1997).

[6] See the analysis in Americana: A 400-Year History of American Capitalism, Bhu Srinivasan (2017), and the author’s interview with the Wharton business school.

Horatio Alger is Dead, America Has a New Class Structure, and it’s Not Your Fault


January 23, 2020

The member of the month at the gym where I work out is a guy who looks like he’s in his early 20’s. One of the “get to know me” questions asks “Who motivates you the most?” His answer: “My dad, who taught me that hard work can give you anything, as long as you can dedicate time and effort.”

The answer is predictably, utterly American. “Hard work can give you anything” — yes of course, everybody knows that. Parents tell it to their kids, and the kids believe it. America is the Land of Opportunity; it gives you every chance for success, and now it’s up to you. “Anything you want” is yours for the taking – and if you don’t take it, that’s your problem, not America’s.

Except it’s not true, and we know that, too. We know that you can work really, really hard and dedicate lots and lots of time and effort (and money), and still not get what you want.

Why do we keep saying and believing something that isn’t true? Why don’t we admit that things don’t actually work that way? Because that would be un-American. So instead we elevate the boast: America doesn’t just offer opportunity, it gives everybody equal opportunity — like Teddy Roosevelt said:

“I know perfectly well that men in a race run at unequal rates of speed.
I don’t want the prize given to the man who is not fast enough to win it on his merits, but I want them to start fair.”

Equal opportunity means everybody starts together. No, not everybody wins, but still… no matter who you are or where you’re from, everybody has the same odds. None of that landed gentry/inherited wealth class system here.

Except that’s not true either, and we know that, too.

But we love the equal opportunity myth. We love the feeling of personal power – agency, self-efficacy – it gives us. It’s been grooved into our American neural circuits since the beginning:

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed.”[1]

We’re all equals here in America, divinely ordained to pursue the good life. That’s our creed, and we – “the governed” — declare that we believe it.

Even if it’s not true.

Equal opportunity is a foundational American cultural belief. Cultural myths are sacred – they’re afforded a special status that makes them off limits to examination. And national Founding Myths get the highest hands-off status there is.

Never mind that the Sacred doesn’t seem to mind being doubted – it’s the people who believe something is sacred you have to watch out for. And never mind that history and hindsight have a way of eventually outing cultural myths – exposing them as belief systems, not absolute truths. But it’s too late by the time history has its say: the fraud is perpetrated in the meantime, and attempts to expose it are shunned and punished as disloyal, unpatriotic, treasonous.

If we can’t out the myth, what do we do instead? We blame ourselves. If we don’t get “anything you want,” then we confess that we didn’t work hard enough, didn’t “dedicate the time and effort,” or maybe we did all that but in the wrong way or at the wrong time. Guilt, shame, embarrassment, frustration, depression… we take them all on as personal failings, in the name of preserving the myth.

You may have seen the Indeed commercial. (Go ahead, click it – it’s only 30 seconds.)


It brilliantly taps the emotional power of the equal opportunity myth.

“With no choice but to move back home after college, they thought he’d be a little more motivated to find a job.”

The kid is glued to his phone, and it’s driving his parents crazy. He’s obviously a slacker, a freeloader. Household tensions mount. The phone dings at the dinner table. Dad snatches it up.

“Turns out, they were right.”

He’s using it to find a job! Faith and family harmony restored! That’s our hard-working boy!

Heartwarming, but still untrue.

But What About the Strong Job Numbers?

Yes, unemployment is low. But consider this analysis of those numbers[2], just out this month:

“Each month, the Bureau of Labor Statistics releases its Employment Situation report (better known as the ‘jobs report’) to outline the latest state of the nation’s economy. And with it, of late, have been plenty of positive headlines—with unemployment hovering around 3.5%, a decade of job growth, and recent upticks in wages, the report’s numbers have mostly been good news.

“But those numbers don’t tell the whole story. Are these jobs any good? How much do they pay? Do workers make enough to live on?

“Here, the story is less rosy.

“In a recent analysis, we found that 53 million workers ages 18 to 64—or 44% of all workers—earn barely enough to live on. Their median earnings are $10.22 per hour, and about $18,000 per year. These low-wage workers are concentrated in a relatively small number of occupations, including retail sales, cooks, food and beverage servers, janitors and housekeepers, personal care and service workers (such as child care workers and patient care assistants), and various administrative positions.

“Just how concerning are these figures? Some will say that not all low-wage workers are in dire economic straits or reliant on their earnings to support themselves, and that’s true. But as the following data points show, it would be a mistake to assume that most low-wage workers are young people just getting started, or students, or secondary earners, or otherwise financially secure:

      • Two-thirds (64%) of low-wage workers are in their prime working years of 25 to 54.
      • More than half (57%) work full-time year-round, the customary schedule for employment intended to provide financial security.
      • About half (51%) are primary earners or contribute substantially to family living expenses.
      • Thirty-seven percent have children. Of this group, 23% live below the federal poverty line.
      • Less than half (45%) of low-wage workers ages 18 to 24 are in school or already have a college degree.

“These statistics tell an important story: Millions of hardworking American adults struggle to eke out a living and support their families on very low wages.”

When the kid got a text at the dinner table, it was about one of these jobs. Mom and Dad better get used to the idea that he’ll be around for a while. Even if he gets that job, it won’t offer benefits, could end at any moment, and won’t pay him enough to be self-sustaining. That’s not how Mom and Dad were raised or how things went for them, but that’s how the economy works nowadays.

Economics Begets Social Structure

The even bigger issue is that the equal opportunity myth has become a social norm: uber-competitive free market economics controls the collective American mindset about how adult life works, to the point that it’s become a nationalist doctrine.

The Chicago School of Economics – the Vatican of free marketism – believed so ardently in its own doctrines that its instructional approach took on the dynamics of fundamentalist indoctrination:

“Frank Knight, one of the founders of Chicago School economics, thought professors should ‘inculcate’ in their students the belief that economic theory is ‘a sacred feature of the system,’ not a debatable hypothesis.”[3]

Free market ideology preaches that capitalism promotes both economic and social opportunity. It has had the past four decades to prove that claim, and has failed as spectacularly as Soviet-style communism failed to benefit the workers it was supposed to redeem. Instead, free market ideology has given America what it wasn’t ever supposed to have: a stratified socio-economic class system that skews rewards to the top 10% and leaves the rest in the grip of the dismal statistics listed above.

But we don’t see that – or if we do, we don’t say anything about it; we just keep reciting the “trickle down” mantra. Member of the month and his dad and the parents in the Indeed commercial and most Americans still believe the myth. Ironically, the ones who see through it are the top 10% members who got in before they closed the gates. Meanwhile, the lower 90% – the decimated middle class, the new poor, the hard-working wage-earners – keep blaming themselves.

Even though it’s not their fault. If the kid in the commercial can’t find a job to support himself, it’s not his fault.

“I can’t pay my bills, afford a house, a car, a family. I can’t afford healthcare, I have no savings. Retirement is a joke. I don’t know how I’ll ever pay off my student loans. I live paycheck to paycheck. I’m poor. But it’s not my fault.”

Try saying that to Dad at the dinner table.

But unlike “anything you want,” “it’s not your fault” is true: current economic policy and its companion social norms do not deliver equal opportunity. Horatio Alger is dead, but the equal opportunity myth lives on life support as we teach it to our children and elect politicians who perpetuate it, while all of us ignore the data.

Horatio Alger is Dead

There’s no more enduring version of the upward mobility ideal than the rags-to-riches story codified into the American Dream by Horatio Alger, Jr. during the Gilded Age of Andrew Mellon, John D. Rockefeller, Cornelius Vanderbilt, Andrew Carnegie, and the rest of the 19th Century Robber Barons. If they can do it, so can the rest of us, given enough vision, determination, hard work, and moral virtue — that was Alger’s message. Except it never worked that way, especially for the Robber Barons – opportunists aided by collusion and cronyism carried out in the absence of the antitrust and securities laws that would be enacted under the New Deal after history revealed the fraud.[4]

But never mind that — according to Roughrider Teddy and politicians like him, government’s job is to guarantee equal opportunity for all, then get out of the way and let the race to riches begin. Thanks to our devotion to that philosophy, a fair start has become a thing of the past — so says Richard V. Reeves in his book Dream Hoarders.

Reeves begins by confessing that his disenchantment over the demise of the Horatio Alger ideal will no doubt seem disingenuous because he didn’t grow up American and is now a member of the economic elite himself:

“As a Brookings senior fellow and a resident of an affluent neighborhood in Montgomery County, Maryland, just outside of DC, I am, after all, writing about my own class.

“I am British by birth, but I have lived in the United States since 2012 and became a citizen in late 2016. (Also, I was born on the Fourth of July.) There are lots of reasons I have made America my home. But one of them is the American ideal of opportunity. I always hated the walls created by social class distinctions in the United Kingdom. The American ideal of a classless society is, to me, a deeply attractive one. It has been disheartening to learn that the class structure of my new homeland is, if anything, more rigid than the one I left behind and especially so at the top.

“My new country was founded on anti-hereditary principles. But while the inheritance of titles or positions remains forbidden, the persistence of class status across generations in the United States is very strong. Too strong, in fact, for a society that prides itself on social mobility.”

Reeves also wrote a Brookings Institution monograph called Saving Horatio Alger: Equality, Opportunity, and the American Dream, in which he said the following:

“Vivid stories of those who overcome the obstacles of poverty to achieve success are all the more impressive because they are so much the exceptions to the rule. Contrary to the Horatio Alger myth, social mobility rates in the United States are lower than in most of Europe. There are forces at work in America now — forces related not just to income and wealth but also to family structure and education – that put the country at risk of creating an ossified, self-perpetuating class structure, with disastrous implications for opportunity and, by extension, for the very idea of America.

“The moral claim that each individual has the right to succeed is implicit in our ‘creed,’ the Declaration of Independence, when it proclaims ‘All men are created equal.’

“There is a simple formula here — equality plus independence adds up to the promise of upward mobility — which creates an appealing image: the nation’s social, political, and economic landscape as a vast, level playing field upon which all individuals can exercise their freedom to succeed.

“Many countries support the idea of meritocracy, but only in America is equality of opportunity a virtual national religion, reconciling individual liberty — the freedom to get ahead and “make something of yourself” — with societal equality. It is a philosophy of egalitarian individualism. The measure of American equality is not the income gap between the poor and the rich, but the chance to trade places.

“The problem is not that the United States is failing to live up to European egalitarian principles, which use income as a measure of equality. It is that America is failing to live up to American egalitarian principles, measured by the promise of equal opportunity for all, the idea that every child born into poverty can rise to the top.”

There’s a lot of data to back up what Reeves is saying. See, e.g., this study from Stanford, which included these findings:

“Parents often expect that their kids will have a good shot at making more money than they ever did…. But young people entering the workforce today are far less likely to earn more than their parents when compared to children born two generations before them, according to a new study by Stanford researchers.”

The New American Meritocracy

Along with Richard Reeves, philosopher Matthew Stewart and entrepreneur Steven Brill cite the same economic and related social data to support their conclusion that the new meritocrat socio-economic class has barred the way for the rest of us. I’ll let Matthew Stewart speak for the others[5]:

“I’ve joined a new aristocracy now, even if we still call ourselves meritocratic winners. To be sure, there is a lot to admire about my new group, which I’ll call—for reasons you’ll soon see—the 9.9 percent. We’ve dropped the old dress codes, put our faith in facts, and are (somewhat) more varied in skin tone and ethnicity. People like me, who have waning memories of life in an earlier ruling caste, are the exception, not the rule.

“By any sociological or financial measure, it’s good to be us. It’s even better to be our kids. In our health, family life, friendship networks, and level of education, not to mention money, we are crushing the competition below.

“The meritocratic class has mastered the old trick of consolidating wealth and passing privilege along at the expense of other people’s children. We are not innocent bystanders to the growing concentration of wealth in our time. We are the principal accomplices in a process that is slowly strangling the economy, destabilizing American politics, and eroding democracy. Our delusions of merit now prevent us from recognizing the nature of the problem that our emergence as a class represents. We tend to think that the victims of our success are just the people excluded from the club. But history shows quite clearly that, in the kind of game we’re playing, everybody loses badly in the end.

“So what kind of characters are we, the 9.9 percent? We are mostly not like those flamboyant political manipulators from the 0.1 percent. We’re a well-behaved, flannel-suited crowd of lawyers, doctors, dentists, mid-level investment bankers, M.B.A.s with opaque job titles, and assorted other professionals—the kind of people you might invite to dinner. In fact, we’re so self-effacing, we deny our own existence. We keep insisting that we’re ‘middle class.’

“One of the hazards of life in the 9.9 percent is that our necks get stuck in the upward position. We gaze upon the 0.1 percent with a mixture of awe, envy, and eagerness to obey. As a consequence, we are missing the other big story of our time. We have left the 90 percent in the dust—and we’ve been quietly tossing down roadblocks behind us to make sure that they never catch up.”

Two Stories, One Man

In a remarkable display of self-awareness and historical-cultural insight, Stanford professor David Labaree admits that his own upward mobility story can be told two ways — one that illustrates the myth and one that doesn’t, depending on your point of view.[6]

“Occupants of the American meritocracy are accustomed to telling stirring stories about their lives. The standard one is a comforting tale about grit in the face of adversity – overcoming obstacles, honing skills, working hard – which then inevitably affords entry to the Promised Land. Once you have established yourself in the upper reaches of the occupational pyramid, this story of virtue rewarded rolls easily off the tongue. It makes you feel good (I got what I deserved) and it reassures others (the system really works).

“But you can also tell a different story, which is more about luck than pluck, and whose driving forces are less your own skill and motivation, and more the happy circumstances you emerged from and the accommodating structure you traversed. As an example, here I’ll tell my own story about my career negotiating the hierarchy in the highly stratified system of higher education in the United States. I ended up in a cushy job as a professor at Stanford University.

“Is there a moral to be drawn from these two stories of life in the meritocracy? The most obvious one is that this life is not fair. The fix is in. Children of parents who have already succeeded in the meritocracy have a big advantage over other children whose parents have not. They know how the game is played, and they have the cultural capital, the connections and the money to increase their children’s chances for success in this game.

“In fact, the only thing that’s less fair than the meritocracy is the system it displaced, in which people’s futures were determined strictly by the lottery of birth. Lords begat lords, and peasants begat peasants. In contrast, the meritocracy is sufficiently open that some children of the lower classes can prove themselves in school and win a place higher up the scale.

“The probability of doing so is markedly lower than the chances of success enjoyed by the offspring of the credentialed elite, but the possibility of upward mobility is nonetheless real. And this possibility is part of what motivates privileged parents to work so frantically to pull every string and milk every opportunity for their children.”

Pause for a moment and wonder, as I did, why would the new meritocrats write books and articles like these? Is it a case of Thriver (Survivor) Guilt? Maybe, but I think it’s because they’re dismayed that their success signals the end of the American equal opportunity ideology. You don’t trample on something sacred. They didn’t mean to. They’re sorry. But now that they have, maybe it wasn’t so sacred after all.

The new socio-economic class system was never supposed to happen in America. We weren’t supposed to be like the Old World our founders left behind. But now we are, although most of us don’t seem to know it, and only a few brave souls will admit it. Meanwhile the Horatio Alger mansions are all sold out, and the gate to the community is locked and guarded. That kind of thing just doesn’t happen in America.

Until it did.

[1] The Declaration of Independence.

[2] Low Unemployment Isn’t Worth Much if the Jobs Barely Pay, The Brookings Institution, Jan. 8, 2020.

[3] The Shock Doctrine: The Rise of Disaster Capitalism, Naomi Klein (2017).

[4] The best source I’ve found for the American history we never learned is Americana: A 400-Year History of American Capitalism, Bhu Srinivasan (2017).

[5] Matthew Stewart is the author of numerous books and a recent article for The Atlantic called The 9.9 Percent is the New American Meritocracy. Steven Brill is the founder of The American Lawyer and Court TV, and is the author of the book Tailspin: The People and Forces Behind America’s Fifty-Year Fall–and Those Fighting to Reverse It and also the writer of a Time Magazine feature called How Baby Boomers Broke America. The quoted text is from Stewart’s Atlantic article.

[6] Pluck Versus Luck, Aeon Magazine (Dec. 4, 2019) – “Meritocracy emphasises the power of the individual to overcome obstacles, but the real story is quite a different one.”

Economic Fundamentalism

We saw last time that the goal of Chicago School free market economics was to promote “noncontaminated capitalism,” which in turn would generate societal economic utopia:

“The market, left to its own devices, would create just the right number of products at precisely the right prices, produced by workers at just the right wages to buy those products — an Eden of plentiful employment, boundless creativity and zero inflation.”

The Shock Doctrine:  The Rise of Disaster Capitalism, Naomi Klein (2017)

To the School’s free market advocates, these ideas were pure science:

“The starting premise is that the free market is a perfect scientific system, one in which individuals, acting on their own self-interested desires, create the maximum benefits for all. It follows ineluctably that if something is wrong with a free-market economy — high inflation or soaring unemployment — it has to be because the market is not truly free.”

The Shock Doctrine

Scientific method requires that theories be falsifiable:  you have to be able to objectively prove them wrong.

“The philosopher Karl Popper argued that what distinguishes a scientific theory from pseudoscience and pure metaphysics is the possibility that it might be falsified on exposure to empirical data. In other words, a theory is scientific if it has the potential to be proved wrong.”

But Is It Science? Aeon Magazine, Oct. 7, 2019.

But how do you prove an economic theory based on “uncontaminated capitalism” in an economically contaminated world?

“The challenge for Friedman and his colleagues was not to prove that a real-world market could live up to their rapturous imaginings…. Friedman could not point to any living economy that proved that if all ‘distortions’ were stripped away, what would be left would be a society in perfect health and bounteous, since no country in the world met the criteria for perfect laissez-faire. Unable to test their theories in central banks and ministries of trade, Friedman and his colleagues had to settle for elaborate and ingenious mathematical equations and computer models.”

The Shock Doctrine

Mathematical equations and computer models aren’t the same as empirical data collected in the real (“contaminated”) world. If falsifiability is what separates scientific knowledge from belief-based ideology, then Friedman’s free market theory is the latter. Some scientists are worried that this spin on scientific theorizing has become too prevalent nowadays:

 “In our post-truth age of casual lies, fake news and alternative facts, society is under extraordinary pressure from those pushing potentially dangerous antiscientific propaganda – ranging from climate-change denial to the anti-vaxxer movement to homeopathic medicines. I, for one, prefer a science that is rational and based on evidence, a science that is concerned with theories and empirical facts, a science that promotes the search for truth, no matter how transient or contingent. I prefer a science that does not readily admit theories so vague and slippery that empirical tests are either impossible or they mean absolutely nothing at all…. For me at least, there has to be a difference between science and pseudoscience; between science and pure metaphysics, or just plain ordinary bullshit.”

But Is It Science?

The Chicago School believed so ardently in the free market theory that its instructional approach took on the dynamics of belief-based indoctrination:

“Frank Knight, one of the founders of Chicago School economics, thought professors should ‘inculcate’ in their students the belief that economic theory is ‘a sacred feature of the system,’ not a debatable hypothesis.”

The Shock Doctrine

This dynamic applies to every ideology that can’t be falsified — verified empirically. The ideology then becomes a fundamentalist belief system:

“Like all fundamentalist faiths, Chicago School economics is, for its true believers, a closed loop. The Chicago solution is always the same: a stricter and more complete application of the fundamentals.”

The Shock Doctrine

Journalist Chris Hedges describes the dynamics of “secular fundamentalism” in I Don’t Believe in Atheists. (The book’s title is too clever for its own good — a later version adds the subtitle “The Dangerous Rise of the Secular Fundamentalist.”)

“Fundamentalism is a mind-set. The iconography and language it employs can be either religious or secular or both, but because it dismisses all alternative viewpoints as inferior and unworthy of consideration it is anti-thought. This is part of its attraction. It fills a human desire for self-importance, for hope and the dream of finally attaining paradise. It creates a binary world of absolutes, of good and evil. It provides a comforting emotional certitude. It is used to elevate our cultural, social, and economic systems above others…. The core belief systems of these secular and religious antagonists are identical.”

Thus we have Nobel prize-winning economist Milton Friedman famously saying, “Underlying most arguments against the free market is a lack of belief in freedom itself” — a statement entirely in keeping with the Mont Pelerin Society’s idealistic Statement of Aims, which we looked at last time.

And thus we also have Nobel prize-winning economist Joseph Stiglitz countering with his thoughts about economics in a contaminated (“pathological”) world:

“The advocates of free markets in all their versions say that crises are rare events, though they have been happening with increasing frequency as we change the rules to reflect beliefs in perfect markets. I would argue that economists, like doctors, have much to learn from pathology. We see more clearly in these unusual events how the economy really functions. In the aftermath of the Great Depression, a peculiar doctrine came to be accepted, the so-called ‘neoclassical synthesis.’ It argued that once markets were restored to full employment, neoclassical principles would apply. The economy would be efficient. We should be clear: this was not a theorem but a religious belief.”

As we also saw last time, historical socialism and communism join free market capitalism in their fundamentalist zeal. In fact, some think that economics in general has become today’s dominant cultural form of belief-based thinking. More on that next time.