Reparations [4]:  The Essential Doubt

And so you see I have come to doubt
All that I once held as true

I stand alone without beliefs
The only truth I know is you.

Kathy’s Song[1]
Paul Simon

We saw last time that the U.S. government could waive its legal defense of sovereign immunity to pave the way for slavery reparations. It would take more than a legal reckoning for that to happen. Law lies on the surface of society, readily visible, but it has deep roots in history and ideology, national identity and mission, values and beliefs, ways of looking at the world and how life works.[2] These ancient root systems invoke fierce allegiances deeply embedded in the human psyche and culture. Because the legal doctrine of sovereign immunity is grounded in Biblical doctrine,[3] laying it aside requires doubt and dissent of the highest order – national treason and religious apostasy in a single act.

Doubt of that magnitude is rare beyond description but not without precedent. Consider, for example, Germany’s reparations for World War II, which required not only the international banishment of Nazism, but also the German people’s moral renunciation of Nazism’s philosophical and political roots stretching back to the 19th Century.[4] In comparison, the USA’s roots of slavery (and hence racism) extend back to the earliest New World settlements, which imported English common law, including the divine right of kings and its nationalistic version, sovereign immunity. Renouncing the latter to pave the way for slavery reparations would require a similar American moral renunciation of centuries of related social, economic, and political ideology, and would set new terms for a post-racism American state.

That, in turn, would require a reckoning with the “first cause” roots of the divine right of kings and sovereign immunity.

The First Cause Roots of Sovereign Immunity

A “first cause” satisfies the human desire for life to make sense by assigning a cause to every effect. Trouble is, as you trace the cause and effect chain to its remotest origins, you eventually run out of causes, leaving you with only effects. That’s when a first cause comes to the rescue. A first cause has no prior cause – it is so primary that nothing came before it but everything came after it. Since knowledge can’t reach that far back, a first cause is a matter of belief:  you take it on faith, declare the beginning into existence, and go from there.
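The regress described here has a handy analogue in programming: a chain of “what caused this?” lookups must bottom out in a base case that is declared rather than derived. A minimal sketch in Python (the chain entries are hypothetical, invented purely for this illustration):

    # Illustrative analogy only: a causal chain modeled as a lookup table.
    # Tracing effects back through their causes must terminate somewhere;
    # the "first cause" is the base case we declare rather than derive.
    causes = {
        "law": "ideology",           # hypothetical chain for illustration
        "ideology": "worldview",
        "worldview": "first cause",  # declared terminus -- nothing causes it
    }

    def trace(effect: str) -> list[str]:
        """Follow cause-of links until the declared first cause is reached."""
        chain = [effect]
        while chain[-1] in causes:
            chain.append(causes[chain[-1]])
        return chain

    print(" <- ".join(trace("law")))  # law <- ideology <- worldview <- first cause

The loop only halts because one entry is exempted from needing a cause of its own – exactly the move belief makes when it declares a first cause.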

Western civilization’s worldview historically identified God as the ultimate first cause.

“The classic Christian formulation of this argument came from the medieval theologian St. Thomas Aquinas, who was influenced by the thought of the ancient Greek philosopher Aristotle. Aquinas argued that the observable order of causation is not self-explanatory. It can only be accounted for by the existence of a first cause; this first cause, however, must not be considered simply as the first in a series of continuing causes, but rather as first cause in the sense of being the cause for the whole series of observable causes.

“The 18th-century German philosopher Immanuel Kant rejected the argument from causality because, according to one of his central theses, causality cannot legitimately be applied beyond the realm of possible experience to a transcendent cause.

“Protestantism generally has rejected the validity of the first-cause argument; nevertheless, for most Christians it remains an article of faith that God is the first cause of all that exists. The person who conceives of God in this way is apt to look upon the observable world as contingent—i.e., as something that could not exist by itself.”[5]

God is the ultimate Sovereign from which all lesser sovereigns – the king, the national government – derive their existence and legitimacy. God’s first cause Sovereignty justifies God’s right to rule as God sees fit. The king and the state, having been set into place by God, derive a comparable right of domination from God. The king and the national government are to the people what God is to the king and the government.

The Divine Right of Kings

When kings ruled countries, their divine line of authority took legal form as the Divine Right of Kings.

“The divine right of kings, divine right, or God’s mandate is a political and religious doctrine of royal and political legitimacy. It stems from a specific metaphysical framework in which the king (or queen) is pre-selected as an heir prior to their birth. By pre-selecting the king’s physical manifestation, the governed populace actively (rather than merely passively) hands the metaphysical selection of the king’s soul – which will inhabit the body and thereby rule them – over to God. In this way, the ‘divine right’ originates as a metaphysical act of humility or submission towards the Godhead.

“Consequentially, it asserts that a monarch (e.g. a king) is subject to no earthly authority, deriving the right to rule directly from divine authority, like the monotheist will of God. The monarch is thus not subject to the will of his people, of the aristocracy, or of any other estate of the realm. It implies that only divine authority can judge an unjust monarch and that any attempt to depose, dethrone or restrict their powers runs contrary to God’s will and may constitute a sacrilegious act.”[6]

The Divine Right of Kings was a favorite doctrine of the first King James of England, who commissioned what would become the King James Version of the Bible partly in response to Puritan challenges to the Church of England’s doctrine of an ordained clergy that could trace its lineage to the original Apostles.

“Divine right of kings, in European history, a political doctrine in defense of monarchical ‘absolutism,’ which asserted that kings derived their authority from God and could not therefore be held accountable for their actions by any earthly authority such as a parliament. Originating in Europe, the divine-right theory can be traced to the medieval conception of God’s award of temporal power to the political ruler, paralleling the award of spiritual power to the church. By the 16th and 17th centuries, however, the new national monarchs were asserting their authority in matters of both church and state. King James I of England (reigned 1603–25) was the foremost exponent of the divine right of kings….”[7]

“While throughout much of world history, deified potentates have been the rule, in England, absolute monarchy never got a solid foothold, but there certainly was the attempt. Elements of British political theory and practice encouraged absolutism—the idea and practice that the king is the absolute law and that there is no appeal beyond him. Several movements and ideas hurried along the idea of absolute monarchy in England. One of those ideas was the divine right of kings.

“In England, the idea of the divine right of kings will enter England with James VI of Scotland who will come and rule over both England and Scotland as James I in 1603 and will commence the line of several ‘Stuart’ monarchs. James had definite ideas about his role as monarch, and those ideas included the divine right of kings. Here are just a few of James’ statements that reflect his view that he ruled by divine right:

      • Kings are like gods— “…kings are not only God’s lieutenants upon earth, and sit upon God’s throne, but even by God himself are called gods.”
      • Kings are not to be disputed— “… That as to dispute what God may do is blasphemy….so is it sedition in subjects to dispute what a king may do in the height of his power.”
      • Governing is the business of the king, not the business of the subjects— “you do not meddle with the main points of government; that is my craft . . . to meddle with that were to lessen me . . . I must not be taught my office.”
      • Kings govern by ancient rights that are his to claim— “I would not have you meddle with such ancient rights of mine as I have received from my predecessors . . . .”
      • Kings should not be bothered with requests to change settled law— “…I pray you beware to exhibit for grievance anything that is established by a settled law…”
      • Don’t make a request of a king if you are confident he will say “no.”— “… for it is an undutiful part in subjects to press their king, wherein they know beforehand he will refuse them.”

“James’ views sound egotistical to us today, but he was not the only one that held them. These views were held by others, even some philosophers. For example, the English philosopher Thomas Hobbes wrote a work called Leviathan in 1651 in which he said that men must surrender their rights to a sovereign in exchange for protection. While Hobbes was not promoting the divine right of kings per se, he was providing a philosophy to justify a very strong absolute ruler, the kind that the divine right of kings prescribes. Sir Robert Filmer was a facilitator of the divine right of kings and wrote a book about it called Patriarcha (1660) in which he said that the state is like a family and that the king is a father to his people. Filmer also says that the first king was Adam and that Adam’s sons rule the nations of the world today. So, the King of England would be considered the eldest son of Adam in England or the King of France would be Adam’s eldest son in France.”[8]

King James, Witch Hunter

King James had no impartial academic interest in a Bible translation that supported his divine right:  during his reign, the “Cradle King” accumulated a long list of covered offenses that included mass murder, torture, injustice, treachery, cruelty, and misogyny.

“The witch-hunts that swept across Europe from 1450 to 1750 were among the most controversial and terrifying phenomena in history – holocausts of their times. Historians have long attempted to explain why and how they took such rapid and enduring hold in communities as disparate and distant from one another as Navarre and Copenhagen. They resulted in the trial of around 100,000 people (most of them women), a little under half of whom were put to death.

“One of the most active centres of witch-hunting was Scotland, where perhaps 4,000 people were consigned to the flames – a striking number for such a small country, and more than double the execution rate in England. The ferocity of these persecutions can be attributed to the most notorious royal witch-hunter: King James VI of Scotland, who in 1603 became James I of England.

“Most of the suspects soon confessed – under torture – to concocting a host of bizarre and gruesome spells and rituals in order to whip up the storm.… James was so appalled when he heard such tales that he decided to personally superintend the interrogations… while the king looked on with ‘great delight’.

“James’s beliefs had a dangerously misogynistic core. He grew up to scorn – even revile – women. Though he was by no means alone in his view of the natural weakness and inferiority of women, his aversion towards them was unusually intense. He took every opportunity to propound the view that they were far more likely than men to succumb to witchcraft…. He would later commission a new version of the Bible in which all references to witches were rewritten in the female gender.

“Most witchcraft trials constituted grave miscarriages of justice…. If the actual facts of a case were unsatisfactory, or did not teach a clear enough moral lesson, then they were enhanced, added to or simply changed.”[9]

When the new King James Bible substantiated the King’s divine right to carry on these activities, and when the USA imported the king’s divine right into its legal system as sovereign immunity, both acknowledged God as the first cause of these legal doctrines. Like the King, the U.S. government also has a long list of covered offenses:  the treatment of slaves during the reign of legal slavery mirrors King James’ obsession with brutalizing, lynching, and murdering witches.

In the U.S., where a 2019 Gallup Poll found that 64% – 87% of Americans believe in God (depending on how the question was asked), there remain many “Christians [for whom] it remains an article of faith that God is the first cause of all that exists.”[10] As a result, we see in the USA’s current social and political climate both explicit and implicit affirmation of the following Bible passages (which the online source appropriately expresses in the King James Version) to substantiate the ability of national leaders to avoid accountability for acts of governance that sponsor this kind of horrifying treatment of citizens:[11]

“Let every soul be subject unto the higher powers. For there is no power but of God: the powers that be are ordained of God. Whosoever therefore resisteth the power, resisteth the ordinance of God: and they that resist shall receive to themselves damnation. For rulers are not a terror to good works, but to the evil. Wilt thou then not be afraid of the power? do that which is good, and thou shalt have praise of the same: For he is the minister of God to thee for good. But if thou do that which is evil, be afraid; for he beareth not the sword in vain: for he is the minister of God, a revenger to execute wrath upon him that doeth evil. Wherefore ye must needs be subject, not only for wrath, but also for conscience sake.” Romans 13:1-5, KJV

“Lift not up your horn on high: speak not with a stiff neck. For promotion cometh neither from the east, nor from the west, nor from the south. But God is the judge: he putteth down one, and setteth up another.” Psalms 75:5-7, KJV

“Daniel answered and said, Blessed be the name of God for ever and ever: for wisdom and might are his: And he changeth the times and the seasons: he removeth kings, and setteth up kings: he giveth wisdom unto the wise, and knowledge to them that know understanding:” Daniel 2:20-21, KJV

“This matter is by the decree of the watchers, and the demand by the word of the holy ones: to the intent that the living may know that the most High ruleth in the kingdom of men, and giveth it to whomsoever he will, and setteth up over it the basest of men.” Daniel 4:17, KJV

“I have made the earth, the man and the beast that are upon the ground, by my great power and by my outstretched arm, and have given it unto whom it seemed meet unto me.” Jeremiah 27:5, KJV

“The king’s heart is in the hand of the LORD, as the rivers of water: he turneth it whithersoever he will.” Proverbs 21:1, KJV

“For rebellion is as the sin of witchcraft, and stubbornness is as iniquity and idolatry. Because thou hast rejected the word of the LORD, he hath also rejected thee from being king. And Saul said unto Samuel, I have sinned: for I have transgressed the commandment of the LORD, and thy words: because I feared the people, and obeyed their voice. Now therefore, I pray thee, pardon my sin, and turn again with me, that I may worship the LORD. And Samuel said unto Saul, I will not return with thee: for thou hast rejected the word of the LORD, and the LORD hath rejected thee from being king over Israel.” 1 Samuel 15:23-26, KJV

“And upon a set day Herod, arrayed in royal apparel, sat upon his throne, and made an oration unto them. And the people gave a shout, saying, It is the voice of a god, and not of a man. And immediately the angel of the Lord smote him, because he gave not God the glory: and he was eaten of worms, and gave up the ghost.” Acts 12:21-23, KJV

The Ultimate Focus of Doubt:  God

In “Abrahamic” cultures – Jewish, Muslim, and Christian – the Biblical God is the first cause of the divine right of kings and sovereign immunity. The full force of patriotic nationalism and religious zeal therefore originates with God – which explains why a surprising number of European nations had blasphemy laws on the books until not that long ago, and why some nations still do.[12]

“Blasphemy is the act of insulting or showing contempt or lack of reverence to a deity, or sacred objects, or toward something considered sacred or inviolable.”[13]

God, it seems, like kings and sovereign nations, has much to be excused from. Aside from the Biblical God’s sponsorship of war, genocide, mass murder, rape, torture, and brutality to humans and animals, a list of modern labels would include misogynist, homophobe, and xenophobe. But of course you don’t think that way if you’re a believer, because that would be blasphemy, often punishable by death, often after the infliction of the kind of cruel and unusual punishment reserved for the faithful and unfaithful alike. As for the former, the Bible makes it a badge of honor for the faithful to suffer in the name of God:

“Some were tortured, refusing to accept release, so that they might rise again to a better life. Others suffered mocking and flogging, and even chains and imprisonment. They were stoned, they were sawn in two, they were killed with the sword. They went about in skins of sheep and goats, destitute, afflicted, mistreated—of whom the world was not worthy—wandering about in deserts and mountains, and in dens and caves of the earth. And all these, though commended through their faith, did not receive what was promised,” Hebrews 11:35-39, ESV

Transformation Made Possible by Doubt

Nonbelievers, not vexed by these rights of the sovereign and duties of the governed, are free to doubt God’s first cause status and its derivative doctrines, laws, and policies. In the USA, doubt embraced on that level would open the door to any number of contrary beliefs – for example:

    • The state does not enjoy superior status — historically, legally, morally, or otherwise – that gives it a right to act without consequence.
    • The people governed are therefore not bound – theologically, morally, or otherwise – to submit to government that is not responsible for its actions.

Once you’re no longer worried about breaking faith with God as the first cause of your national institutional structure, a whole new “social contract” (also discussed last time) between government and the people becomes possible – a contract that would, in effect, not be satisfied with paying descendants of slaves only “damages” for past harm, but would look to establish a fresh national vision of the duties of those who govern and the rights and freedoms of the governed. The result, it would seem, is the possibility of ending the USA’s institutionalized racism for good.

[1] Who was Paul Simon’s Kathy? And whatever happened to her? See this article from The Guardian.

[2] See the Belief Systems and Culture category of posts in my Iconoclast.blog.

[3] The Founding Myth: Why Christian Nationalism Is Un-American, Andrew L. Seidel (2019). Although the USA was not founded as a Christian nation, its core values and beliefs, like those of other Western countries, are Classical and Biblical in origin.

[4]  See Alpha History and The Mises Institute on the historical origins of Nazism.

[5] Encyclopedia Britannica. See also New World Encyclopedia and the Stanford Encyclopedia of Philosophy.

[6] Wikipedia – The Divine Right of Kings.

[7] Encyclopedia Britannica and Wikipedia. See also the New World Encyclopedia.

[8] Owlcation

[9] Borman, Tracy, James VI And I: The King Who Hunted Witches, History Extra (BBC Historical Magazine) (March 27, 2019).

[10] Encyclopedia Britannica. See also New World Encyclopedia and the Stanford Encyclopedia of Philosophy.

[11] “Bill’s Bible Basics.”

[12]  Wikipedia – Blasphemy law.

[13]  Wikipedia – Blasphemy.

America’s National Character, Revealed in its COVID-19 Response

“The entire man is… to be seen in the cradle of the child. The growth of nations presents something analogous to this; they all bear some marks of their origin. If we were able to go back… we should discover… the primal cause of the prejudices, the habits, the ruling passions, and, in short, all that constitutes what is called the national character.”

Alexis de Tocqueville, Democracy in America (1835)

“Begin as you would continue,” my new mother-in-law told my bride and me. Her advice was good beyond gold – a standard we return to in every new beginning, of which there’ve been many in 40+ years.

Alexis de Tocqueville didn’t offer the principle as advice; he recognized its operation in the America he famously toured and wrote about – a nation shaping itself around its founding principles – its “primal cause.” A country’s “national character,” he said, is revealed in the “prejudices,” “habits,” and “ruling passions” of the government and the people. The specifics may shift over time as certain founding values prevail over others due to political tradeoffs and changing circumstances, but in the long haul the country stays true to its origins. Countries, like marriages, continue as they began.

The same dynamics that apply to individuals and nations also apply to institutions, for example societal institutions of law, economics, academics, and commercial enterprise. And for all of them, there’s no such thing as a single beginning to be sustained forever. Personal, national, and institutional histories are shaped around many beginnings and endings. With every new beginning comes an invitation to return to “primal causes” and accept the transformation of historical into contemporary; i.e., each path forward requires a fresh look at how the past’s wisdom can help navigate today’s unprecedented challenges. Trouble is, transformation is perhaps the most difficult thing asked of a person, relationship, institution, nation. The opportunity to transform is therefore rarely recognized, much less embraced, but without it there will be hardening into what was but no longer is, and soon the person or entity under stress will fray under the strain of forcing the fluidity of today into the memory of yesterday.

The Covid-19 Policy-Making Triumvirate

Covid-19 has brought the entire world to an inescapable threshold of new beginning, with its commensurate invitation to transformation. America’s response reveals no embrace of the invitation, but rather a doubling down on the pre-pandemic version of a currently predominant ideological triumvirate of values.[1] Other “prejudices,” “habits,” and “ruling passions” of the “national character” are clearly evident in the nation’s response as well, but I chose to write about this triumvirate because I’ve previously done so here and in my other blog.[2] The three prongs of the triumvirate we’ll look at today are as follows:

  1. Freemarketism: a hyper-competitive and hyper-privatized version of capitalism that enthrones individual and corporate agency over the centralized promotion of the public good.

Freemarketism is grounded in a belief that marketplace competition will not only prosper capitalists but also promote individual and communal welfare in all social and economic strata. Its essential prejudices and practices are rooted in the transmutation of the western, mostly Biblical worldview into the Protestant work ethic, which judges individual good character and communal virtue by individual initiative and success in “working for a living” and the ability to climb the upward mobility ladder. The state’s highest good is to sponsor a competitive market in which capitalists, freed from governmental regulation and taxation, will build vibrant businesses, generate wealth for themselves as a reward, and activate corollary “trickle down” benefits to all. Granting the public good an independent seat at the policy-making table is considered detrimental to the market’s freedom.

Freemarketism skews Covid-19 relief toward business and charges the state with a duty to restore “business as usual” as quickly as possible. Direct benefit to citizens is considered only grudgingly, since it would encourage bad character and bad behavior among the masses. Particularly, it would destroy their incentive and willingness to work for a living. The employable populace must be kept hungry, on-edge, primed to get back to work in service to the capitalist engine that fuels the greater good of all.

  2. Beliefism: The denigration of science and intellect in favor of a form of secular post-truth fundamentalism.

Freemarketism is a belief system that emerged in the 1980’s, after the first three decades of post-WWII economic recovery played out in the 1970’s. Freemarketism addressed the economic malaise with its utopian promise of universal benefit, and its founders promoted it with religious zeal as a new economic science – the rationale being that it had been “proven” in ingenious, complex mathematical models. But math is not science, and however elegant its proofs of Freemarketism might have been, they were not the same as empirical testing. Freemarketism was therefore a new economic belief system — something you either believed or didn’t.

To gain widespread political and social acceptance, Freemarketism would need to displace the Keynesian economics that had pulled the U.S. out of the Great Depression of the 1930’s by massive federal investment in infrastructure, the creation of new social safety nets, and the regulation of securities markets. During the post-WWII recovery, neoliberal economic policy had struck its own balance between private enterprise and government intervention, creating both new commercial monoliths and a vibrant middle class. Freemarketism would eventually swing this balance entirely to the side of private enterprise. It did so thanks in part to auspicious good timing. At the dawn of the 1980’s, after a decade of Watergate, the oil embargo and energy crisis, runaway inflation, and the Iran hostage crisis, America was ripe for something to believe in. Its morale was suddenly boosted by the USA’s stunning Olympic hockey gold medal. Then, at the end of the decade, came the equally stunning collapse of the Soviet Union, brought on by Chernobyl and the fall of the Berlin Wall. These two bookend events ensured that Freemarketism had made a beginning that politicians and the populace wished to continue.

By then, Soviet-style Communism had been fully exposed as a horrific, dystopian, failed system. It had begun with Karl Marx’s angry empathy for the plight of the working stiff, but a century and a half later had morphed into a tyranny of fear, mind control, and brutality that turned its nominal beneficiaries into its victims, administered by a privileged, unthinking, corrupt, emotionally and morally paralyzed class of party bosses. When the failed system met its just deserts, the West’s storyline trumpeted that capitalism had won the Cold War. Freemarketism stepped up to receive the accolades, and its political devotees set about dismantling the social structures Keynesian economics had built before WWII.

From that point, as Freemarketism gained acceptance, it stomped the throttle toward fundamentalism, which is where every belief system, whether religious or secular, must inevitably end up. Belief by its very nature demands its own purification – the rooting out of doubt. To endure, belief must become irrefutable, must become certain to the point where doubt and discourse are demonized, conformity becomes the greatest social good, and ideological myths become determinants of patriotic duty and moral status. Accordingly, as Freemarketism evangelists increasingly installed their privatized solutions, any system of government based on state-sponsored promotion of the common good was quickly characterized as a threat of a resurgence of Communism. In the minds of Freemarketers – both priests and proles – the European social democracies were thrown into the same toxic waste dump as Communism, because the state could never again be trusted to know what is good for its citizens, or be given the power to carry out its agenda.

Freemarketism’s blind spot is now obvious: for all its demonization of government policy, it needed precisely such policy to create the conditions it required to operate. Politicians from the 1990’s forward were happy to comply. Thus empowered, in the four decades since its inception, Freemarketism has ironically failed in the same manner as Soviet Communism, gutting the public good of the working masses and protectively sequestering the wealthy capitalist classes. Along the way, Beliefism as the cultural norm has displaced scientific rationalism with moment-by-moment inanity, expressed in the Covid-19 crisis by everything from drinking bleach to mask and supply shortages, lockdown protests and defiance of mask-wearing, the termination of support for the World Health Organization, confusion and skepticism about statistics of infection rates and the value of mass testing, the public undercutting of medical authorities, and much more.

The post-truth flourishing of Beliefism is in turn held in place by the third prong of the triumvirate:

  3. Militarism: The American infatuation with military might and private armaments, and a proclivity towards resolving disputes and achieving policy outcomes through bullying, violence, and warfare.

Militarism is the enforcer for the other two prongs of the triumvirate. Its status as a pillar of the national character is on the one hand entirely understandable, given that the USA was formed because the colonists won their war, but on the other hand perhaps the most ideologically inexplicable when measured against the Founders’ rejection of a standing military in favor of a right to mobilize an armed militia as needed. The displacement of the latter by the former was fully complete only after WWII, grudgingly acknowledged by the General who masterminded the D-Day invasion: “In the councils of government,” President Eisenhower said on the eve of leaving office, “we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex.” He further warned that, “Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty may prosper together.”

The extent to which General Eisenhower’s warnings fell on deaf ears is by now obvious. Meanwhile, the Founders’ concept of the right to bear arms has metastasized into an absolute right to private armaments. The American national character now rests secure in its confidence that it has a big enough stick to forever defend its libertarian version of individual freedoms – including the freedoms of the marketplace – against all opposing beliefs, Communist or otherwise.

Militarism is evident in developments both expressly directed at the pandemic and coinciding with it, spanning macro and micro responses: saber-rattling against Iran (against whom we apparently still feel we have a score to settle), blame-shifting against China accompanied by rhetoric that has quickly escalated to the level of a new Cold War, Congress’s self-congratulatory passage of another record-setting defense budget, and armed militias rallying against the lockdown and supporting protestors in their belligerent non-compliance.

In its Covid-19 response, America put its money where its mouth (ideology) is.

This ideological triumvirate is evident in the priorities of the USA’s legislative allocation of government spending during the lockdown, as indicated in the following two graphs, which reveal that:

  1. The amount directed to business – mostly big business – was twice again as much as the defense budget;
  2. The amount directed to healthcare – during a pandemic – was least of all – half the amount directed to individuals;
  3. The 2020 defense budget approved during the lockdown was twice the size of the amount directed to individual citizens under the CARES relief act; and
  4. Meanwhile, defense spending dwarfs that of our seven nearest national “competitors.”
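Taking the four claims above at face value, they reduce to simple ratios. Here is a rough sanity-check sketch in Python, using placeholder figures chosen only to match the stated ratios (they are illustrative assumptions, not values read from the cited charts):

    # Placeholder figures in $ billions -- illustrative assumptions only,
    # NOT actual values from the Visual Capitalist or Peterson Foundation charts.
    defense_budget    = 700    # assumed FY2020 defense budget
    cares_business    = 1400   # assumed CARES allocation to business
    cares_individuals = 350    # assumed CARES direct relief to individuals
    cares_healthcare  = 175    # assumed CARES allocation to healthcare

    # The claimed relationships, expressed as ratios:
    print(f"business / defense:       {cares_business / defense_budget:.1f}x")      # claim 1: ~2x
    print(f"healthcare / individuals: {cares_healthcare / cares_individuals:.1f}x") # claim 2: ~0.5x
    print(f"defense / individuals:    {defense_budget / cares_individuals:.1f}x")   # claim 3: ~2x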

The Anatomy of the $2 Trillion COVID-19 Stimulus Bill[3]

U.S. Defense Spending Compared to Other Countries[4]

Character Over Time

“True character is revealed in the choices a human being makes under pressure,” screenwriting guru Robert McKee wrote, “the greater the pressure, the deeper the revelation, the truer the choice to the character’s essential nature.”[5]

Pressure of the magnitude brought on by the pandemic catches national response off guard. It freezes time, demanding instant responses to unprecedented challenges. Pretense falls off; values and priorities leap from foundational to forefront. There is no time for analysis or spin, only the unguarded release of words and actions in the pressing moment. The result is national character, fully revealed.

The way out of this dizzying spiral is to embrace the invitation to character transformation, which begins in the awareness that something essential to maintaining the status quo has been lost, life has irreversibly changed, an ending has been reached. Every ending requires a new beginning, every new beginning requires a vision for how to continue, and every vision for continuing requires the perspective of newly-transformed character. If there is going to be systemic change, character must be the one to make concessions. The nation’s policy-makers made no such concession in their Covid-19 response.

Response Without Transformation

We’ve spent a few years in this forum discovering the triumvirate’s development and contemporary dominance of government policy-making, which in turn has been supported by enough of the electorate to keep the system in place. Now, the pandemic has put our “more perfect union” under extraordinary stress.

Given the recent racial issues now dominating the headlines, it isn’t far-fetched to compare the pandemic’s moral and legal challenges to those of the Civil War. Today’s post won’t try to do that topic justice, but it’s interesting to note that slavery was a dominant economic force from before America became the United States, especially buttressing capitalist/entrepreneurial wealth generated in tobacco and cotton, and was both expressly and implicitly adopted as a social, economic, and national norm — for example in the U.S. Constitution’s denying slaves the right to vote and providing that each slave would count as 3/5 of a person for purposes of determining seats in the House of Representatives. These “primal causes” remained intact for the nation’s first several decades, until a variety of pressures forced a reconsideration and transformation. Those pressures included, for example, a bubble in the pre-Civil War slave market that made slaves themselves into a valuable equity holding to be bought and sold for profit — a practice particularly outrageous to Northerners.[6]

The Covid-19 triumvirate is not Constitutionally recognized as slavery was, but clearly it is based on the current emphasis of certain aspects of the USA’s foundations to the exclusion of others. Many economists argue, for example, that the way out of the deepening pandemic economic depression is a return to a Keynesian-style massive governmental investment in public works and welfare – a strategy that even then was hugely controversial for the way it aggressively rebalanced the national character. The Covid-19 response, along with the military budget, makes no attempt at such a rebalancing – which, among other things, would require policy-makers to retreat from the common assumption that government support of the public good is Communism.

It took a Civil War and three Constitutional Amendments to remove nationalized slavery from the Constitution and begin the transformation of the nation’s character on the topic of race – a transformation which current events reveal is still sadly incomplete.

What would it take to similarly realign the national character in response to the pandemic?

[1] Since we’ve been discovering and examining these for several years in this forum, in this post I’m going to depart from my usual practice of quoting and citing sources. To do otherwise would have made this post far too redundant and far too long.

[2] My two blogs are The New Economy and the Future of Work and Iconoclast.blog. Each has its counterpart on Medium – The Econoclast and Iconoclast.blog (recent articles only).

[3] Visualcapitalist.com

[4] Peter G. Peterson Foundation.

[5] McKee, Robert, Story: Substance, Structure, Style, and the Principles of Screenwriting (1997).

[6] See the analysis in Americana: A 400-Year History of American Capitalism, Bhu Srinivasan (2017), and the author’s interview with the Wharton business school.

Reckoning With Competitive Capitalism

“There exists an obvious fact that seems utterly moral:
namely, that a man is always prey to his truths”

Albert Camus, The Myth of Sisyphus and Other Essays (1955)

I wrote a post about 2½ years ago (Aug. 31, 2017) with the same title as this one. It referred to University of Connecticut law professor James Kwak’s book Economism, which warns against “the pernicious influence of economism in contemporary society.” Prof. Kwak defines “economism” as “a distorted worldview based on a misleading caricature of economic knowledge,” and makes the case that free market ideology is guilty of it:

“The competitive market model can be a powerful tool, but it is only a starting point in illuminating complex real-world issues, not the final word. In the real world, many other factors complicate the picture, sometimes beyond recognition.”

As we’ve seen, free market economic theory is based on the assumption of a “pure” capitalist state. Prof. Kwak calls for a new approach that meets the complex challenges of real life:

“Real change will not be achieved by mastering the details of marginal costs and marginal benefits, but by constructing a new, controlling narrative about how the world works.”

“Reckoning” means “a narrative account” and “a settling of accounts,” as in “Day of reckoning.”[1] A reckoning on economic policy therefore begins with an examination of whether the prevailing ideology actually delivers what it theoretically promises. Honest reckoning is hard, because the neural circuits of our brains are predisposed to maintain the status quo and resist change to both individual and cultural belief systems. The difficulty is amplified when fundamentalist ideology is at play, because reckoning threatens historical cultural mythology, which is tantamount to sacrilege.

 “History is powerful. George Santayana’s warning that ‘those who cannot remember the past are condemned to repeat it’ rings true because the past influences the present.

“Unfortunately, history’s power does not depend on its accuracy:  A widely believed historical lie can have as much impact as a historical truth.

“President John F. Kennedy explained to Yale’s graduating class of 1962 that ‘the great enemy of the truth is very often not the lie — deliberate, contrived, and dishonest —  but the myth — persistent, persuasive, and unrealistic. Too often we hold fast to the clichés of our forebears…. We enjoy the comfort of opinion without the discomfort of thought.’”

The Founding Myth, by Andrew L. Seidel (2019)

Change that breaks with predominant ideologies and historical cultural myths requires more than individual changes of opinion:  it needs shifts in cultural belief and practice, and a willingness to learn from history. The odds are stacked against it, for reasons Pulitzer prize winning war correspondent Chris Hedges describes in War is a Force That Gives Us Meaning (2014):

“Every society, ethnic group or religion nurtures certain myths, often centered around the creation of the nation or the movement itself. These myths lie unseen beneath the surface, waiting for the moment to rise ascendant, to define and glorify followers or members in times of crisis. National myths are largely benign in times of peace…. They do not pose a major challenge to real historical study or a studied tolerance of others in peacetime.

“But national myths ignite a collective amnesia in war. They give past generations a nobility and greatness they never possessed…. They are stoked by the entertainment industry, in school lessons, stories, and quasi-historical ballads, preached in mosques, or championed in absurd historical dramas that are always wildly popular during war.

“Almost every group, and especially every nation, has such myths. These myths are the kindling nationalists use to light a conflict.

“Archeology, folklore, and the search for what is defined as authenticity are the tools used by nationalists to assail others and promote themselves. They dress it up as history, but it is myth.

“Real historical inquiry, in the process, is corrupted, assaulted, and often destroyed. Facts become interchangeable as opinions. Those facts that are inconvenient are discarded or denied. The obvious inconsistencies are ignored by those intoxicated with a newly found sense of national pride, and the exciting prospect of war.”

All of this makes the Business Roundtable’s Statement on the Purpose of a Corporation and the World Economic Forum’s Davos Manifesto (we looked at them last time) all the more remarkable, since they defy four decades of the prevailing economic myth that “The [sole] social responsibility of business is to increase its profits.”

On the other hand, a recent administrative order imposing work requirements on food stamps recipients offers an equally remarkable example of myth-driven policy-making. According to ABC News (Dec. 4, 2019), proponents say the move will “restore the dignity of work to a sizable segment of our population” — clearly a nod to the cultural myth that anybody with enough gumption (and enough education, funded by the newly nationalized student loan industry) can work their way out of poverty, and if they don’t, it’s their own fault. As we’ve seen, data to support this way of thinking has long been absent, but the myth prevails, and never mind that “all the rule change does is strip people from accessing the benefit,” that the food stamp program “is intended to address hunger and not compel people to work,” and that “those affected are impoverished, tend to live in rural areas, often face mental health issues and disabilities.”

Economism was published on January 10, 2017, just shy of three years ago as I write this. Today’s “Reckoning” post was inspired by a Time Magazine cover story last month:  How the Elites Lost Their Grip: in 2019, America’s 1% behaved badly and helped bring about a reckoning with capitalism, Time Magazine (Dec. 2-9, 2019). We’ll look at what it says about economic reckoning next time.

[1] Etymology Online.

Belief in the Free Market

1909 painting The Worship of Mammon by Evelyn De Morgan.
https://en.wikipedia.org/wiki/Mammon

We saw last time that Milton Friedman and his colleagues at the Chicago School of Economics promoted the free market with fundamentalist zeal — an approach to economics that Joseph Stiglitz said was based on “religious belief.” Turns out that using religious-sounding language to talk about believing in capitalism isn’t as farfetched as it sounds on first hearing.

In the history of ideas, the “Disenchantment” refers to the notion that the Enlightenment ushered in an era when scientific knowledge would displace religious and philosophical belief. Reason, rationality, and objectivity would make the world less magical, spiritual, and subjective, and therefore “disenchanted.” You don’t need to know much history to know the Disenchantment never really played out — at least, certainly not in America.

“Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. What’s problematic is going overboard—letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts. The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts. Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us. And most of us haven’t realized how far-reaching our strange new normal has become.

“Why are we like this?

“The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned.

“America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful—but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem’s hunting witches to Joseph Smith’s creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump. In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes ’60s and the internet age. The result is the America we inhabit today, with reality and fantasy weirdly and dangerously blurred and commingled.”

Fantasyland:  How America Went Haywire, a 500-Year History, Kurt Andersen (2017)[1]

Villanova professor Eugene McCarraher makes the case that capitalism stepped up to fill the belief void created by Disenchantment enthusiasts, and became the new world religion.

“Perhaps the grandest tale of capitalist modernity is entitled ‘The Disenchantment of the World’. Crystallised in the work of Max Weber but eloquently anticipated by Karl Marx, the story goes something like this: before the advent of capitalism, people believed that the world was enchanted, pervaded by mysterious, incalculable forces that ruled and animated the cosmos. Gods, spirits and other supernatural beings infused the material world, anchoring the most sublime and ultimate values in the ontological architecture of the Universe.

“In premodern Europe, Catholic Christianity epitomised enchantment in its sacramental cosmology and rituals, in which matter could serve as a conduit or mediator of God’s immeasurable grace. But as Calvinism, science and especially capitalism eroded this sacramental worldview, matter became nothing more than dumb, inert and manipulable stuff, disenchanted raw material open to the discovery of scientists, the mastery of technicians, and the exploitation of merchants and industrialists.

“Discredited in the course of enlightenment, the enchanted cosmos either withered into historical oblivion or went into the exile of private belief in liberal democracies…. With slight variations, ‘The Disenchantment of the World’ is the orthodox account of the birth and denouement of modernity, certified not only by secular intellectuals but by the religious intelligentsia as well.”

Mammon:  Far from representing rationality and logic, capitalism is modernity’s most beguiling and dangerous form of enchantment, Aeon Magazine (Oct. 22, 2019)

Prof. McCarraher develops his ideas further in his book The Enchantments of Mammon: How Capitalism Became the Religion of Modernity (2019). This is from the Amazon book blurb:

“If socialists and Wall Street bankers can agree on anything, it is the extreme rationalism of capital. At least since Max Weber, capitalism has been understood as part of the “disenchantment” of the world, stripping material objects and social relations of their mystery and sacredness. Ignoring the motive force of the spirit, capitalism rejects the awe-inspiring divine for the economics of supply and demand.

“Eugene McCarraher challenges this conventional view. Capitalism, he argues, is full of sacrament, whether or not it is acknowledged. Capitalist enchantment first flowered in the fields and factories of England and was brought to America by Puritans and evangelicals whose doctrine made ample room for industry and profit. Later, the corporation was mystically animated with human personhood, to preside over the Fordist endeavor to build a heavenly city of mechanized production and communion. By the twenty-first century, capitalism has become thoroughly enchanted by the neoliberal deification of ‘the market.’”

Economic theories — capitalism, Marxism, socialism — are ideologies:  they’re based on ideas that can’t be proven scientifically; they require belief. The reason thinkers like Kurt Andersen and Eugene McCarraher both use the term “dangerous” in connection with economic belief is because of the fundamentalist dynamics that invariably accompany ideological belief, secular or otherwise. We’ll look at that next time.

[1] The book is another case of American history as we never learned it. For the shorter version, see this Atlantic article.

Economic Fundamentalism

We saw last time that the goal of Chicago School free market economics was to promote “noncontaminated capitalism,” which in turn would generate societal economic utopia:

“The market, left to its own devices, would create just the right number of products at precisely the right prices, produced by workers at just the right wages to buy those products — an Eden of plentiful employment, boundless creativity and zero inflation.”

The Shock Doctrine:  The Rise of Disaster Capitalism, Naomi Klein (2017)

To the School’s free market advocates, these ideas were pure science:

“The starting premise is that the free market is a perfect scientific system, one in which individuals, acting on their own self-interested desires, create the maximum benefits for all. It follows ineluctably that if something is wrong with a free-market economy — high inflation or soaring unemployment — it has to be because the market is not truly free.”

The Shock Doctrine

Scientific method requires that theories be falsifiable:  you have to be able to objectively prove them wrong.

“The philosopher Karl Popper argued that what distinguishes a scientific theory from pseudoscience and pure metaphysics is the possibility that it might be falsified on exposure to empirical data. In other words, a theory is scientific if it has the potential to be proved wrong.”

But Is It Science? Aeon Magazine, Oct. 7, 2019.
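Popper’s criterion can be phrased almost mechanically: a scientific theory commits to predictions that some possible observation could contradict. A toy sketch in Python (hypothetical predicates invented for illustration, not actual economic models):

    # Toy illustration of falsifiability -- hypothetical example only.
    def falsifiable_theory(observed_inflation: float) -> bool:
        """Commits to a testable prediction: inflation stays below 5%."""
        return observed_inflation < 5.0   # an observation of 12% refutes it

    def unfalsifiable_theory(observed_inflation: float) -> bool:
        """Accommodates any outcome: failures just mean the market 'wasn't truly free'."""
        return True                       # no observation can ever refute it

    for inflation in (2.0, 12.0):
        print(f"inflation {inflation}%: falsifiable theory "
              f"{'survives' if falsifiable_theory(inflation) else 'is refuted'}; "
              f"unfalsifiable theory survives: {unfalsifiable_theory(inflation)}")

The second predicate mirrors the Chicago School move quoted above: whatever goes wrong, the explanation is that the market was not yet free enough, so no data can ever count against the theory.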

But how do you prove an economic theory based on “uncontaminated capitalism” in an economically contaminated world?

“The challenge for Friedman and his colleagues was not to prove that a real-world market could live up to their rapturous imaginings…. Friedman could not point to any living economy that proved if all ‘distortions’ were stripped away, what would be left would be a society in perfect health and bounteous, since no country in the world met the criteria for perfect laissez-faire. Unable to test their theories in central banks and ministries of trade, Friedman and his colleagues had to settle for elaborate and ingenious mathematical equations and computer models.”

The Shock Doctrine

Mathematical equations and computer models aren’t the same as empirical data collected in the real (“contaminated”) world. If falsifiability is what separates scientific knowledge from belief-based ideology, then Friedman’s free market theory is the latter. Some scientists are worried that this spin on scientific theorizing has become too prevalent nowadays:

 “In our post-truth age of casual lies, fake news and alternative facts, society is under extraordinary pressure from those pushing potentially dangerous antiscientific propaganda – ranging from climate-change denial to the anti-vaxxer movement to homeopathic medicines. I, for one, prefer a science that is rational and based on evidence, a science that is concerned with theories and empirical facts, a science that promotes the search for truth, no matter how transient or contingent. I prefer a science that does not readily admit theories so vague and slippery that empirical tests are either impossible or they mean absolutely nothing at all…. For me at least, there has to be a difference between science and pseudoscience; between science and pure metaphysics, or just plain ordinary bullshit.”

But Is It Science?

The Chicago School believed so ardently in the free market theory that its instructional approach took on the dynamics of belief-based indoctrination:

“Frank Knight, one of the founders of Chicago School economics, thought professors should ‘inculcate’ in their students the belief that economic theory is ‘a sacred feature of the system,’ not a debatable hypothesis.”

The Shock Doctrine

This dynamic applies to every ideology that can’t be falsified — tested empirically. The ideology then becomes a fundamentalist belief system:

“Like all fundamentalist faiths, Chicago School economics is, for its true believers, a closed loop. The Chicago solution is always the same:  a stricter and more complete application of the fundamentals.”

The Shock Doctrine

Journalist Chris Hedges describes the dynamics of “secular fundamentalism” in I Don’t Believe in Atheists. (The book’s title is too clever for its own good — a later version adds the subtitle “The Dangerous Rise of the Secular Fundamentalist.”)

“Fundamentalism is a mind-set. The iconography and language it employs can be either religious or secular or both, but because it dismisses all alternative viewpoints as inferior and unworthy of consideration it is anti-thought. This is part of its attraction. It fills a human desire for self-importance, for hope and the dream of finally attaining paradise. It creates a binary world of absolutes, of good and evil. It provides a comforting emotional certitude. It is used to elevate our cultural, social, and economic systems above others…. The core belief systems of these secular and religious antagonists are identical.”

Thus we have Nobel prize-winning economist Milton Friedman famously saying, “Underlying most arguments against the free market is a lack of belief in freedom itself” — a statement entirely in keeping with the Mont Pelerin Society’s idealistic Statement of Aims, which we looked at last time.

And thus we also have Nobel prize-winning economist Joseph Stiglitz countering with his thoughts about economics in a contaminated (“pathological”) world:

“The advocates of free markets in all their versions say that crises are rare events, though they have been happening with increasing frequency as we change the rules to reflect beliefs in perfect markets. I would argue that economists, like doctors, have much to learn from pathology. We see more clearly in these unusual events how the economy really functions. In the aftermath of the Great Depression, a peculiar doctrine came to be accepted, the so-called ‘neoclassical synthesis.’ It argued that once markets were restored to full employment, neoclassical principles would apply. The economy would be efficient. We should be clear: this was not a theorem but a religious belief.”

As we also saw last time, historical socialism and communism join free market capitalism in their fundamentalist zeal. In fact, some think that economics in general has become today’s dominant cultural form of belief-based thinking. More on that next time.

If You Like This, You Might Like…


I created a new blog. I want to tell you about it, and invite you to follow it.

I’ve spent the past ten years writing books, blogs, and articles on technology, jobs, economics, law, personal growth, cultural transformation, psychology, neurology, fitness and health… all sprinkled with futurism. In all those seemingly unrelated topics, I’ve been drawn to a common theme:  change. One lesson stands out:

Beliefs create who we are individually and collectively.
The first step of change is to be aware of them.
The second step is to leave them behind.

Beliefs inform personal and collective identity, establish perspective, explain biases, screen out inconsistent information, attract conforming experience, deflect non-conforming information and experience, and make decisions for us that we only rationalize in hindsight. Those things are useful:  they tame the wild, advance civilization, help us locate our bewildered selves and draw us into protective communities.

We need that to survive and thrive.  But if we’re after change, beliefs can be too much of a good thing. They make us willfully blind, show us only what we will see and hide what we won’t. They build our silos, sort us into polarities, close our minds, cut us off from compassion, empathy, and meaningful discourse.


We need to become iconoclasts.

The Online Etymology Dictionary says that “iconoclast” originally meant “breaker or destroyer of images,” referring to the religious zealots who vandalized icons in Catholic and Orthodox churches because they considered them “idols.” Later, the meaning broadened to “one who attacks orthodox beliefs or cherished institutions.”

Our beliefs are reflected, transmitted, and reinforced in our religious, national, economic, and other cultural institutions. These become our icons, and we cherish them, invest them with great dignity, revere them as divine, respect them as Truth with a capital T, and fear their wrath if we neglect or resist them. We confer otherworldly status on them, treat them as handed down from an untouchable level of reality that supersedes our personal agency and self-efficacy. We devote ourselves to them, grant them unquestioned allegiance, and chastise those who don’t bow to them alongside us.

Doing that, we forget that our icons only exist because they were created out of belief in the first place. In the beginning, we made them up. From there, they evolved with us. To now and then examine, challenge, and reconfigure them and the institutions that sustain them is an act of creative empowerment — one of the highest and most difficult gifts of being human.

Change often begins when that still small voice pipes up and says, “Maybe not. Maybe something else is possible.” We are practiced in ignoring it; becoming an iconoclast requires that we listen, and question the icons that warn us not to. Thinking back to the word’s origins, I like “challenge” better than “attack.” I’m not an attacker by nature; I’m an essayist, a reflective, slow thinker who weighs things and tries to make sense of them. I’m especially not a debater or an evangelist: I’m not out to convince or convert anyone, and besides, I lack the quick-thinking mental skillset.

I’m also not an anarchist, libertarian, or revolutionary… not even a wannabe Star Wars rebel hero, cool as that sounds. I was old enough in the ’60s to party at the dawning of the Age of Aquarius, but then it failed like all the other botched utopias, exposed as one more bogus roadmap claiming to chart the way back to the Garden.

Sorry, but the Garden has been closed for a long, long time.


A friend used to say, “Some open minds ought to close for business.” Becoming an iconoclast requires enough open-mindedness to suspend the status quo long enough to consider that something else is possible. That isn’t easy, but it is the essential beginning of change, and it can be done.

Change needs us to be okay with changing our minds.

All of the above is what I had in mind when I created Iconoclast.blog. I’m aware of its obvious potential for inviting scoffing on a good day, embarrassment and shaming on a worse one, and vituperation, viciousness, trolling, and general spam and nastiness on the worst. (Which is why I disabled comments on the blog and instead set up a Facebook page that offers ample raving opportunity.) Despite those risks, I plan to pick up some cherished icons and wonder out loud what might be possible in their absence. If you’re inclined to join me, please click the follow button. I would enjoy the company.

There’s No Such Thing as a Free Lunch — True or False?

We can assume that the pros and cons of a universal basic income (UBI) have been thoroughly researched and reasonably analyzed, and that each side holds its position with utmost conviction.

We can also assume that none of that reasonableness and conviction will convert anyone from one side to the other, or win over the uncommitted. Reason doesn’t move us: we use it to justify what we already decided, based on what we believe. See “Why Facts Don’t Change Our Minds,” The New Yorker (February 2017), and “This Article Won’t Change Your Mind,” The Atlantic (March 2017).

History doesn’t guide us either — see “Why We Refuse to Learn From History” from Big Think, and Why Don’t We Learn From History? from military historian Sir Basil Henry Liddell Hart. The latter is full of conventional wisdom:

“The most instructive, indeed the only method of learning to bear with dignity the vicissitude of fortune, is to recall the catastrophes of others.

“History is the best help, being a record of how things usually go wrong.

“There are two roads to the reformation for mankind— one through misfortunes of their own, the other through the misfortunes of others; the former is the most unmistakable, the latter the less painful.

“I would add that the only hope for humanity, now, is that my particular field of study, warfare, will become purely a subject of antiquarian interest. For with the advent of atomic weapons we have come either to the last page of war, at any rate on the major international scale we have known in the past, or to the last page of history.”

Good advice, maybe, but we’ve heard it before, and besides, most of us would rather make our own mistakes.

If reasoned analysis and historical perspective don’t inform our responses to radically new ideas like UBI, then what does? Many things, but cultural belief is high on the list. Policy is rooted in culture, culture is rooted in shared beliefs, and beliefs are rooted in history. Cultural beliefs shape individual bias, and the whole belief system becomes sacred in the culture’s mythology. Try to subvert cultural beliefs, and the response is outrage and entrenchment.

All of which means that each of us probably had a quick true-or-false answer to the question in this week’s blog post title, and was ready to defend it with something that sounded reasonable. Our answer likely signals our knee-jerk response to the idea of UBI. The “free lunch” issue (or, more accurately, the “free money” issue) appears to be the UBI Great Divide: get to that point, and you’re either pro or con; there’s no neutral option. (See this for more about where the “no free lunch” phrase came from.[1])

The Great Divide is what tanked President Nixon’s UBI legislation. The plan, which would have paid a family of four $1,600 a year (equivalent to $10,428 today), was set to launch amid an outpouring of political self-congratulation and media endorsement, only to be scuttled by a memo from a White House staffer describing the failure of a British UBI experiment 150 years earlier. UBI, the memo implied, was in fact a free lunch with no redeeming social purpose, and thus its fate was sealed.
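(A quick sanity check on that conversion, assuming the standard consumer-price-index adjustment; the roughly 6.5 multiplier is my approximation of cumulative U.S. CPI inflation from 1969, when the plan was proposed, to the mid-2010s, when the $10,428 conversion was presumably made:

$\$1{,}600 \times \dfrac{\mathrm{CPI}_{\text{today}}}{\mathrm{CPI}_{1969}} \approx \$1{,}600 \times 6.5 \approx \$10{,}400$

which lands within rounding distance of the quoted figure; the exact multiplier depends on which “today” the conversion uses.)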

As it turns out, whether the experiment actually failed was lost in a 19th Century fog of cultural belief, which enabled opponents of the experiment to pounce on a bogus report about its impact to justify passing the Poor Law Amendment Act of 1834 (which is what they wanted to do anyway). The new Poor Law was that era’s version of workfare, generated by the worst kind of scarcity mentality applied to the worst kind of scarcity. Besides creating the backdrop to Charles Dickens’ writing, the new Poor Law’s philosophical roots still support today’s welfare system:

“The new Poor Law introduced perhaps the most heinous form of ‘public assistance’ that the world has ever witnessed. Believing the workhouses to be the only effective remedy against sloth and depravity, the Royal Commission forced the poor into senseless slave labor, from breaking stones to walking on treadmills….”

From “The Bizarre Tale Of President Nixon’s Basic Income Plan.”

If UBI is a free lunch, then it’s an affront to a culture that values self-sufficiency. If it isn’t, then it requires a vastly different cultural value system to support it. The former believes that doing something (“making a living” at a job) is how you earn your daily bread. The latter believes you’re entitled to sustenance if you are something: i.e., a citizen or member of the nation, state, city, or other institution or community providing the UBI. The former is about activity; the latter is about identity. This Wired article captures the distinction:

“The idea [of UBI] is not exactly new—Thomas Paine proposed a form of basic income back in 1797—but in this country, aside from Social Security and Medicare, most government payouts are based on individual need rather than simply citizenship.”

UBI is about “simply citizenship.” It requires a cultural belief that everybody in the group shares its prosperity. Cultural identity alone ensures basic sustenance; it’s a right, and that right makes Poor Laws and workfare obsolete.

The notion of cultural identity invites comparison between UBI and the “casino money” some Native American tribes pay their members. How’s that working? We’ll look at that next time.

[1] Yes, Milton Friedman did in fact say there’s no such thing as a free lunch, although he wasn’t the only one. And in a surprising twist, he has also been criticized for advocating his own version of UBI.