Losing My Religion, by Michael Bigelow

From Elder in the Jehovah’s Witnesses religion to proponent of scientific naturalism, by Michael Bigelow

In several of my books I have recounted my own journey from born-again Christian to religious skeptic, in the context of understanding how beliefs are formed and change (in The Believing Brain), how religious and faith-based beliefs differ (or at least should differ) from scientific and empirical beliefs (in Why Darwin Matters), and the relationship of science and religion: the same-worlds, separate-worlds, and conflicting-worlds models (in Why People Believe Weird Things). As a result, over the years I have received a considerable amount of correspondence from Christians who want to convince me to come back to the faith, along with one-time believers who recount their own pathway to non-belief. After emails revealing autobiographical fragments of his own loss of faith, and at my urging, guest contributor Michael Bigelow narrates in this edition of Skeptic his sojourn from Elder in the Jehovah’s Witnesses religion to proponent of scientific naturalism. As he recalls in this revealing passage from the essay below:

A literal interpretation of the Bible proved incorrect. Humans were not created in 4026 BC, nor was the earth engulfed by a flood 4,400 years ago. I was wrong. My personal discovery categorically rendered the Bible’s account of natural history as false. This revelation cast doubt on the entirety of the Bible.

—Michael Shermer

Losing My Religion

In 2008, I faced one of the most uncomfortable moments of my life. For Jehovah’s Witnesses, the memorial of Jesus’ death is the most significant event of the year. It is a solemn occasion in which a respected Elder addresses a packed Kingdom Hall, filled with believers and visitors. That year, I delivered the talk and managed the ritual passing of the wine and unleavened bread. Many attendees praised me afterward, claiming it was clear that God’s spirit was upon me. However, for nearly four years before that evening, I had ceased praying and believing and was deeply troubled by the hypocrisy of teaching things I could no longer accept as true.

An even more distressing day occurred in 2012 when I publicly renounced my ties with Jehovah’s Witnesses. A former friend described my departure as a “nuclear blast” that devastated the three congregations I had once served. My decision to leave based on conscience resulted in immediate and complete shunning by all my former friends, my family, and even my two adult sons.

Turning Points

I was born into a large extended family of Jehovah’s Witnesses in 1961 in San Diego, California. Our family of eight led a life typical for Witness households: we didn’t celebrate holidays or birthdays, and we didn’t participate in patriotic events or after-school activities. My entire social circle was within the religious community, and our Saturdays were dedicated to door-to-door preaching. This lifestyle felt completely normal to me as a child.

During the 1960s, Jehovah’s Witnesses were taught that Armageddon was imminent, and the organization strongly suggested it would occur in 1975. We believed that on that day God would destroy all who were not part of our faith, which created a deep sense of urgency to save not just ourselves but others. Driven by this belief, my father moved our family from the comfort of Southern California to frugal living in Northern New England, aiming to reach an underserved community with our teachings.

In New Hampshire, at our new Kingdom Hall, I met the young lady who would become my wife. We were just twelve years old, both deeply committed to our faith, known among Jehovah’s Witnesses as “The Truth,” and we knew we would eventually wed. Despite our parents’ efforts to keep us apart, it seemed inevitable that we would be together. After graduating from high school, we married young, securing minimum wage jobs and making ends meet with second-hand furniture and tight budgets.

I progressed through various positions of responsibility, both in our congregation and at the manufacturing company where I worked. Together, we raised and homeschooled our two sons, continually striving to live up to the commitments of our dedication to God.

In 1991, I was appointed as an Elder in our local congregation, a role laden with significant responsibilities. My duties included teaching at congregation meetings, providing guidance to those struggling, leading preaching efforts, and speaking at large conventions. Following in my father’s footsteps, I established a reputation as a dedicated minister. Despite my commitment, I harbored private doubts. Natural disasters and biblical accounts like the Noachian flood, which seemed improbable, troubled me. I was also disturbed by the notion of God allowing Satan to corrupt His perfect creation. We were taught to manage such doubts through prayer and meditation, a strategy that sufficed until innovations like Google Earth introduced new perspectives that challenged my views further.

Deconversions

Accounts of shunning, deconversion, and abandoning supernatural beliefs are increasingly common today. While my story isn’t unique in its occurrence, it is distinct in its unfolding. Many are leaving Jehovah’s Witnesses due to the organization’s strict control, unfulfilled prophecies, evolving doctrines, prohibition of blood transfusions, biased translations of the Bible, and mishandling of child sexual abuse cases.

These are valid reasons to leave, but my departure was driven by something else. When I realized the biblical narrative of natural history couldn’t possibly be true, my entire belief system collapsed. Yet, I continued to serve as a teacher, shepherd, and public figure in the organization for five more years, knowing I was an atheist. This period was a personal torment for which I still feel remorse. Looking back, I can’t see how I could have chosen differently. Here is how it all unfolded.

I have always had a profound love for the outdoors, and spending time in the mountains has been a significant part of my life. In the late 1960s, my grandparents took my brothers and me on a road trip from San Diego to the Sierra Nevada mountains in Central California. The majestic, snow-capped, rugged terrain captivated my imagination. This experience left a lasting impression, and by the mid-1980s I began organizing annual backpacking and climbing trips to the Sierras. Each spring, I would meticulously plan these trips from New Hampshire, poring over maps, guidebooks, and equipment lists.

By the early 1990s, the Palisade range of the Sierra had become my personal sanctuary, notable for the range’s largest active glacier. The first time I observed the glacier from an elevated viewpoint, I could see that it had been shrinking. At the glacier’s base lay a horseshoe-shaped moraine nearly a hundred feet high. Below the moraine’s rim, a glacial pond of milky, silty water had formed, scattered with broken granite and ice. This observation troubled me; something significant was amiss, yet it remained just beyond my understanding.

Palisade Glacier. Photo by the author.

Day-age fundamentalist Christians believe that the entire planet was submerged under water 4,400 years ago during Noah’s flood. Thus, every existing landform—whether a canyon, glacier, desert, cavern, or mountain—either existed under water at that time or formed naturally afterward. I didn’t contemplate these ideas when I first saw that glacier or when I climbed the 14,000-foot mountains surrounding it. Yet a seed of new doubt was planted.

By the late 1990s, I had a new tool for planning my trips to the Sierra: Google Earth. This technology allowed me to view satellite imagery of the entire mountain range. I could meticulously plan climbing routes, select camping spots, and observe glaciers—not only those that were still active but, more intriguingly, those that had vanished. The disappearance of glaciers suggested ice ages, a concept I was not ready to accept because it contradicted Jehovah’s Witnesses’ teachings, which deny such geological periods.

Business Interlude and Deep Questioning of the Faith

In 1999, the company I had been with since my teenage years offered me a job in Asia. At that time, I was managing two of their operations in New England. They had recently acquired a company in Taiwan and wanted me to oversee their manufacturing in China. My wife and I deliberated over this opportunity, with our primary concern being the ability to maintain our spiritual commitments and contribute to a local congregation in Taiwan. After reaching out to the headquarters of our religious organization, we learned there was an English-speaking group in the city we would be moving to, and they welcomed our participation. Encouraged by this, we decided to relocate.

Upon moving to Taiwan, I soon realized that I would need to learn Mandarin Chinese to succeed in my role. Motivated by this challenge, I dedicated myself to studying with an intensity I had never shown before. In high school, my focus had been on my future wife rather than academics, making me a lackluster student. In Taiwan, however, I quickly learned Mandarin and developed effective study habits that changed my life’s direction. This deep dive into the language not only helped in my immediate job but also enabled me and my business partners to eventually acquire the Asian company, securing our financial future. Working away from my family also gave me the private space and time to research and reflect deeply on significant topics.

In the early 2000s, while living in Asia, I continued planning trips to the Sierra Nevada. By then, Google Earth’s satellite imagery had greatly improved, allowing me to see individual boulders and trees. This tool became indispensable for both planning excursions and simply enjoying the landscapes from afar. Around 2003, a particular land feature near Bishop, California, caught my attention and profoundly shifted my perspective. There, a small river emerges from the high country and runs through a wide, empty glacial moraine into the arid Owens Valley (see Google Earth image below). The moraine, a pristine trench once filled by a glacier, is starkly visible, stretching nearly to the desert floor. This observation challenged my previous beliefs: it seemed highly unlikely that this landform was ever submerged underwater or formed shortly after a flood. As I reviewed images from all the earth’s great mountain ranges, I found similar features. This realization opened a floodgate of curiosity and skepticism about the traditional narratives I had accepted.

Religious Dogma vs. Carbon Dating

The first research book I purchased was Glaciers of California: Modern Glaciers, Ice Age Glaciers, the Origin of Yosemite Valley, and a Glacier Tour in the Sierra Nevada by Bill Guyton. This book ignited a thirst for knowledge that grew exponentially. Studying glaciers led me to explore broader geology, which in turn introduced me to plate tectonics and scientific dating methods. These concepts opened the door to pre-history and the works of scholars like Jared Diamond, Steven Mithen, and many others. As I delved deeper, consuming books, downloading scientific papers, and visiting field sites, I was desperately seeking any evidence to affirm the Bible’s accounts of natural history. Internally, I struggled with my faith-based commitment that “I can’t be wrong,” but the mounting evidence made me fear that I was losing the argument against established scientific consensus.

As I delved into pre-history, I frequently encountered carbon dating—a method I had been taught to distrust. From the 1960s until the early 1990s, Jehovah’s Witnesses employed pseudo-scientific arguments to discredit the reliability of carbon dating. Reflecting on these apologetics with a better understanding of logical fallacies and flawed reasoning, I now recognize those arguments as circular, appealing to authority, and rooted in motivated reasoning. Despite my resistance, the evidence supporting carbon dating seemed overwhelming. In my quest to align my beliefs with factual accuracy, I had to personally validate carbon dating’s efficacy. I came across a statement from Carl Sagan, who said, “When you make the finding yourself—even if you’re the last person on Earth to see the light—you’ll never forget it.” This sentiment resonated with me deeply; I had to experience this realization firsthand. Sagan was right—I will never forget the moment I accepted the truth of carbon dating.

During the early 2000s, part of my research turned to the peopling of the Americas, a captivating area of archaeology that held particular significance for me at the time. According to the 17th-century biblical chronologist James Ussher, humans were created from dirt in 4004 BC, specifically on October 22. Jehovah’s Witnesses adopt a similar timeline, placing human creation at 4026 BC. Arriving at Ussher’s date involves counting forward or backward from known events, using the recorded ages of biblical kings and patriarchs. This chronology is accepted as accurate by many biblical literalists. However, if evidence showed that the Americas were populated thousands of years before these dates, it would profoundly challenge this timeline and compel me to reconsider my beliefs further.
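To see how such a chronology is assembled, here is a minimal sketch of the counting method, assuming the begetting ages given in the Masoretic text of Genesis 5 (the code and variable names are illustrative only; literalist chronologies differ in their anchor events and adjustments):

```python
# Illustrative patriarch-chronology arithmetic (Masoretic Genesis 5 figures).
# Each entry: (patriarch, his age when the next patriarch in the line was born).
begetting_ages = [
    ("Adam", 130), ("Seth", 105), ("Enosh", 90), ("Kenan", 70),
    ("Mahalalel", 65), ("Jared", 162), ("Enoch", 65),
    ("Methuselah", 187), ("Lamech", 182),
]

years_to_noah = sum(age for _, age in begetting_ages)  # 1,056 years after creation
years_to_flood = years_to_noah + 600                   # Noah was 600 at the flood (Genesis 7:6)

CREATION_BC = 4026                       # the Jehovah's Witnesses' creation date
flood_bc = CREATION_BC - years_to_flood  # ~2370 BC

print(f"Creation to flood: {years_to_flood} years; flood around {flood_bc} BC")
```

Run on these figures, the count places the flood around 2370 BC, which is where the “4,400 years ago” figure cited earlier in this essay comes from.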

As I delved into the peopling of the Americas, I discovered that the ash and pumice layer from the eruption of Mt. Mazama (now Crater Lake) serves as a precise stratigraphic marker. At the Paisley Caves and Fort Rock Cave, archaeologists found human artifacts both within and beneath the Mt. Mazama volcanic tephra layer. These artifacts included campfire remains, hand-woven sagebrush sandals, grinding stones, projectile points, basketry, cordage, human hair, and the butchered remains of now-extinct animals, such as camelids and equids. Many of these artifacts were carbon dated, with results ranging from 9,100 to 14,280 years before present (BP). Therein lies a hurdle—those pesky carbon dating references. I struggled to reconcile these dates with my previous beliefs, as they suggested human presence in the Americas long before the biblical timeline of human creation.

The abundant artifacts found within and beneath the debris from Mt. Mazama prompted researchers to pinpoint the eruption’s date more accurately. In 1983, Charles Bacon estimated that the eruption occurred around 6,845 ± 50 years BP, using the beta counting method of carbon dating on burned wood samples found in Mazama’s lava flows. A more refined date was published in 1996 by D.J. Hallett, who dated the eruption to approximately 6,730 ± 40 years BP. This estimate utilized the more advanced Accelerator Mass Spectrometry (AMS) carbon dating technique on burned leaves and twigs mixed with Mazama tephra in nearby lakebed sediments. Despite the improved methodology, my skepticism persisted because the result still relied on carbon dating, a technique I was reluctant to trust fully.
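For the curious, both beta counting and AMS ultimately estimate how much carbon-14 remains in a sample; the age then follows from simple exponential decay. Here is a minimal sketch of that arithmetic, using the standard Libby mean-life convention of 8,033 years for conventional radiocarbon ages (the sample value below is invented for illustration):

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; the convention behind "conventional radiocarbon ages"

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional radiocarbon age (years BP) from the measured carbon-14
    content of a sample, expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining about 43.3% of its original carbon-14 yields an age
# close to the Mazama radiocarbon dates discussed above:
print(round(radiocarbon_age(0.433)))  # ~6,724 years BP
```

Because atmospheric carbon-14 levels have varied over time, radiocarbon ages must then be calibrated against independent records such as tree rings to yield calendar years. This is why a radiocarbon age near 6,730 BP corresponds to an eruption roughly 7,600 calendar years ago, as the ice-core work described next confirmed.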

The quest for a more accurate date of the Mt. Mazama eruption led to significant advancements in 1999 when C.M. Zdanowicz and his team published a paper with a revised eruption date. They leveraged the precise nature of annual layers in Greenland Ice Cores, hypothesizing that they could pinpoint a near absolute year for the eruption by identifying Mazama’s volcanic signatures within the ice. Starting with calibrated carbon dates from previous research as a baseline, they sampled layers above and below the target area, searching for traces of Mazama.

The team found volcanic glass and other chemical markers consistent with those found near the eruption site. Zdanowicz published a date range of 7,545 to 7,711 years before present, aligning closely with previous carbon dating results. This discovery was a profound moment of humility and awakening for me; the precision of carbon dating not only pinpointed the location of Mazama tephra in the Greenland ice core but also demonstrated the reliability of this dating method. It confirmed what many scientists had long understood: carbon dating is a powerful tool for establishing historical timelines, and these were in direct conflict with my religious beliefs.
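Conceptually, the matching procedure is straightforward even though the laboratory work is not: start from the layer implied by the calibrated radiocarbon baseline, then scan the annual layers above and below it for the eruption’s chemical fingerprint. A toy sketch of that search (all data values below are invented for illustration):

```python
# Toy search for a volcanic horizon in counted ice-core layers.
# Each record: (layer-counted years before present, volcanic-marker signal,
# e.g., sulfate concentration or counts of tephra glass shards).
layers = [(7450, 0.4), (7500, 0.6), (7550, 0.5), (7627, 9.8), (7700, 0.7), (7750, 0.5)]

baseline_bp = 7600  # starting estimate from calibrated radiocarbon dates
window = 150        # search this many layer-counted years above and below

candidates = [(bp, sig) for bp, sig in layers if abs(bp - baseline_bp) <= window]
eruption_bp, signal = max(candidates, key=lambda rec: rec[1])
print(f"Strongest volcanic signature at ~{eruption_bp} years BP (signal {signal})")
```

The real analysis of course required chemically matching the glass and aerosols to Mazama rather than simply finding a peak, but the logic of anchoring a search window with one dating method and refining it with another is the same.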

The End of the End

A literal interpretation of the Bible proved incorrect. Humans were not created in 4026 BC, nor was the earth engulfed by a flood 4,400 years ago. I was wrong. My personal discovery categorically rendered the Bible’s account of natural history as false. This revelation cast doubt on the entirety of the Bible. When biblical authors wrote of a literal flood and Adam and Eve as the first humans, they were unaware of their inaccuracies. This prompted me to investigate the origins of the Old Testament. I concluded that this collection of books was crafted to forge a grand narrative, one that provided the people of Israel with a national identity and a distinguished status before God as His chosen people, dating back to the creation of the first humans.

If there was a definitive End of Faith date for me, it would be December 26, 2004. Witnessing the catastrophic effects of the Sumatra earthquake and tsunamis, and having experienced another earlier and massive earthquake in Taiwan firsthand, I was deeply shaken. During a period when I was already grappling with new and challenging information, I saw our volatile planet claim hundreds of thousands of lives. This led me to a stark realization: “This is God’s planet. Either He caused this, or He allowed it to happen.” Just days after the disaster, I considered a third, more profound possibility: God does not exist. He didn’t cause the disaster nor did He allow it; He simply isn’t there. With this realization, my constant wondering, doubting, and blaming ceased. The peace I found in accepting this personal truth is indescribable.

Despite realizing that truth, fear of the unknown future and the potential devastation to my loved ones and their trust in me as a teacher and shepherd kept me living a lie. For many more years, I endured the heavy burden of this deceit, which led to terrifying, public panic attacks, some of which occurred before large audiences. This period was marked by intense internal conflict as I struggled to reconcile my public persona with my private understanding. Although it took years, I eventually had to leave the religion.

The Aftermath

Rejecting the Bible, which had been the cornerstone of my faith, propelled me toward scientific skepticism. Like many before me, I was drawn to the writings of Michael Shermer and works featured in Skeptic magazine. My departure from biblical teachings spurred me to explore questions about belief, the brain, and supernatural claims. The insights of thinkers like Carl Sagan, Richard Dawkins, Sam Harris, Daniel Dennett, Bertrand Russell, Guy P. Harrison, Robert Green Ingersoll, and Thomas Paine solidified my embrace of scientific naturalism. Their eloquent articulations reinforced and expanded upon the truths I had come to recognize on my own.

But what of my life now? What about my former hope of living forever on a paradisiacal earth? What of my loved ones and my marriage? I have witnessed many who have left their faith struggle to cope with the reality that this life is all there is. Our purpose is what we decide to make it. No one has ever lived forever, and in all likelihood no one ever will. The concept of a religious afterlife is a comforting illusion, a fortified barrier constructed to shield us from the fear of death.

I’ve discovered that I’ve become a better person as a non-believer than I ever was as a believer. There’s a kind of grotesque self-assuredness that comes from believing you have the only true answers to the universe’s most important questions. Such certainty naturally breeds a tendency toward dogmatism in all aspects of life. Regrettably, some of this dogmatic attitude lingered even after I abandoned my faith. Initially, I felt compelled to make my immediate family—especially my wife—understand what I had learned. This approach was unwise and unkind. I have since moved past that phase. My wife and sons are aware of my beliefs and my rejection of what I consider falsehoods. I desire their happiness within their faith as Jehovah’s Witnesses, striving to be the best people they can be. This is particularly important for my wife, who deeply needs and cherishes her beliefs. I know of no other couple who have managed to survive and thrive under similar circumstances, and I am committed to not letting go of that.

Since renouncing my supernatural beliefs, I’ve grown more tolerant of others’ faiths, though I still cannot condone the terrible acts or political agendas that sometimes arise from religious doctrines. However, I remain acutely aware that many people on this “pale blue dot” rely deeply on the hope and peace their faith provides. As long as these beliefs do not result in harm, I see them as fundamentally benign. This perspective allows for a respectful coexistence in our diverse world.

As for what the future holds, I cannot say. If someone had described my current life to me 25 years ago, I would have been incredulous. Yet here I am, leading a life full of wonder and satisfaction. I intend to make the most of each day until the very end—when the sun goes dark on my last day, so will I.

Ten Commandments for Our Time

Toward a Provisional Rational Decalogue

MICHAEL SHERMER

AUG 18, 2023

In my previous Skeptic column, Deconstructing the Decalogue, I offered a personal view on how to think about the Ten Commandments from the perspective of 3,000 years of moral progress since they were first presented in two books of the Old Testament (Exodus 20:1-17 and Deuteronomy 5:4-21). Here I would like to reconstruct them from the perspective of a science- and reason-based moral system, a fuller version of which I developed in my 2015 book The Moral Arc, from which this material is partially excerpted.

Note: This is a purely intellectual exercise. I am not a preacher or teacher of moral values, nor do I hold myself up as some standard-bearer of morality. Since I do not believe in God, nor do I think that there are any rational reasons to believe that morals derive from any source outside of ourselves, I feel the necessity to offer an alternative to religious- and faith-based morality, both descriptively (where do morals come from if not God?) and prescriptively (how should we act if there is no God?), which I have done in 30 years of publishing Skeptic magazine and in a number of my books, including How We Believe (1999), The Science of Good and Evil (2004), and the aforementioned The Moral Arc. Here I am building on the work of secular philosophers and scholars from the ancient Greeks through the Enlightenment and into the modern era where a massive literature exists addressing these deep and important matters.

Galileo Demonstrating the New Astronomical Theories at the University of Padua. Painting by Félix Parra, 1873. Museo Nacional de Arte, Mexico City.

The problem with any religious moral code that is set in stone is just that—it is set in stone. Anything that can never be changed has within its DNA the seeds of its own extinction. A science-based morality has the virtue of having built into it a self-correcting mechanism that does not just allow redaction, correction, and improvement; it insists upon it. Science and reason can be employed to inform—and in some cases even determine—moral values.

Science thrives on change, on improvement, on updating and upgrading its methods and conclusions. So it should be for a science of morality. No one knows for sure what is right and wrong in all circumstances for all people everywhere, so the goal of a science-based morality should be to construct a set of provisional moral precepts that are true for most people in most circumstances most of the time—as assessed by empirical inquiry and rational analysis—but admit exceptions and revisions where appropriate. Indeed, as humanity’s concept of “who and what is human, and entitled to protection” has expanded over the centuries, so we have extended moral protection to categories once thought beneath our notice.

Here are some suggested commandments for our time. Feel free to add your own in the comments section below.

1. The Golden-Rule Principle: Behave toward others as you would desire that they behave toward you.

The golden rule is a derivative of the basic principle of exchange reciprocity and reciprocal altruism, and thus evolved in our Paleolithic ancestors as one of the primary moral sentiments. In this principle there are two moral agents: the moral doer and the moral receiver. A moral question arises when the moral doer is uncertain how the moral receiver will accept and respond to the action in question. In essence, this is what the golden rule tells us to do: by asking yourself, “How would I feel if this were done unto me?” you are asking “How would others feel if I did it unto them?”

2. The Ask-First Principle: To find out whether an action is right or wrong, ask first.

The Golden Rule principle has a limitation: what if the moral receiver thinks differently from the moral doer? What if you would not mind having action X done unto you, but someone else would mind it? Smokers cannot ask themselves how they would feel if other people smoked in a restaurant where they were dining, because they probably wouldn’t mind. It’s the nonsmokers who must be asked how they feel. That is, the moral doer should ask the moral receiver whether the behavior in question is moral or immoral. In other words, the Golden Rule is still about you. But morality is more than just about you, and the Ask-First Principle makes morality about others.

3. The Happiness Principle: It is a higher moral principle to always seek happiness with someone else’s happiness in mind, and never seek happiness when it leads to someone else’s unhappiness through force or fraud.

Humans have a host of moral and immoral passions, including being selfless and selfish, cooperative and competitive, nice and nasty. It is natural and normal to try to increase our own happiness by whatever means available, even if that means being selfish, competitive, and nasty. Fortunately, evolution created both sets of passions, such that by nature we also seek to increase our own happiness by being selfless, cooperative, and nice. Since we have within us both moral and immoral sentiments, along with the capacity to think rationally and to override our baser instincts, and the freedom to choose to do so, the core of morality is choosing to do the right thing by applying the happiness principle. (The modifier “force or fraud” was added to clarify that there are many activities that do not involve morality, such as a sporting contest, in which the goal is not to seek happiness with your opponent’s happiness in mind, but simply to win, fairly of course.)

4. The Liberty Principle: It is a higher moral principle to always seek liberty with someone else’s liberty in mind, and never seek liberty when it leads to someone else’s loss of liberty through force or fraud.

The Liberty Principle is an extrapolation from the fundamental principle of all liberty as practiced in Western society: The freedom to think, believe, and act as we choose so long as our thoughts, beliefs, and actions do not infringe on the equal freedom of others. What makes the Liberty Principle a moral principle is that in addition to asking the moral receiver how he or she might respond to a moral action, and considering how that action might lead to your own and the moral receiver’s happiness or unhappiness, there is an even higher moral level toward which we can strive, and that is the freedom and autonomy of yourself and the moral receiver, or what we shall simply refer to here as liberty. Liberty is the freedom to pursue happiness and the autonomy to make decisions and act on them in order to achieve that happiness.

Only in the last couple of centuries have we witnessed the worldwide spread of liberty as a concept that applies to all peoples everywhere, regardless of their race, religion, rank, or social and political status in the power hierarchy. Liberty has yet to achieve worldwide status, particularly in states dominated by theocracies and autocracies that encourage intolerance and dictate that only some people deserve liberty, but the overall trend since the Enlightenment has been to grant greater liberty, to more people, everywhere. Although there are still setbacks, and periodic violations of liberties disrupt the overall historical flow from less to more liberty for all, the general trajectory of increasing liberty continues, so every time you apply the liberty principle you advance humanity one small step forward.

5. The Fairness Principle: When contemplating a moral action imagine that you do not know if you will be the moral doer or receiver, and when in doubt err on the side of the other person.

This is based on the philosopher John Rawls’ concepts of the “veil of ignorance” and the “original position” in which moral actors are ignorant of their position in society when determining rules and laws that affect everyone, because of the self-serving bias in human decision making. Given a choice, most people who enact moral rules and legislative laws would do so based on their position in society (their gender, race, class, sexual orientation, religion, political party, etc.) in a way that would most benefit themselves and their kin and kind. Not knowing ahead of time how the moral precept or legal law will affect you pushes you to strive for greater fairness for all. A simpler version is in the example of cutting a cake fairly: if I cut the cake you choose which piece you want, and if you cut the cake then I choose which piece I want.
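The cake example is the classic divide-and-choose protocol, whose fairness guarantee is easy to demonstrate: the cutter has an incentive to make the pieces as equal as possible by his or her own valuation, because the chooser picks first. A toy sketch (the slice valuations below are invented):

```python
# Toy divide-and-choose over a cake of discrete slices. The cutter splits the
# cake into two pieces as equal as possible by the cutter's own valuation
# (greedy split, so with discrete slices the balance is approximate);
# the chooser then takes whichever piece the chooser values more.
def divide_and_choose(cutter_values, chooser_values):
    half, running, split = sum(cutter_values) / 2, 0.0, 0
    while running < half:                 # cutter balances the halves
        running += cutter_values[split]
        split += 1
    piece_a = list(range(split))
    piece_b = list(range(split, len(cutter_values)))
    value = lambda piece: sum(chooser_values[i] for i in piece)
    if value(piece_a) >= value(piece_b):  # chooser picks first
        return piece_a, piece_b           # (chooser's piece, cutter's piece)
    return piece_b, piece_a

chooser_piece, cutter_piece = divide_and_choose([3, 1, 2, 2], [1, 1, 4, 2])
print(chooser_piece, cutter_piece)  # [2, 3] [0, 1]
```

Neither party can complain: the cutter considers the two pieces equal, and the chooser took the piece he or she preferred.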

6. The Reason Principle: Try to find rational reasons for your moral actions that are not self-justifications or rationalizations by consulting others first.

Ever since the Enlightenment the study of morality has shifted from considering moral principles as based on God-given, Divinely-inspired, Holy book-derived, Authority-dictated precepts from the top down, to bottom-up, individually-considered, reason-based, rationality-constructed, science-grounded propositions in which one is expected to have reasons for one’s moral actions, especially reasons that consider the other person affected by the moral act. This is an especially difficult moral commandment to carry out because of the all-too-natural propensity to slip from rationality to rationalization, from justification to self-justification, from reason to emotion. As in the second commandment, to “ask first,” whenever possible one should consult others about one’s reasons for a moral action in order to get constructive feedback and to pull oneself out of a moral bubble in which whatever you want to do happens to be the most moral thing to do.

7. The Responsibility and Forgiveness Principle: Take full responsibility for your own moral actions and be prepared to be genuinely sorry and make restitution for your own wrongdoing to others; hold others fully accountable for their moral actions and be open to forgiving moral transgressors who are genuinely sorry and prepared to make restitution for their wrongdoing.

This is another difficult commandment to uphold in both directions. First, there is the “moralization gap” between victims and perpetrators, in which victims almost always perceive themselves as innocent and thus any injustice committed against them must be the result of nothing more than evil on the part of the perpetrator; and in which perpetrators may perceive themselves to have been acting morally in righting a wrong, redressing an immoral act, or defending the honor of oneself or family and friends. The self-serving bias, the hindsight bias, and the confirmation bias practically ensure that we all feel we didn’t do anything wrong, and whatever we did was justified, and thus there is no need to apologize and ask for forgiveness.

As well, the sense of justice and revenge is a deeply evolved moral emotion that serves three primary purposes: (1) righting wrongs committed by transgressors, (2) deterring possible future bad behavior, and (3) signaling to others that should they commit a similar moral transgression, the same moral indignation and revenge awaits them.

8. The Defend Others Principle: Stand up to evil people and moral transgressors, and defend the defenseless when they are victimized.

There are people in the world who will commit moral transgressions against us and our fellow group members. Either through the logic of violence and aggression in which perpetrators of evil always feel justified in their acts, or through such conditions as psychopathy, a non-negligible portion of a population will commit selfish or cruel acts. We must stand up against them.

9. The Expanding Moral Category Principle: Try to consider other people not of your gender, sexual orientation, class, family, tribe, race, religion, or nation as an honorary group member equal to you in moral standing.

We have a moral obligation not only to ourselves, our kin and kind, our family and friends, and our fellow in-group members; we also owe it to those people who are different from us in a variety of ways, who in the past have been discriminated against for no other reason than that they were different in some measurable way. Even though our first moral obligation is to take care of ourselves and our immediate family and friends, it is a higher moral value to consider the moral values of others, and in the long run it is better for yourself, your kin and kind, and your in-group to consider members of other groups to be honorary members of your own group, as long as they so honor you and your group (see #8 above).

10. The Biophilia Principle: Try to contribute to the survival and flourishing of other sentient beings, their ecosystems, and the biosphere as a whole.

Biophilia is the love of nature, of which we are a part. Expanding the moral sphere to include the environments that sustain sentient beings is the loftiest of moral commandments.

If by fiat I had to reduce these Ten Commandments to just one it would be this:

Try to expand the moral sphere and to push the arc of the moral universe just a bit further toward truth, justice, and freedom for more sentient beings in more places more of the time.

Bending the Moral Arc

Honoring Dr. Martin Luther King, Jr. nearly 60 years after his “How Long” speech in Montgomery, Alabama on March 25, 1965

MICHAEL SHERMER

JAN 16, 2023

Sunday, March 21st, 1965. Selma, Alabama.

About 8,000 people gather at Brown Chapel and begin to march from the town of Selma to the city of Montgomery, Alabama. The demonstrators are predominantly African-American and they’re marching on the capitol for one reason. Justice. They want simply to be given the right to vote. But they’re not alone in their struggle. Demonstrators of “every race, religion, and class,” representing almost every state, have come to march with their black brothers and sisters.[1] And at the front of the march is the Reverend Dr. Martin Luther King Jr., Nobel Prize winner, preacher, and civil rights activist leading the march like Moses leading his people out of Egypt.

In the teeth of racial opposition backed by armed police and riot squads, they had tried to march twice before, but both times were met with violence by state troopers and a deputized posse. The first time—known as Bloody Sunday—the marchers were ordered to turn back but refused and, as onlookers cheered, they were met with tear gas, billy clubs, and rubber tubing wrapped in barbed wire. The second time they were again met by a line of state troopers and ordered to turn around, and after asking for permission to pray, King led them back.

But not this time. This time President Lyndon B. Johnson, finally having seen the writing on the wall, ordered that the marchers should be protected by 2,000 National Guard troops and federal marshals. And so they marched. For five days, over a span of 53 miles, through biting cold and frequent rain, they marched. Word spread, the number of demonstrators grew, and by the time they reached the steps of the capitol in Montgomery on March 25, their numbers had swelled to at least 25,000.

But King wasn’t allowed on the steps of the capitol—the marchers weren’t allowed on state property. Sitting in the capitol dome like Pontius Pilate, Alabama Governor George Wallace refused to come out and address the marchers, and Dr. King delivered his speech from a platform constructed on a flatbed truck parked on the street in front of the building.[2] And from that platform, King delivered his stirring anthem to freedom, first recalling how they had marched through “desolate valleys,” rested on “rocky byways,” were scorched by the sun, slept in mud, and were drenched by rains.

The crowd, consisting of freedom-seeking people who had assembled from around the United States listened intently as Dr. King implored them to remain committed to the nonviolent philosophy of civil disobedience, knowing that the patience of oppressed peoples wears thin and that our natural inclination is to hit back when struck. He asked, rhetorically, “How long will prejudice blind the visions of men, darken their understanding, and drive bright-eyed wisdom from her sacred throne?” And “How long will justice be crucified and truth bear it?” In response, Dr. King offered words of counsel, comfort, and assurance, saying that no matter the obstacles it wouldn’t be long before freedom was realized because, he said, quoting religious and biblical tropes, “truth crushed to earth will rise again,” “no lie can live forever,” “you shall reap what you sow,” and “the arc of the moral universe is long, but it bends toward justice.”[3]

It was one of the greatest speeches of Dr. King’s career, and arguably one of the greatest in the history of public oratory. And it worked. Less than five months later, on August 6, 1965, President Johnson signed the Voting Rights Act into law. It was just as Dr. King had said—the arc of the moral universe is long but it bends toward justice. The climactic end of the speech can be seen on YouTube:

Dr. King’s reference—the title inspiration for my 2015 book The Moral Arc—comes from the 19th-century abolitionist preacher Theodore Parker, who penned this piece of moral optimism in 1853, at a time when, if anything, pessimism would have been more appropriate as America was inexorably sliding toward civil war over the very institution Parker sought to abolish:

I do not pretend to understand the moral universe; the arc is a long one, my eye reaches but little ways; I cannot calculate the curve and complete the figure by the experience of sight; I can divine it by conscience. And from what I see I am sure it bends towards justice.[4]

The aim of my book is to show that the Reverends Parker and King were right—that the arc of the moral universe does indeed bend toward justice. In addition to religious conscience and stirring rhetoric, however, we can trace the moral arc through science with data from many different lines of inquiry, all of which demonstrate that in general, as a species, we are becoming increasingly moral. As well, I argue that most of the moral development of the past several centuries has been the result of secular, not religious forces, and that the most important of these that emerged from the Age of Reason and the Enlightenment are science and reason, terms that I use in the broadest sense to mean reasoning through a series of arguments and then confirming that the conclusions are true through empirical verification. (You can order The Moral Arc here.)

Further, I demonstrate that the arc of the moral universe bends not merely toward justice, but toward truth and freedom, and that these positive outcomes have largely been the product of societies moving toward more secular forms of governance and politics, law and jurisprudence, moral reasoning and ethical analysis. Over time it has become less acceptable to argue that my beliefs, morals, and ways of life are better than yours simply because they are mine, or because they are traditional, or because my religion is better than your religion, or because my God is the One True God and yours is not, or because my nation can pound the crap out of your nation. It is no longer acceptable to simply assert your moral beliefs; you have to provide reasons for them, and those reasons had better be grounded in rational arguments and empirical evidence or else they will likely be ignored or rejected.  

Historically, we can look back and see that we have been steadily—albeit at times haltingly—expanding the moral sphere to include more members of our species (and now even other species) as legitimate participants in the moral community. The burgeoning conscience of humanity has grown to the point where we no longer consider the wellbeing only of our family, extended family, and local community; rather, our consideration now extends to people quite unlike ourselves, with whom we gladly trade goods and ideas and exchange sentiments and genes, rather than beating, enslaving, raping, or killing them (as our sorry species was wont to do with reckless abandon not so long ago). Nailing down the cause-and-effect relationship between human action and moral progress—that is, determining why it’s happened—is the other primary theme of this book, with the implied application of what we can do to adjust the variables in the equation to continue expanding the moral sphere and push our civilization further along the moral arc. Improvements in the domain of morality are evident in many areas of life:

  • governance (the rise of liberal democracies and the decline of theocracies and autocracies)
  • economics (broader property rights and the freedom to trade goods and services with others without oppressive restrictions)
  • rights (to life, liberty, property, marriage, reproduction, voting, speech, worship, assembly, protest, autonomy, and the pursuit of happiness)
  • prosperity (the explosion of wealth and increasing affluence for more people in more places; and the decline of poverty worldwide in which a smaller percentage of the world’s people are impoverished than at any time in history)
  • health and longevity (more people in more places more of the time live longer healthier lives than at any time in the past)
  • war (a smaller percentage of people die as a result of violent conflict today than at any time since our species began)
  • slavery (outlawed everywhere in the world and practiced in only a few places in the form of sexual slavery and slave labor that are now being targeted for total abolition)
  • homicide (rates have fallen precipitously from over 100 murders per 100,000 people in the Middle Ages to less than 1 per 100,000 today in the Industrial West, and the chances of an individual dying violently are the lowest they have ever been in history)
  • rape and sexual assault (trending downward and, while still too prevalent, outlawed by all Western states and increasingly prosecuted)
  • judicial restraint (torture and the death penalty have been almost universally outlawed by states, and where they remain legal they are less frequently practiced)
  • judicial equality (citizens of nations are treated more equally under the law than any time in the past)
  • civility (people are kinder, more civilized, and less violent to one another than ever before).

In short, we are living in the most moral period in our species’ history.

I do not go so far as to argue that these favorable developments are inevitable or the result of an inexorable unfolding of a moral law of the universe—this is not an “end of history” argument—but there are identifiable causal relationships between social, political, and economic factors and moral outcomes. As Steven Pinker wrote in The Better Angels of Our Nature, a work of breathtaking erudition that was one of the inspirations for my book:

Man’s inhumanity to man has long been a subject for moralization. With the knowledge that something has driven it down we can also treat it as a matter of cause and effect. Instead of asking “Why is there war?” we might ask “Why is there peace?” We can obsess not just over what we have been doing wrong but also what we have been doing right. Because we have been doing something right and it would be good to know what exactly it is.[5]

For tens of millennia moral regress best described our species, and hundreds of millions of people suffered as a result. But then something happened half a millennium ago. The Scientific Revolution led to the Age of Reason and the Enlightenment, and that changed everything. We ought, then, to understand what happened, and how and why these changes reversed our species’ historical trend downward, so that we can do more to elevate humanity, extend the arc, and bend it ever upward.

* * *

During the years I spent researching and writing The Moral Arc, when I told people that the subject was moral progress, to describe the responses I received as incredulous would be an understatement; most people thought I was hallucinatory. A quick rundown of the week’s bad news would seem to confirm the diagnosis.

The reaction is understandable because our brains evolved to notice and remember immediate and emotionally salient events, short-term trends, and personal anecdotes. And our sense of time ranges from the psychological “now” of three seconds to the few decades of a human lifetime, which is far too short to track long-term incremental trends unfolding over centuries and millennia, such as evolution, climate change, and—to my thesis—moral progress. If you only ever watched the evening news you would soon have ample evidence that the antithesis of my thesis is true—that things are bad and getting worse. News agencies are tasked with reporting the bad news; the ten thousand acts of kindness that happen every day go unreported. But one act of violence—a mass public shooting, a violent murder, a terrorist suicide bombing—is covered in excruciating detail with reporters on the scene, exclusive interviews with eyewitnesses, long shots of ambulances and police squad cars, and the thwap thwap thwap of news choppers overhead providing an aerial perspective on the mayhem. Rarely do news anchors remind their viewers that school shootings are still incredibly rare, that crime rates are hovering around an all-time low, and that acts of terror almost always fail to achieve their objectives and their death tolls are negligible compared to other forms of death.

News agencies also report what happens, not what doesn’t happen—we will never see a headline that reads…

ANOTHER YEAR WITHOUT NUCLEAR WAR

This too is a sign of moral progress in that such negative news is still so uncommon that it is worth reporting. Were school shootings, murders, and terrorist attacks as commonplace as charity events, peacekeeping missions, and disease cures, our species would not be long for this world.

As well, not everyone shares my sanguine view of science and reason, which has found itself in recent decades under attack on many fronts: right-wing ideologues who do not understand science; religious-right conservatives who fear science; left-wing postmodernists who do not trust science when it doesn’t support progressive tenets about human nature; extreme environmentalists who want to return to a pre-scientific and pre-industrial agrarian society; anti-vaxxers who wrongly imagine that vaccinations cause autism and other maladies; anti-GMO (genetically modified food) activists who worry about Frankenfoods; and educators of all stripes who cannot articulate why Science, Technology, Engineering, and Math (STEM) are so vital to a modern democratic nation.

Evidence-based reasoning is the hallmark of science today. It embodies the principles of objective data, theoretical explanation, experimental methodology, peer review, public transparency and open criticism, and trial and error as the most reliable means of determining who is right—not only about the natural world, but about the social and moral worlds as well. In this sense many apparently immoral beliefs are actually factual errors based on incorrect causal theories. Today we hold that it is immoral to burn women as witches, but the reason our European ancestors in the Middle Ages strapped women on a pyre and torched them was because they believed that witches caused crop failures, weather anomalies, diseases, and various other maladies and misfortunes. Now that we have a scientific understanding of agriculture, climate, disease, and other causal vectors—including the role of chance—the witch theory of causality has fallen into disuse; what was a seemingly moral matter was actually a factual mistake.

This conflation of facts and values explains a lot about our history, in which it was once (erroneously) believed that gods need animal and human sacrifices, that demons possess people and cause them to act crazy, that Jews cause plagues and poison wells, that African blacks are better off as slaves, that some races are inferior or superior to other races, that women want to be controlled or dominated by men, that animals are automata and feel no pain, that Kings rule by divine right, and other beliefs no rational scientifically-literate person today would hold, much less proffer as a viable idea to be taken seriously. The Enlightenment philosopher Voltaire explicated the problem succinctly: “Those who can make you believe absurdities, can make you commit atrocities.”[6] 

Thus, one path (among many) to a more moral world is to get people to quit believing in absurdities. Science and reason are the best methods for doing that. As a methodology, science has no parallel; it is the ultimate means by which we can understand how the world works, including the moral world. Thus, employing science to determine the conditions that best expand the moral sphere is itself a moral act. The experimental methods and analytical reasoning of science—when applied to the social world toward an end of solving social problems and the betterment of humanity in a civilized state—created the modern world of liberal democracies, civil rights and civil liberties, equal justice under the law, open political and economic borders, free markets and free minds, and prosperity the likes of which no human society in history has ever enjoyed. More people in more places more of the time have more rights, freedoms, liberties, literacy, education, and prosperity than at any time in the past. We have many social and moral problems left to solve, to be sure, and the direction of the arc will hopefully continue upwards long after our epoch so we are by no means at the apex, but there is much evidence for progress and many good reasons for optimism.

* * *

Three years after Dr. King’s “How Long” speech, on April 3, 1968, the civil rights crusader delivered his final speech, I’ve Been to the Mountaintop, in Memphis, Tennessee in which he exhorted his followers to work together to make America the nation its founding documents decreed it would be, foreseeing that he might not live to see the dream realized. “I’ve seen the Promised Land. I may not get there with you,” he hinted ominously. “But I want you to know tonight, that we, as a people, will get to the promised land!” The next day Dr. King was assassinated.

It is to his legacy, and the legacies of all champions of truth, justice, and freedom throughout history, that we owe our allegiance and our efforts at making the world a better place. “Each of us is two selves,” Dr. King wrote. “The great burden of life is to always try to keep that higher self in command. And every time that old lower self acts up and tells us to do wrong, let us allow that higher self to tell us that we were made for the stars, created for the everlasting, born for eternity.”

We are, in fact, made from the stars. Our atoms were forged in the interior of ancient stars that ended their lives in spectacular paroxysms of supernova explosions that dispersed those atoms into space where they coalesced into new solar systems with planets, life, and sentient beings capable of such sublime knowledge and moral wisdom. “We are stardust, we are golden, we are billion-year old carbon…” (from the lyrics of “Woodstock”, by Joni Mitchell).

Morality is something that carbon atoms can embody given a billion years of evolution—the moral arc.

###

Michael Shermer is the Publisher of Skeptic magazine, the host of The Michael Shermer Show, and a Presidential Fellow at Chapman University. His many books include Why People Believe Weird Things, The Science of Good and Evil, The Believing Brain, The Moral Arc, and Heavens on Earth. His new book is Conspiracy: Why the Rational Believe the Irrational. You can order The Moral Arc here.

References


[1] King, Coretta Scott. 1969. My Life With Martin Luther King Jr. New York: Holt, Rinehart, and Winston. 267.

[2] Many accounts describe King as being either at the top of the capitol steps, on the steps, or at the bottom of the steps. There are eyewitness accounts in which it is claimed that King delivered his famous speech from the steps. For example, John N. Pawelek recalls: “When we arrived at the state capitol, the area was filled with throngs of marchers. Martin Luther King was on the steps. He gave a fiery speech which only a Baptist minister can give.” The Alabama Byways site tells its patrons reliving the Selma to Montgomery march to “walk on the steps of the capitol, where King delivered his ‘How Long, Not Long’ speech to a crowd of nearly 30,000 people.” In his book Getting Better: Television and Moral Progress (Transaction Publishers, 1991, p. 48), Henry J. Perkinson writes: “By Thursday, the marchers, who now had swelled to twenty-five thousand, reached Montgomery, where the national networks provided live coverage as Martin Luther King strode up the capital [sic] steps with many of the movement’s heroes alongside. From the top of the steps, King delivered a stunning address to the nation.” Even the Martin Luther King Encyclopedia puts him “on the steps.”

            This is incorrect. The BBC reports of the day, for example, say that King “has taken a crowd of nearly 25,000 people to the steps of the state capital” but was stopped from climbing the steps and so “addressed the protesters from a podium in the square.” The New York Times reports that “The Alabama Freedom March from Selma to Montgomery ended shortly after noon at the foot of the Capitol steps” and that “the rally never got on to state property. It was confined to the street in front of the steps.” The original caption to the aerial photograph included in the text, from an educational online source, reads: “King was not allowed to speak from the steps of the Capitol. Can you find the line of state troopers that blocked the way?” This is confirmed by these firsthand accounts: “A few state employees stood on the steps. They watched a construction crew building a speaker’s platform on a truck bed in the street.” And: “The speakers platform is a flatbed truck equipped with microphones and loudspeakers. The rally begins with songs by Odetta, Oscar Brand, Joan Baez, Len Chandler, Peter, Paul & Mary, and Leon Bibb. From his truck-bed podium, King can clearly see Dexter Avenue Baptist Church.”

[3] The speech is commonly known as the “How Long, Not Long” speech (or sometimes “Our God is Marching On”) and is considered one of King’s three most important and impactful speeches, along with “I Have a Dream” and the tragically prescient “I’ve Been to the Mountaintop.” It can be read in its entirety at http://mlk-kpp01.stanford.edu/index.php/kingpapers/article/our_god_is_marching_on/

[4] Parker, Theodore. 1852/2005. Ten Sermons of Religion. Sermon III: Of Justice and Conscience. Ann Arbor: University of Michigan Library.

[5] Pinker, Steven. 2011. The Better Angels of Our Nature: Why Violence Has Declined. New York: Viking, xxvi.

[6] Voltaire. 1765. “Question of Miracles.” In Miracles and Idolatry. Penguin.

Frequent Infrequencies

Here’s the link to this article.

May 12, 2018

Do anomalies prove the existence of God?

This op-ed was originally published on Slate.com as part of a Big Ideas series on the question “What is the Future of Religion” in 2015.

For a quarter century I have investigated and attempted to explain anomalous events that people report experiencing, and I have written about a few of my own, such as being abducted by aliens (caused by extreme fatigue and sleep deprivation), hallucinating inside a sensory deprivation tank, and having an out-of-body experience while my temporal lobes were stimulated with electromagnetic fields. Most people interpret such experiences as evidence for the supernatural, the afterlife, or even God, but since mine all had clear and obvious natural explanations, few readers took them to be evidentiary.

In my October 2014 column in Scientific American, entitled “Infrequencies,” however, I wrote about an anomalous experience for which I have no explanation. In brief, my fiancée, Jennifer Graf, moved to Southern California from Köln, Germany, bringing with her a 1978 Philips 070 transistor radio that had belonged to her late grandfather Walter, a surrogate father figure, as she was raised by a single mom. She had fond memories of listening to music with him through that radio, so I did my best to resurrect it, without success. With new batteries and the power switch left in the “on” position, we gave up and tossed it in a desk drawer, where it lay dormant for months. During a quiet moment after our vows at a small wedding ceremony at our home, Jennifer was feeling sad being so far from home and wishing she had some connection to loved ones—most notably her mother and her grandfather—with whom to share this special occasion. We left my family to find a quiet moment alone elsewhere in the house when we heard music emanating from the bedroom, which turned out to be a love song playing on that radio in the desk drawer. It was a spine-tingling experience. The radio played for the rest of the evening but went quiescent the next day. It has been silent ever since, despite repeated attempts to revive it.

Ever since the column appeared in Scientific American I’ve been deluged with letters. A few grumpy skeptics chided me for lowering my skeptical shields, most notably for my closing line: “And if we are to take seriously the scientific credo to keep an open mind and remain agnostic when the evidence is indecisive or the riddle unsolved, we should not shut the doors of perception when they may be opened to us to marvel in the mysterious.” I was simply trying to be a little poetic in my interpretation, which I qualified by noting “The emotional interpretations of such anomalous events grant them significance regardless of their causal account.”

A few cranky believers were dismissive of my openness, one insisting “that no human being, nor any living thing, is only their body. Also, no inanimate object is only that object. The dead do not die, and the living are not free but bound and enslaved each to his or her own ignorance—a condition which you work to maintain. Shame on you, sir.” Above her signature she signed off: “With kind intentions.”

Friendlier believers sent encouraging notes, not all of which I understand, such as this sentiment from a psychologist: “The central importance of latent, neglected shared spiritual capabilities was indeed a wedding blessing, eloquently and vividly enacted, resulting in very valuable sharing for a world culture remarkably crippled in appreciation of actual multidimensional reality.” Does 3D count? A neurophysiologist imagined what the implications would be if no natural explanation were forthcoming for my anomalous event. “Should consciousness survive the death of the brain, there are exciting implications for the role of consciousness in the living brain.” Indeed there are, but a lack of causal explanation for my story does not imply this.

A geologist wrote to suggest that “There are many explanations that can be posited; I would favor solar flares or the geoparticles of Holub and Smrz [authors of a paper that some claim proves that nanoparticles between neurons may allow for quantum fields to influence other brains], but rather than seek one, this coincidental occurrence should be enjoyed in the supernatural or paranormal vein as it was meant to be…simply a blessing for a long and happy union.” I agree, but without the supernatural or paranormal vein in the rock.

Another correspondent said he would be convinced of the miraculous nature of the event if the radio played for the next 20 years with no power source. That would impress me too, and maybe Elon Musk is working on such technology for his next generation of Tesla cars.

Most of the correspondence I received, however, was from people recounting their own anomalous experiences that had deep personal meaning for them, some pages long in rich detail. One woman told me the story of her rare blue opal pendant that she wore 24/7 for 15 years, until her ex-husband swiped it out of spite during their divorce. (So I guess this would be a case of negative emotions influencing events at a distance.) She felt so bad that while on vacation in Bali she had a jeweler create a simulacrum of it, which led to a successful jewelry business. One day 15 years later, a woman named Lucy came into her store and they got to talking about the lost opal pendant, which Lucy suddenly realized that she now owned. “In 1990 her best friend was dating a guy who was going through a divorce and he had given it to her. Her friend never felt comfortable wearing it so she offered it to Lucy. Lucy accepted, and wore it the following weekend on her wedding day. Soon after, she discovered her new husband had a girlfriend, and she never wore the opal again, thinking it might be bad luck. It remained in her drawer for 15 years. When I asked why she hadn’t sold it (it was now extremely valuable), she said ‘I tried to—every time I went to get it out of the drawer to have it appraised, something happened to distract me. Phone calls, dogs fighting, package deliveries—I tried many times, but never succeeded. Now I know why—it wanted to come back to you!’” This woman’s sister, whom she characterized as a “medical intuitive and remote healer,” called this story “Epic Synchronicity.” She described it as “fantastic and statistically improbable, but it is explainable.”

I agree, but what is the explanation for this, or for any such highly improbable events? And what do they mean? For Jennifer and me, it was the propitious timing of the radio’s revival—at the moment she was thinking about family—that made it such an emotionally salient event, enabling her to feel as if her beloved grandfather was there with us, sharing in our commitment. Is it proof of life after death? No. As I wrote (and many readers apparently chose to overlook) in Scientific American, “such anecdotes do not constitute scientific evidence that the dead survive or that they can communicate with us via electronic equipment.”

The reason is that in science it isn’t enough to just compile anecdotes in support of a preferred belief. After all, who wouldn’t want to know that we survive bodily death and live for eternity elsewhere? We are all subject to the confirmation bias in which we look for and find confirming evidence and ignore disconfirming evidence. We remember one-off highly unusual coincidences that have deep meaning for us, and forget all the countless meaningless coincidences that flow past our senses every day. Then there is the law of large numbers: with seven billion people having, say, 10 experiences a day of any kind, even million-to-one odds will happen 70,000 times a day. It would be a miracle if at least a few of those events did not get remembered, recounted, reported, and recorded somewhere, leaving us with a legacy of frequent infrequencies. Add to this the hindsight bias, in which we are impressed by the improbability of an event after-the-fact, but in science we should only be impressed by events whose occurrence was predicted in advance. And don’t forget the recall bias, in which we remember things that happened differently depending on what we now believe, retrieving from memory circumstances that favor the preferred interpretation of the event in question. Then there is the matter of what didn’t happen that would have been equally spine-tingling in emotional impact on that day, or some other important day, and in my case I can’t think of any because they didn’t happen. Finally, just because I can’t explain something doesn’t mean it is inexplicable by natural means. The argument from personal incredulity doesn’t hold water on the skeptical seas.
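To make the law-of-large-numbers arithmetic explicit, here is a minimal sketch in Python, assuming only the illustrative figures quoted in the paragraph above (seven billion people, ten experiences a day, million-to-one odds); the inputs are rhetorical, not measured data:

```python
# Law-of-large-numbers check, using only the essay's illustrative figures.
population = 7_000_000_000       # "seven billion people"
experiences_per_day = 10         # "say, 10 experiences a day of any kind"
odds = 1 / 1_000_000             # "million-to-one odds"

events_per_day = population * experiences_per_day  # 70 billion experiences/day
expected_rare = events_per_day * odds              # expected rare coincidences/day

print(f"{expected_rare:,.0f} million-to-one events per day")  # -> 70,000
```

Even shrinking the inputs by an order of magnitude or two still leaves hundreds of million-to-one coincidences every day, which is the heart of the argument.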

As for plausible explanations, one correspondent suggested “that the on-off switch contacts were probably heavily oxidized and that the radio itself was turned on and then stay, as you have inserted the new batteries. By heating and cooling and vibration or small metal parts in a typical 1970s transistor suddenly corrode and make contact. The timing of this process…well, that is just simply remarkable.” A physicist and engineer from Athens, Greece, thought perhaps after my “percussive” technique of smacking the radio on a hard surface, “A critical capacitor at the flow of the current, maybe at the power stage, or at the receiving stage, or at the final amplifier’s stage may had been left in a just quasi-stable soldering state and by the aid of the ambient EM fields may had reach a charging state (leave an empty capacitor for some days out in the yard and you’ll get it almost fully charged) that by the presence of the supply voltage at the soldering spot could have bridged the possible gap of the old or disturbed soldering contact and then sustained this conduction for some hours until by a simple sock may had fully discharged.”

I’m not sure what this means, exactly, because my attempts to resuscitate the radio happened months before, but I can well imagine some electrical glitch, a particle of dust, an EM (electromagnetic) fluctuation from the batteries—something in the natural world—caused the radio to come to life. Why it would happen at that particular moment, and be perfectly tuned to a station playing love songs, and be loud enough to hear out of the desk drawer, is what made the event stand out for us. Which reminds me of an account I read of witchcraft and magic among the Azande, a traditional society in the Southern Sudan in Africa, by the anthropologist E. E. Evans-Pritchard. He explained that the Zande employ natural causes when they are readily available. When an old granary collapses, for example, the Zande understand that termites are the probable cause. But when the granary crumples with people inside who are thereby injured, the Zande wonder, in Evans-Pritchard’s words, “why should these particular people have been sitting under this particular granary at the particular moment when it collapsed? That it should collapse is easily intelligible, but why should it have collapsed at the particular moment when these particular people were sitting beneath it?” That timing is explained by magic.

Deepak Chopra suggested something similar to us when he wrote: “The radio coming on and off almost certainly has a mechanical explanation (a change in humidity, a speck of dust falling off a rusty wire, etc.). What is uncanny is the timing and emotional significance to those participating in the experience. The two of you falling in love is part of the synchronicity!” The Azande magical explanation is not too dissimilar to Deepak’s synchronicity, which he enumerated thusly: “(1) Synchronicity is a conspiracy of improbabilities (the events break the boundaries of statistical probability). (2) The improbable events conspiring to create the synchronistic event are acausally related to each other. (3) Synchronistic events are orchestrated in the non-local domain. … (9) Synchronistic events are messages from our non-local self and are clues to the essential unity of our inner world of thoughts, feelings, memories, fantasies, desires, and intentions, and our outer world of space-time events.” From this, and my many debates with Deepak, I take him to mean that consciousness exists separately from substance and can interact with it, the interactions governed by strong emotions like love, which can apparently act across space and time to cause effects meaningful to associated participants.

A psychologist named Michael Jawer would seem to agree in his explanation to me “that strong and underlying feelings are central to anomalous happenings.” His approach “doesn’t rely on barely-understood quantum woo,” he cautioned, “but assesses the way feelings work within our biology and physiology and the way emotions knit human beings together.” That certainly sounds reasonable, although how emotional energy could be transmitted from inside a body (or from the other side) into, say, a radio, is not clear. But I appreciated the close of his letter in which he quoted the late physicist John Wheeler: “In any field, find the strangest thing and then explore it.” 

That is precisely what the eminent Caltech physicist Kip Thorne did in the blockbuster film Interstellar, for which he was the scientific consultant. In order to save humanity from imminent extinction Matthew McConaughey’s character has to find a suitable planet by passing through a wormhole to another galaxy. In order to return, however, he must slingshot around a black hole, thereby causing a massive time dilation relative to his daughter back home on Earth (one hour near the black hole equals seven years on Earth), such that by the time he returns she is much older than he. In the interim, in order to get the humans off Earth he needs to transmit information to his now adult scientist daughter on quantum fluctuations from the singularity inside of the black hole. To do so he uses an extra-dimensional “tesseract” in which time appears as a spatial dimension that includes portals into the daughter’s childhood bedroom at a moment when (earlier in the film) she thought she experienced ghosts and poltergeists, which turned out to be her father from the future reaching back in time through extra-dimensions via gravitational waves (which he uses to send the critical data via Morse code dots and dashes on the second hand of the watch he left her). It’s a farfetched plot, but according to Thorne in his companion book to the film, it’s all grounded in natural law and forces.

This is another way of saying—as I have often—that there is no such thing as the supernatural or the paranormal. There is just the natural and the normal and mysteries we have yet to solve with natural and normal explanations. If it turns out, say, that Walter exists in a 5th dimensional tesseract and is using gravitational waves to turn on his old radio for his granddaughter, that would be fully explicable by physical laws and forces as we understand them. It would not be ESP or Psi or anything of the paranormal or supernatural sort; it would just be a deeper understanding of physics.

The same applies to God. As I’ve also said (in what I facetiously call Shermer’s Last Law), “any sufficiently advanced extra-terrestrial intelligence is indistinguishable from God.” By this I mean that if we ever did encounter an ETI, the chances are that they would be vastly ahead of us on a technological time scale, given the odds against another intelligent species evolving at precisely the same rate as us on another planet. At the rate of change today we have advanced more in the past century than in all previous centuries combined. Think of the progress in computing that has been made in just the last 50 years, and then imagine where we will be in, say, 50,000 years or 50 million years, to get some sense of just how far advanced an ETI could be. The intelligent beings who created the wormhole in Kip Thorne’s fictional universe would almost assuredly seem to us as Gods if we did not understand the science and technologies they used. Imagine an ETI millions of years more advanced than us who could engineer the creation of planets and stars by manipulating clouds of interstellar gas, or even create new universes out of collapsing black holes. If that’s not God-like I don’t know what is, but it’s just advanced science and technology and nothing more.

Until such time as science can explain even the most spectacularly unlikely events, what should we do with such stories? Enjoy them. Appreciate their emotional significance. But we do not need to fill in the explanatory gaps with gods or any such preternatural forces. We can’t explain everything, and it’s always okay to say “I don’t know” and leave it at that until a natural explanation presents itself. Until then, revel in the mystery and drink in the unknown. It is where science and wonder meet.

The Non-Magisterium of Religion

Here’s the link to this article.

Why Faith Is Not a Reliable Method for Determining Moral Values

MICHAEL SHERMER

DEC 7, 2022

In my previous Skeptic column I acknowledged the magisterium of religion, noting the power of faith in a pre-modern world lit only by fire and plagued by poverty, disease, misery, and early death. To this I would add that it was Jesus who said to help the poor, to turn the other cheek, to love thine enemies, to judge not lest ye be judged, to forgive sinners, and to give people a second chance. Many modern Christian conservatives seem to have forgotten this message.

In the name of their religion, people have helped the poor and needy in developed nations around the world, and in America they are the leading supporters of food banks for the hungry and post-disaster relief. Many Christian theologians, along with Christian churches and preachers, advocated the abolition of the slave trade, and continued to press for justice in modern times. Some civil rights leaders were motivated by their religion, most notably the Reverend Martin Luther King, Jr., whose speeches were filled with passionate religious tropes and quotes. I have deeply religious friends who are highly driven to do good and, though they may have a complex variety of motives, they often act in the name of their particular religion.

So religion can and does motivate people to do good works, and we should always acknowledge any person or institution that pushes humanity further along the path of progress, expands the moral sphere, or even just makes the life of one other person a little easier. To that end we would do well to emulate the ecumenicalism of the late astronomer Carl Sagan, who appealed to all religious faiths to join scientists in working to preserve the environment and to end the nuclear arms race. He did so because, he said, we are all in this together; our problems are “transnational, transgenerational and transideological. So are all conceivable solutions. To escape these traps requires a perspective that embraces the peoples of the planet and all the generations yet to come.”

That stirring rhetoric urges all of us—secularists and believers—to work together toward the common goal of making the world a better place.

But as I document in my 2015 book The Moral Arc, for too long the scales of morality have been weighed down by the religious thumb pressing on the side of the scale marked “Good”. Religion has also promoted, or justified, such catastrophic moral blunders as the Crusades (the People’s Crusade, the Northern Crusade, the Albigensian Crusade, and Crusades One through Nine); the Inquisitions (Spanish, Portuguese, and Roman); witch hunts (a product, in part, of the Inquisitions that ran from the Middle Ages through the Early Modern Period and executed tens of thousands of people, mostly women); Christian conquistadors who exterminated native peoples by the millions through their guns, germs, and steel; the endless European Wars of Religion (the Nine Years War, the Thirty Years War, the Eighty Years War, the French Wars of Religion, the Wars of the Three Kingdoms, the English Civil War, to name just a few); the American Civil War, in which Northern Christians and Southern Christians slaughtered one another over the issue of slavery and states’ rights; and the First World War, in which German Christians fought French, British, and American Christians, all of whom believed that God was on their side. And that’s just in the Western world. There are the seemingly endless religious conflicts in India, Indonesia, Afghanistan, Pakistan, Iraq, Sudan, and numerous countries in Africa; the persecution of Coptic Christians in Egypt; and of course Islamist terrorism, which has been a scourge on societal peace and security in recent decades; hardly a day goes by without some act of violence committed in the name of Islam.

All of these events have political, economic, and social causes, but the underlying justification they share is religion.

Once moral progress in a particular area is underway, most religions eventually get on board—as in the abolition of slavery in the 19th century, women’s rights in the 20th century, and gay rights in the 21st century—but this often happens after a shamefully protracted lag time. Why? There are three reasons for the sclerotic nature of religion:

(1) The foundation of the belief in an absolute morality is the belief in an absolute religion grounded in the One True God. This inexorably leads to the conclusion that anyone who believes differently has departed from this truth and thus is unprotected by our moral obligations.

(2) Unlike science, religion has no systematic process and no empirical method to employ to determine the verisimilitude of its claims and beliefs, much less right and wrong.

(3) The morality of holy books—most notably the Bible—is not the morality any of us would wish to live by, and thus it is not possible for the religious doctrines derived from holy books to be the catalyst for moral evolution.

The Bible, in fact, is one of the most immoral works in all literature. Woven throughout begats and chronicles, laws and customs, is a narrative of accounts written by, and about, a bunch of Middle Eastern tribal warlords who constantly fight over land and women, with the victors taking dominion over both. It features a jealous and vengeful God named Yahweh who decides to punish women for all eternity with the often intolerable pain of childbirth, and further condemns them to be little more than beasts of burden and sex slaves for the victorious warlords.

Why were women to be chastened this way? Why did they deserve an eternity of misery and submission? It was all for that one terrible sin, the first crime ever recorded in the history of humanity—a thought crime no less—when that audacious autodidact Eve dared to educate herself by partaking of the fruit of the tree of the knowledge of good and evil. Worse, she inveigled the first man—the unsuspecting Adam—to join her in choosing knowledge over ignorance. For the appalling crime of hearkening unto the voice of his wife, Yahweh condemned Adam to toil in thorn and thistle-infested fields, and further condemned him to death, to return to the dust from whence he came.

Yahweh then cast his first two delinquent children out of paradise, setting Cherubim and a flaming sword at the entrance to be certain that they could never return. Then, in one of the many foul moods he was wont to fall into, Yahweh committed an epic hemoclysm of genocidal proportions by killing every sentient being on Earth—including unsuspecting adults, innocent children, and all the land animals—in a massive flood. In order to repopulate the planet after he had rid it of all life save those spared in the ark, Yahweh commanded the survivors—numerous times—to “be fruitful and multiply,” and rewarded his favorite warlords with as many wives as they desired. Thus was born the practice of polygamy and the keeping of harems, fully embraced and endorsed—along with slavery—in the so-called “good book.”

As an exercise in moral casuistry, and applying the principle of interchangeable perspectives, this question comes to mind: did anyone ask the women how they felt about this arrangement? What about the millions of people living in other parts of the world who had never heard of Yahweh? What about the animals and the innocent children who drowned in the flood? What did they do to deserve such a final solution to Yahweh’s anger problem?

Many Christians say that they get their morality from the Bible, but this cannot be true because, as holy books go, the Bible is possibly the most unhelpful guide ever written for determining right from wrong. It’s chock-full of bizarre stories about dysfunctional families, advice about how to beat your slaves, how to kill your headstrong kids, how to sell your virgin daughters, and other clearly outdated practices that most cultures gave up centuries ago.

Consider the morality of the biblical warlords who had no qualms about taking multiple wives, adultery, keeping concubines, and fathering countless children from their many polygamous arrangements. The anthropologist Laura Betzig has put these stories into an evolutionary context in noting that Darwin predicted that successful competition leads to successful reproduction. She analyzed the Old Testament and found no fewer than 41 named polygamists, not one of whom was a powerless man. “In the Old Testament, powerful men—patriarchs, judges, and kings—have sex with more wives; they have more sex with other men’s women; they have sex with more concubines, servants, and slaves; and they father many children.” And not just the big names. According to Betzig’s analysis, “men with bigger herds of sheep and goats tend to have sex with more women, then to father more children.” Most of the polygynous patriarchs, judges, and kings had 2, 3, or 4 wives with a corresponding number of children, although King David had more than 8 wives and 20 children, King Abijah had 14 wives and 38 children, and King Rehoboam had 18 wives (and 60 other women) who bore him no fewer than 88 offspring. But they were all lightweights compared to King Solomon, who married at least 700 women, and for good measure added 300 concubines, which he called “man’s delight.” (What Solomon’s concubines called him was never recorded.)

Although many of these stories are fiction (there is no evidence, for example, that Moses ever existed, much less led his people for 40 years in the desert leaving behind not a single archaeological artifact), what these biblical patriarchs purportedly did to women was, in fact, how most men treated women at that time, and that’s the point. Put into context, the Bible’s moral prescriptions were for another time for another people and have little relevance for us today.


In order to make the Bible relevant, believers must pick and choose biblical passages that suit their needs; thus the game of cherry picking from the Bible generally works to the advantage of the pickers. In the Old Testament, the believer might find guidance in Deuteronomy 5:17, which says, explicitly, “Thou shalt not kill”; or in Exodus 22:21, a verse that delivers a straightforward and indisputable prohibition: “You shall not wrong a stranger or oppress him, for you were strangers in the land of Egypt.”

These verses seem to set a high moral bar, but the handful of positive moral commands in the Old Testament are desultory and scattered among a sea of violent stories of murder, rape, torture, slavery, and all manner of violence, such as occurs in Deuteronomy 20:10-18, in which Yahweh instructs the Israelites on the precise etiquette of conquering another tribe:

When you draw near to a city to fight against it, offer terms of peace to it. And if its answer to you is peace and it opens to you, then all the people who are found in it shall do forced labor for you and shall serve you. But if it makes no peace with you, but makes war against you, then you shall besiege it; and when the LORD your God gives it into your hand you shall put all its males to the sword, but the women and the little ones, the cattle, and everything else in the city, all its spoil, you shall take as booty for yourselves…. But in the cities of these peoples that the LORD your God gives you for an inheritance you shall save alive nothing that breathes, but you shall utterly destroy them, the Hittites and the Amorites, the Canaanites and the Perizzites, the Hivites and the Jebusites, as the LORD your God has commanded.

Today, as the death penalty fades into history, Yahweh offers this list of actions punishable by death:

            • Blaspheming or cursing the Lord: “And he that blasphemeth the name of the Lord, he shall surely be put to death, and all the congregation shall certainly stone him: as well the stranger, as he that is born in the land, when he blasphemeth the name of the Lord, shall be put to death.” (Leviticus 24:13-16)

            • Worshiping another god: “He that sacrificeth unto any god, save unto the Lord only, he shall be utterly destroyed.” (Exodus 22:20)

            • Witchcraft and wizardry: “Thou shalt not suffer a witch to live.” (Exodus 22:18) “A man also or woman that hath a familiar spirit, or that is a wizard, shall surely be put to death: they shall stone them with stones: their blood shall be upon them.” (Leviticus 20:27)

            • Female loss of virginity before marriage: “If any man take a wife [and find] her not a maid … Then they shall bring out the damsel to the door of her father’s house, and the men of her city shall stone her with stones that she die.” (Deuteronomy 22:13-21)

            • Homosexuality: “If a man also lie with mankind, as he lieth with a woman, both of them have committed an abomination: they shall surely be put to death; their blood shall be upon them.” (Leviticus 20:13)

            • Working on the Sabbath: “Six days shall work be done, but on the seventh day there shall be to you an holy day, a sabbath of rest to the Lord: whosoever doeth work therein shall be put to death.” (Exodus 35:2)

The book considered by over two billion people to be the greatest moral guide ever produced—inspired as it was by an all-knowing, totally benevolent deity—recommends the death penalty for saying the Lord’s name at the wrong moment or in the wrong context, for imaginary crimes like witchcraft, for commonplace sexual relations (adultery, fornication, homosexuality), and for the especially heinous crime of not resting on the Sabbath. How many of today’s two billion Christians agree with their own holy book on the application of capital punishment?

And how many would agree with this gem of moral turpitude from Deuteronomy 22:28-29: “If a man meets a virgin who is not engaged, and seizes her and lies with her, and they are caught in the act, the man who lay with her shall give fifty shekels of silver to the young woman’s father, and she shall become his wife. Because he violated her he shall not be permitted to divorce her as long as he lives.” I dare say no Christian today would follow this moral directive. No one today—Jew, Christian, atheist, or otherwise—would even think of such draconian punishment for such acts. That is how far the moral arc has bent in four millennia.

The comedian Julia Sweeney, in her luminous monologue Letting Go of God, makes the point when she recalls re-reading a familiar story from her Catholic upbringing:

This Old Testament God makes the grizzliest tests of people’s loyalty. Like when he asks Abraham to murder his son, Isaac. As a kid, we were taught to admire it. I caught my breath reading it. We were taught to admire it? What kind of sadistic test of loyalty is that, to ask someone to kill his or her own child? And isn’t the proper answer, “No! I will not kill my child, or any child, even if it means eternal punishment in hell!”?

Like so many other comedians who’ve struck the Bible’s rich vein of unintended comedic stories, Sweeney allows the material to write itself. Here she continues her tour through the Old Testament with its preposterous commandments:

Like if a man has sex with an animal, both the man and the animal should be killed. Which I could almost understand for the man, but the animal? Because the animal was a willing participant? Because now the animal’s had the taste of human sex and won’t be satisfied without it? Or my personal favorite law in the Bible: in Deuteronomy, it says if you’re a woman, married to a man, who gets into a fight with another man, and you try to help him out by grabbing onto the genitals of his opponent, the Bible says you immediately have to have your hand chopped off.

Richard Dawkins memorably characterized this God of the Old Testament as “arguably the most unpleasant character in all fiction: jealous and proud of it; a petty, unjust, unforgiving control-freak; a vindictive, bloodthirsty ethnic cleanser; a misogynistic, homophobic, racist, infanticidal, genocidal, filicidal, pestilential, megalomaniacal, sadomasochistic, capriciously malevolent bully.”  


Most modern Christians, however, respond to arguments like mine and Dawkins’ by saying that the Old Testament’s cruel and fortunately outdated laws have nothing to do with how they live their lives or the moral precepts that guide them today. The angry, vengeful God Yahweh of the Old Testament, Christians claim, was displaced by the kinder, gentler New Testament God in the form of Jesus, who two millennia ago introduced a new and improved moral code. Turning the other cheek, loving one’s enemies, forgiving sinners, and giving to the poor is a great leap forward from the capricious commands and copious capital punishment found in the Old Testament.

That may be, but nowhere in the New Testament does Jesus revoke God’s death sentences or ludicrous laws. In fact, quite the opposite (Matthew 5:17-30 passim): “Think not that I am come to destroy the law, or the prophets: I am not come to destroy, but to fulfill.” He doesn’t even try to edit the commandments or soften them up: “Whosoever therefore shall break one of these least commandments, and shall teach men so, he shall be called the least in the kingdom of heaven.” In fact, if anything, Jesus’ morality is even more draconian than that of the Old Testament: “Ye have heard that it was said by them of old time, Thou shalt not kill; and whosoever shall kill shall be in danger of the judgment: But I say unto you, That whosoever is angry with his brother without a cause shall be in danger of the judgment.”

In other words, even thinking about killing someone is a capital offense. In fact, Jesus elevated thought crimes to an Orwellian new level (Matthew 5:27-28): “Ye have heard that it was said by them of old time, Thou shalt not commit adultery: But I say unto you, That whosoever looketh on a woman to lust after her hath committed adultery with her already in his heart.”

And if you don’t think you can control your sexual impulses Jesus has a practical solution: “If thy right eye offend thee, pluck it out, and cast it from thee: for it is profitable for thee that one of thy members should perish, and not that thy whole body should be cast into hell.”

President Bill Clinton may have physically sinned in the White House with an intern, but by Jesus’ moral code even the evangelical Christian Jimmy Carter sinned when he famously admitted in a 1976 Playboy magazine interview while running for President: “I’ve looked on a lot of women with lust. I’ve committed adultery in my heart many times.”


As for Jesus’ own family values, he never married, never had children, and he turned away his own mother time and again. For example, at a wedding feast Jesus says to her (John 2:4): “Woman, what have I to do with you?” One biblical anecdote recounts the time that Mary waited patiently off to the side for Jesus to finish speaking so that she could have a moment with him, but Jesus told his disciples, “Send her away, you are my family now,” adding (Luke 14:26): “Whoever comes to me and does not hate father and mother, wife and children, brothers and sisters, yes, and even life itself, cannot be my disciple.”

Charming. This is what cultists do when they separate followers from their families in order to control both their thoughts and their actions, as when Jesus calls to his flock to follow him or else (John 15:4-7): “Abide in me as I abide in you. Just as the branch cannot bear fruit by itself unless it abides in the vine, neither can you unless you abide in me. I am the vine, you are the branches. Those who abide in me and I in them bear much fruit, because apart from me you can do nothing. Whoever does not abide in me is thrown away like a branch and withers; such branches are gathered, thrown into the fire, and burned.” But if a believer abandons his family and gives away his belongings (Mark 10:30), “he shall receive an hundredfold now in this time, houses, and brethren, and sisters, and mothers, and children, and lands.” In other passages Jesus also sounds like the tribal warlords of the Old Testament:

Do not think that I have come to bring peace to the earth; I have not come to bring peace, but a sword. For I have come to set a man against his father, and a daughter against her mother, and a daughter-in-law against her mother-in-law; and one’s foes will be members of one’s own household. Whoever loves father or mother more than me is not worthy of me; and whoever loves son or daughter more than me is not worthy of me; and whoever does not take up the cross and follow me is not worthy of me. (Matthew 10:34-39)

Even sincere Christians cannot agree on Jesus’ morality and the moral codes in the New Testament, holding legitimate differences of opinion on a number of moral issues that remain unresolved based on biblical scripture alone. These include dietary restrictions and the use of alcohol, tobacco, and caffeine; masturbation, pre-marital sex, contraception, and abortion; marriage, divorce, and sexuality; the role of women; capital punishment and voluntary euthanasia; gambling and other vices; international and civil wars; and many other matters of contention that were nowhere in sight when the Bible was written, such as stem-cell research, gay marriage, and the like. Indeed, the fact that Christians, as a community, keep arguing over their own contemporary question “WWJD” (What Would Jesus Do?) is evidence that the New Testament is silent on the answer.

Most notably, what are we to make of the Christian moral model of sin and forgiveness? By this account, we are all sinners, born into original sin because of the Fall in the Garden of Eden. The Christian solution to this problem is to accept Jesus as your savior, as in John 3:16: “For God so loved the world, that he gave his only begotten Son, that whosoever believeth in him should not perish, but have everlasting life.” I once said these words, and for seven years lived the life of a born-again Christian, until, among other things, I recognized the flawed syllogistic reasoning behind this proposition:

1. We were originally created sinless, but because God gave us free will and Adam and Eve chose to eat the forbidden fruit of the knowledge of good and evil, we are all born with original sin.

2. God could forgive the sins we never committed, but instead He sacrificed his son Jesus, who is actually God himself in the flesh because Christians believe in only one God (monotheism) of which Jesus and the Holy Spirit are just different manifestations, as in Father, Son, and Holy Ghost.

3. The only way to avoid eternal punishment for sins we never committed from this all-loving and all-powerful God is to accept his son—who is actually himself—as our savior.

So…God sacrificed himself to himself to save us from himself.

In addition to being an exercise in twisted logic, the very idea runs contrary to centuries of Western jurisprudence, which is clear on the point that individuals cannot be blamed for something that they didn’t do. There is no such thing as a scapegoat in a court of law; pinning your crimes on an innocent person (like Jesus), and then expecting a judge (like God) to sentence the other person instead of you is what’s called redemption in the Bible, but in the real world it’s known as a miscarriage of justice. In the Western legal system, Jesus would never be allowed to bear the responsibility for anyone’s sins but his own. And blaming an innocent third party potentially leaves out the most important moral agent in the equation. If someone has been harmed by your actions, it isn’t God you should be asking for forgiveness. It is the injured party who deserves your supplications and entreaties, and only that person can forgive you and grant you absolution, assuming your apology is genuine and offered sincerely.

I could go on at length about this aspect of religion—and I do in Chapter 4 of The Moral Arc—but the point is made here that, in addition to the acknowledged magisterium of religion documented in my previous column, faith is not the royal road to moral progress. Instead, reason, rationality, and empiricism as embodied in secular philosophy and science are the only reliable tools we have for determining the nature of reality, both physical and moral.

###


The Magisterium of Religion

Here’s the link to this article.

The Köln Dom is a reminder of the power of faith in a pre-modern world lit only by fire and plagued by poverty, disease, misery, and early death

MICHAEL SHERMER

NOV 19, 2022

Every year for the past decade, when my wife and I return to her home city of Köln, Germany, we make a point of visiting the magnificent cathedral in the city center that has defined the region for nearly eight centuries. Begun in 1248, this multi-generational project wasn’t officially completed until 1880 (and it has been upgraded, repaired, and refurbished ever since)—six centuries of unfinished awe rising up from the banks of the mighty Rhine River that cuts through the heart of this ancient city, whose pre-Medieval Roman ruins lie strewn about the landscape. It is nearly impossible for even the most jaded modern mind to be unimpressed by this architectural wonder, whose ornamental details bring to life biblical chronicles and heroes.

Throughout three decades of countless articles and multiple books I have criticized religion, both its dependence on supernatural epistemology and its tribal divisiveness that led to centuries of wars, pogroms, purges, and witch hunts. But on this trip to the Cologne Cathedral I time-traveled back to the latter Middle Ages and into the late Medieval mind to imagine what it must have been like to experience the awe-inspiring magnificence of such a culturally-dominant edifice that literally and figuratively puts all other structures in the shade. Imagine walking into this sanctuary after a long and exhausting journey from one’s provincial countryside and spartan abode…

And think about what it must have been like to hear the angelic voices of divine organ music with its 20 Hertz undertones of infrasound that unconsciously generate at once feelings of awe, fear, and trembling…

And picture the joy of children playing in the shadow of the largest construction project anyone had ever seen or would ever experience…

To fully feel that world let’s go back to a time when civilization was lit only by fire, centuries ago when populations were sparse and 80 percent of everyone lived in the countryside and were engaged in food production, largely for themselves. (I reconstruct this worldview in detail in How We Believe and The Moral Arc.) Cottage industries were the only ones around in this pre-industrial and highly-stratified society, in which one-third to one-half of everyone lived at subsistence level and were chronically under-employed, underpaid, and undernourished. Food supplies were unpredictable and plagues decimated weakened populations.


All major cities were hit hard by disease contagions. In the century spanning 1563 to 1665, for example, there were no fewer than six major epidemics that swept through London alone, each of which annihilated between a tenth and a sixth of the population. The death tolls are almost unimaginable by today’s standards: 20,000 in 1563, 15,000 in 1593, 36,000 in 1603, 41,000 in 1625, 10,000 in 1636, and 68,000 in 1665, all in one of the world’s major metropolitan cities that had only a tiny fraction of the population of today. Childhood diseases were unforgiving, felling 60 percent of children before the age of 17. As one observer noted in 1635, “We shall find more who have died within thirty or thirty-five years of age than passed it.” The historian Charles de La Roncière provides examples from 15th-century Tuscany in which lives were routinely cut short:

Many died at home: children like Alberto (aged ten) and Orsino Lanfredini (six or seven); adolescents like Michele Verini (nineteen) and Lucrezia Lanfredini, Orsino’s sister (twelve); young women like beautiful Mea with the ivory hands (aged twenty-three, eight days after giving birth to her fourth child, who lived no longer than the other three, all of whom died before they reached the age of two); and of course adults and elderly people.

And this does not include, La Roncière adds parenthetically, the deaths of newborns, which historians estimate could have been as high as 30 to 50 percent.
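A rough way to grasp what the London tolls above mean: if each epidemic really did kill between a tenth and a sixth of the city, the absolute numbers imply the size of the city itself. The sketch below is a back-of-the-envelope editorial aid under exactly that assumption, not the author’s or the historians’ calculation:

```python
# Implied London populations, on the assumption (stated above) that each
# epidemic killed between one tenth and one sixth of the city's inhabitants.
# Editorial back-of-the-envelope estimates, not historical data.
tolls = {1563: 20_000, 1593: 15_000, 1603: 36_000,
         1625: 41_000, 1636: 10_000, 1665: 68_000}

for year, dead in sorted(tolls.items()):
    low, high = dead * 6, dead * 10  # 1/6 of pop. -> 6x; 1/10 of pop. -> 10x
    print(f"{year}: {dead:,} dead implies roughly {low:,}-{high:,} inhabitants")
```

On those assumptions the 1665 toll alone implies a city of roughly 408,000 to 680,000 people, broadly in line with standard estimates of London’s population on the eve of the Great Plague.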

Since magical thinking is positively correlated with uncertainty and unpredictability, we should not be surprised at the level of superstition given the grim vagaries of pre-modern life. There were no banks for people to set up personal savings accounts during times of plenty to provide a cushion of comfort during times of scarcity. There were no insurance policies for risk management, and few people had much personal property to insure anyway. With homes constructed of thatched roofs and wooden chimneys in a darkness broken only by candles, fires would routinely devastate entire neighborhoods. As one chronicler noted: “He which at one o’clock was worth five thousand pounds and, as the prophet saith, drank his wine in bowls of fine silver plate, had not by two o’clock so much as a wooden dish left to eat his meat in, nor a house to cover his sorrowful head.” Alcohol and tobacco were essential anesthetics for the easing of pain and discomfort that people employed as a form of self-medication, along with the belief in magic and superstition to mitigate misfortune.

Under such conditions it’s no wonder that almost everyone believed in sorcery, werewolves, hobgoblins, astrology, black magic, demons, prayer, providence, and, of course, witches and witchcraft. As Bishop Hugh Latimer of Worcester explained in 1552: “A great many of us, when we be in trouble, or sickness, or lose anything, we run hither and thither to witches, or sorcerers, whom we call wise men…seeking aid and comfort at their hands.” Saints were worshiped and liturgical books provided rituals for blessing cattle, crops, houses, tools, ships, wells, and kilns, along with special prayers for sterile animals, the sick and infirm, and even infertile couples. In his 1621 book, Anatomy of Melancholy, Robert Burton noted, “Sorcerers are too common; cunning men, wizards, and white witches, as they call them, in every village, which, if they be sought unto, will help almost all infirmities of body and mind.”


In these late Medieval times, moreover, 80 to 90 percent of people were illiterate. Those few who could read the local vernacular could not read the Bible, because it was written in Latin, guaranteeing that it would remain the exclusive intellectual property of an elite few. Almost everyone believed in some form of black magic. If a noble woman died, her servants ran around the house emptying all containers of water so her soul would not drown. Her Lord, in response to her death, faced east and formed a cross by lying prostrate on the ground, arms outstretched. If the left eye of a corpse did not close properly, the soul could spend extra time in purgatory (this belief led to the ritual closing of the eyes upon death). A man knew he was near death if he saw a shooting star or a vulture hovering over his home. If a wolf howled at night, the one who heard it would disappear before dawn. Bloodletting was popular. Plagues were believed to be the result of an unfortunate conjunction of the stars and planets. And the air was believed to be infested with such soulless spirits as unbaptized infants, ghouls who dug up cadavers in graveyards and gnawed on their bones, water nymphs who lured knights to their deaths by drowning, drakes who dragged children into their caves beneath the earth, and vampires who sucked the blood of stray children.

Was everyone in the pre-scientific world so superstitious? They were. As the historian Keith Thomas notes, “No one denied the influence of the heavens upon the weather or disputed the relevance of astrology to medicine or agriculture. Before the seventeenth century, total skepticism about astrological doctrine was highly exceptional, whether in England or elsewhere.” And it wasn’t just astrology. “Religion, astrology and magic all purported to help men with their daily problems by teaching them how to avoid misfortune and how to account for it when it struck.” With such sweeping power over people, Thomas concludes, “If magic is to be defined as the employment of ineffective techniques to allay anxiety when effective ones are not available, then we must recognize that no society will ever be free from it.”

That may well be, but the rise of science diminished this near universality of magical thinking by proffering natural explanations where before there were predominantly supernatural ones. The decline of magic and the rise of science was not, however, a simple linear ascent out of the darkness and into the light. As empiricism gained status, there arose a drive to find empirical evidence for superstitious beliefs that previously needed no propping up with facts.

This attempt to naturalize the supernatural carried on for some time and spilled over into other areas. The analysis of portents was often done meticulously and quantitatively, albeit for purposes both natural and supernatural. As one diarist privately opined on the nature and meaning of comets: “I am not ignorant that such meteors proceed from natural causes, yet are frequently also the presages of imminent calamities.” Yet the propensity to foretell the future through magic led to more formalized methods of ascertaining causality by connecting events in nature—the very basis of science.

In time, natural theology became wedded to natural philosophy and science arose out of magical beliefs, which it ultimately displaced. By the 18th and 19th centuries, astronomy replaced astrology, chemistry succeeded alchemy, probability theory displaced luck and fortune, insurance attenuated anxiety, banks replaced mattresses as the repository of people’s savings, city planning reduced the risks from fires, social hygiene and the germ theory dislodged disease, and the vagaries of life became less vague.

Before all this modernity came online, however, it was the magisterium of religion that soothed suffering souls, a power on poignant display in the Köln Dom.


P.S. In my 2000 book How We Believe, I argued that one role of religion is to reinforce norms, customs, and mores of a culture—along with the moral tenets of the faith—through belief in an invisible eye in the sky. On this latest visit I noticed that on the plaza surrounding the Dom, modern eyes in the sky have been added, just in case…

###

Michael Shermer is the Publisher of Skeptic magazine, the host of The Michael Shermer Show, and a Presidential Fellow at Chapman University. His many books include Why People Believe Weird Things, The Science of Good and Evil, The Believing Brain, The Moral Arc, and Heavens on Earth. His new book is Conspiracy: Why the Rational Believe the Irrational.

Darwin Matters

Here’s the link.

On the 214th anniversary of Charles Darwin’s birth, February 12, 1809, why the sage of Down still matters

MICHAEL SHERMER

“Hence both in space and time, we seem to be brought somewhat near to that great fact—that mystery of mysteries—the first appearance of new beings on this earth.” —Charles Darwin, Journal of Researches, 1845

Today, February 12, 2023, is International Darwin Day, the 214th anniversary of the birth of Charles Darwin, the co-discoverer (along with Alfred Russel Wallace—see my biography In Darwin’s Shadow) of evolution by natural selection, and one of the most influential scientists in history. To honor the sage of Down I have pieced together excerpts from my 2006 book Why Darwin Matters, which attempts to answer the title question (and is my only book cover featuring full frontal nudity). His influence only continues to grow as the years pile up after his death on April 19, 1882 (age 73). (Photographs within courtesy of The Complete Photographs of Darwin by John van Wyhe, part of the Darwin Online project.)


The Myth of Darwin in the Galapagos

In June of 2004, historian of science Frank Sulloway and I began a month-long expedition to retrace Charles Darwin’s footsteps in the Galápagos Islands. The myth Frank set out to investigate years before was that Darwin became an evolutionist in the Galápagos when he discovered natural selection operating on finch beaks and tortoise carapaces, each species uniquely adapted by food type or island ecology. (Photos in this section from the author’s collection.)

The legend endures, Sulloway notes, because of its elegant fit into a Joseph Campbell-like tripartite myth of the hero who (1) leaves home on a great adventure (Darwin’s five-year voyage on the Beagle), (2) endures immeasurable hardship in the quest for noble truths (Darwin suffered seasickness and other maladies), and (3) returns to deliver a deep message (evolution). The myth is ubiquitous, appearing in everything from biology textbooks to travel brochures, the latter of which inveigle potential customers to come walk in the footsteps of Darwin. (See Sulloway’s papers: “Darwin and His Finches: The Evolution of a Legend.” Journal of the History of Biology, 15 (1982):1-53; “Darwin’s Conversion: The Beagle Voyage and Its Aftermath.” Journal of the History of Biology, 15 (1982):325-96; “The Legend of Darwin’s Finches.” Nature, 303 (1983):372; “Darwin and the Galapagos.” Biological Journal of the Linnean Society, 21 (1984):29-59.)

The Darwin Galápagos legend is emblematic of a broader myth that science proceeds by select eureka discoveries followed by sudden revolutionary revelations, where old theories fall before new facts. Not quite. Paradigms power percepts. Nine months after departing the Galápagos, Sulloway discovered, Darwin made the following entry in his ornithological catalogue about his mockingbird collection:

When I see these Islands in sight of each other, & possessed of but a scanty stock of animals, tenanted by these birds, but slightly differing in structure & filling the same place in Nature, I must suspect they are only varieties.

Similar varieties of fixed kinds, not evolution of separate species. Darwin was still a creationist! This explains why Darwin did not even bother to record the island locations of the few finches he collected (and in some cases mislabeled), and why these now-famous birds were never specifically mentioned in the Origin of Species.

Through careful analysis of Darwin’s notes and journals, Sulloway dates Darwin’s acceptance of evolution to the second week of March, 1837, after a meeting Darwin had with the eminent ornithologist John Gould, who had been studying his Galápagos bird specimens. With access to museum ornithological collections from areas of South America that Darwin had not visited, Gould corrected a number of taxonomic errors Darwin had made (such as labeling two finch species a “Wren” and “Icterus”), and pointed out to him that although the land birds in the Galápagos were endemic to the islands, they were notably South American in character.

Darwin left the meeting with Gould, Sulloway concludes, convinced “beyond a doubt that transmutation must be responsible for the presence of similar but distinct species on the different islands of the Galápagos group. The supposedly immutable ‘species barrier’ had finally been broken, at least in Darwin’s own mind.” That July, 1837, Darwin opened his first notebook on Transmutation of Species. By 1844 he was confident enough to write in a letter to his botanist friend and colleague Joseph Hooker:

I was so struck with distribution of Galapagos organisms &c &c, & with the character of the American fossil mammifers &c &c, that I determined to collect blindly every sort of fact which cd bear any way on what are species. At last gleams of light have come, & I am almost convinced, (quite contrary to opinion I started with) that species are not (it is like confessing a murder) immutable.

Like Confessing a Murder

Dramatic words for something as seemingly innocuous as a technical problem in biology: the immutability of species. But it doesn’t take a rocket scientist—or an English naturalist—to understand why the theory on the origin of species by means of natural selection would be so controversial: if new species are created naturally—not supernaturally—what place, then, for God? No wonder Darwin waited twenty years before publishing his theory.

From the time of Plato and Aristotle in ancient Greece to the time of Darwin and Wallace in the nineteenth century, nearly everyone believed that a species retained a fixed and immutable “essence.” A species, in fact, was defined by its very essence—the characteristics that made it like no other species. The theory of evolution by means of natural selection, then, is the theory of how kinds can become other kinds, and that upset not only the scientific cart but the cultural horse pulling it. The great Harvard evolutionary biologist Ernst Mayr stressed just how radical Darwin’s theory was (in his 1982 book The Growth of Biological Thought):

The fixed, essentialistic species was the fortress to be stormed and destroyed; once this had been accomplished, evolutionary thinking rushed through the breach like a flood through a break in a dike.

The dike, however, was slow to crumble. Darwin’s close friend, the geologist Charles Lyell, withheld his support for a full nine years, and even then hinted at a providential design behind the whole scheme. The astronomer John Herschel called natural selection the “law of higgledy-piggledy.” And Adam Sedgwick, a geologist and Anglican cleric, proclaimed that natural selection was a moral outrage, and penned this ripping harangue to Darwin:

There is a moral or metaphysical part of nature as well as a physical. A man who denies this is deep in the mire of folly. You have ignored this link; and, if I do not mistake your meaning, you have done your best in one or two cases to break it. Were it possible (which thank God it is not) to break it, humanity, in my mind, would suffer a damage that might brutalize it, and sink the human race into a lower grade of degradation than any into which it has fallen since its written records tell us of its history.

In a review in Macmillan’s Magazine, Henry Fawcett wrote of the great divide surrounding On the Origin of Species:

No scientific work that has been published within this century has excited so much general curiosity as the treatise of Mr. Darwin. It has for a time divided the scientific world into two great contending sections. A Darwinite and an anti-Darwinite are now the badges of opposed scientific parties.

Darwinites and anti-Darwinites. Although the scientific community is now united in agreement that evolution happened, a century and a half later the cultural world remains deeply divided. According to a 2005 poll by the Pew Research Center, 42 percent of Americans hold strict creationist views that “living things have existed in their present form since the beginning of time,” while 48 percent believe that humans “evolved over time.” More to the point of why evolution has been in the news of late, the survey also found that 64 percent said they were open to the idea of teaching creationism in addition to evolution in public schools, while 38 percent said they think evolution should be replaced by creationism in biology classrooms. (Recent polls find acceptance of the theory of evolution in the US increasing and creationism decreasing, but a 54% acceptance rate for the theory is not exactly a mandate for science.)


1878a Three-quarter right profile, seated in a Down House chair (according to some sources), by Leonard Darwin.

Why Evolution Matters

The influence of the theory of evolution on the general culture is so pervasive it can be summed up in a single observation: we live in the age of Darwin. Arguably the most culturally jarring theory in the history of science, the Darwinian revolution changed both science and culture in ways immeasurable, as Ernst Mayr summarized (in my own wording):

1. The static creationist model of species as fixed types, replaced with a fluid evolutionary model of species as ever-changing entities.

2. The theory of top-down intelligent design through a supernatural force, replaced with the theory of bottom-up natural design through natural forces.

3. The anthropocentric view of humans as special creations above all others, replaced with the view of humans as just another animal species.

4. The view of life and the cosmos as having design, direction, and purpose from above, replaced with the view of the world as the product of bottom-up design through necessitating laws of nature and contingent events of history.

5. The view that human nature is infinitely malleable and primarily good, replaced with the view of a constraining human nature in which we are good and evil.

In the memorable observation by Theodosius Dobzhansky: “Nothing in biology makes sense except in the light of evolution.”


1881 Four photographs by Elliott & Fry. This well-known sitting includes the only known photographs of Darwin standing.


Darwin’s God and the Devil’s Chaplain

Darwin matriculated at Cambridge University in theology, but he did so only after abandoning his medical studies at the University of Edinburgh because of his distaste for the barbarity of surgery. Darwin’s famous grandfather Erasmus and his father Robert, both physicians by trade who were deeply schooled in natural history, were also confirmed freethinkers, so there was no doctrinaire pressure on the young Charles to choose theology.

In point of fact, Darwin’s selection of theology as his primary course of study allowed him to pursue his passion for natural history through the academic justification of studying “natural theology”—he was far more interested in God’s works (nature) than God’s words (the Bible). Besides, theology was one of only a handful of professions that a gentleman of the Darwin family’s high social position in the landed gentry of British society could choose. Finally, although Darwin belonged to the Church of England, membership was more or less expected of someone in his social class.

Still, Darwin’s religiosity was not entirely utilitarian. He began and ended his five-year voyage around the world as a creationist, regularly attending services on board the Beagle and even during some land excursions in South America. It was only after his return home that his loss of faith came about, and it happened gradually, even reluctantly, over many years.

Nagging doubts about the nature and existence of the deity, arising from his studies of the natural world, particularly the cruel nature of many predator-prey relationships, chipped away at his faith. “What a book a Devil’s Chaplain might write on the clumsy, wasteful, blundering low & horridly cruel works of nature!” Darwin lamented in an 1856 letter to his botanist friend Joseph Hooker. In 1860 he wrote to his American colleague, the Harvard botanist Asa Gray, about a species of wasp that paralyzes its prey (but does not kill it), then lays its eggs inside the paralyzed insect so that upon birth its offspring can feed on live flesh:

I cannot persuade myself that a beneficent God would have designedly created the Ichneumonidae with the express intention of their feeding within the living bodies of Caterpillars, or that a cat should play with mice. Not believing this, I see no necessity in the belief that the eye was expressly designed.

Pain and evil in the human world made Darwin doubt even more. “That there is much suffering in the world no one disputes,” he wrote to a correspondent. “Some have attempted to explain this with reference to man by imagining that it serves for his moral improvement. But the number of men in the world is as nothing compared with that of all other sentient beings, and they often suffer greatly without any moral improvement.” Which is more likely, that pain and evil are the result of an all-powerful and good God, or the product of uncaring natural forces? “The presence of much suffering agrees well with the view that all organic beings have been developed through variation and natural selection.” The death of Darwin’s beloved ten-year-old daughter Anne put an end to whatever confidence he had in God’s benevolence, omniscience, and thus existence. According to the great Darwin scholar and biographer Janet Browne: “This death was the formal beginning of Darwin’s conscious dissociation from believing in the traditional figure of God.”

Throughout most of his professional career, however, Darwin eschewed the God question entirely, choosing instead to focus on his scientific studies. Toward the end of his life Darwin received many letters querying him on his religious attitudes, and his long silence gave way to a few revelations. In one letter, penned in 1879, just three years before he died, Darwin explained: “In my most extreme fluctuations I have never been an Atheist in the sense of denying the existence of God. I think that generally (and more and more as I grow older), but not always, that an Agnostic would be the more correct description of my state of mind.”

A year later, in 1880, Darwin clarified his reasoning to the British socialist Edward Aveling, who solicited Darwin’s endorsement of a group of radical atheists by asking his permission to dedicate a book Aveling had edited, The Student’s Darwin, a collection of articles discussing the implications of evolutionary theory for religious thought. The book had a militant antireligious flavor that Darwin disdained, and he declined the offer, elaborating his reason with his usual flair for quotable maxims:

It appears to me (whether rightly or wrongly) that direct arguments against christianity & theism produce hardly any effect on the public; & freedom of thought is best promoted by the gradual illumination of men’s minds which follow[s] from the advance of science. It has, therefore, been always my object to avoid writing on religion, & I have confined myself to science.

Darwin then appended an additional hint about a personal motive: “I may, however, have been unduly biased by the pain which it would give some members of my family, if I aided in any way direct attacks on religion.” Darwin’s wife Emma was a deeply religious woman, so out of respect for her he kept the public side of his religious skepticism in check, an admirable feat of self-discipline by a man of high moral character.


Why Darwin Matters

We are pattern-seeking, storytelling primates, and to most of us the pattern of life and the universe indicates design. For countless millennia we have taken these patterns and constructed stories about how life and the cosmos were designed specifically for us from above. For the past few centuries, however, science has presented us with a viable alternative in which the design comes from below, through the direction of built-in self-organizing principles of emergence and complexity. Perhaps this natural process, like the other natural forces that we are all comfortable accepting as non-threatening to religion, was God’s way of creating life. Maybe God is the laws of nature—or even nature itself—but this is a theological supposition, not a scientific one.

What science tells us is that we are but one among hundreds of millions of species that evolved over the course of three and a half billion years on one tiny planet among many orbiting an ordinary star, itself one of possibly billions of solar systems in an ordinary galaxy that contains hundreds of billions of stars, itself located in a cluster of galaxies not so different from millions of other galaxy clusters, themselves whirling away from one another in an expanding cosmic bubble universe that very possibly is only one among a near infinite number of bubble universes. Is it really possible that this entire cosmological multiverse was designed and exists for one tiny subgroup of a single species on one planet in a lone galaxy in that solitary bubble universe? It seems unlikely.

Herein lies the spiritual side of science—sciencuality, if you will pardon an awkward neologism, but one that echoes the sensuality of discovery. If religion and spirituality are supposed to generate awe and humility in the face of the creator, what could be more awesome and humbling than the deep space discovered by Hubble and the cosmologists, and the deep time discovered by Darwin and the evolutionists?

Darwin matters because evolution matters; evolution matters because science matters. Science matters because it is the preeminent story of our age, an epic saga about who we are, where we came from, and where we are going.

###


Michael Shermer is the Publisher of Skeptic magazine, the host of The Michael Shermer Show, and a Presidential Fellow at Chapman University. His many books include Why People Believe Weird Things, The Science of Good and Evil, The Believing Brain, The Moral Arc, and Heavens on Earth. His new book is Conspiracy: Why the Rational Believe the Irrational. This essay was based, in part, on Why Darwin Matters: The Case Against Intelligent Design.

The Final Lecture

Here’s the link to this article. Definitely worth a read.

Ten lessons on living a good life and being resilient in the teeth of entropy, problems, setbacks & obstacles, aka normal life

MICHAEL SHERMER

MAY 16, 2023

For the past 12 years I have been a Presidential Fellow at Chapman University, where I have taught a course called Skepticism 101: How to Think Like a Scientist, examples for which I draw from over 30 years of publishing Skeptic magazine and directing the Skeptics Society. I lecture on causality and determining truth, Bayesian reasoning, Signal Detection Theory, the scientific method, rationality and irrationality, game theory, cognitive biases, cults, conspiracies, Holocaust denial, creationism, science and religion, and much more (you can watch some of the lectures that I recorded remotely during the pandemic here).

In the final minutes of the final lecture of my final semester at Chapman a student asked what practical lessons for life I might share with them. I offered as much as I could think of off the top of my head, but since I have researched and written a fair amount on this topic over the decades (and tried to apply these lessons to my own life) I thought I would deliver a final lecture here, not only for my students but for anyone who is interested in knowing what tools science and reason can provide for how to live a good life and how to deal with entropy, problems, setbacks and obstacles, aka normal life. I have kept this short and limited to ten lessons, but I plan to expand each of these into chapter-length lessons and add a number more (possibly for a book). Watch for those in this space as well as in eSkeptic and on my podcast. To that end, please consider becoming a paid subscriber below. All monies go to the Skeptics Society, a 501(c)(3) nonprofit educational organization.


Lesson 1. The First Law of Life

The Second Law of Thermodynamics is the first law of life: expend energy to survive and flourish. That sounds rather anodyne, so let me unpack it briefly here, and then we will see how it applies to all the other lessons.

We are physical beings living in a physical universe governed by the laws of nature. One of the most fundamental of all the laws of nature is called the Second Law of Thermodynamics, sometimes called “entropy”, which holds that in a closed system energy dissipates, disorder increases, and things run down.

A hot cup of coffee, for example, will get cold if you don’t do anything to heat it up again. Why? Because heat is produced by all the jiggling of the water and coffee molecules in the cup, and since that energy disperses as disorder increases, over time the molecules will jiggle less and the heat will dissipate into the environment, like the air above the cup or your hand holding the cup (which itself temporarily warms as the heat is transferred). In this case, a microwave oven to reheat the coffee is your way of fighting back against entropy by putting energy into the cup. Of course, the energy to run the microwave comes from electricity generated by power plants, which you have to pay for each month, so there’s no free lunch in the universe!
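To make the cooling concrete, here is a minimal sketch, my illustration rather than anything from the essay, of Newton’s law of cooling with assumed parameters:

```python
# Newton's law of cooling: dT/dt = -k (T - T_ambient).
# Without added energy, the coffee's temperature decays
# exponentially toward room temperature.
import math

def coffee_temp(t_minutes, T0=90.0, T_ambient=21.0, k=0.05):
    """Temperature (deg C) after t minutes; all parameters are assumed."""
    return T_ambient + (T0 - T_ambient) * math.exp(-k * t_minutes)

for t in (0, 10, 30, 60):
    print(f"{t:3d} min: {coffee_temp(t):5.1f} C")
# The gradient that makes the coffee hot dissipates on its own;
# restoring it costs energy (the microwave), as the essay says.
```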

Humans are open systems. We capture energy from food and convert it to power our muscles to move and push back against entropy, like making coffee, cleaning the house, going to work, and so forth. This is what I mean when I say that the Second Law of Thermodynamics is the First Law of Life. Your purpose in life is to expend energy to carve out pockets of order that lead to survival and flourishing.

Examples of entropy abound: metal rusts if you don’t maintain it. Weeds overrun gardens if you don’t weed them. Wood rots if you don’t paint it. Beds stay unmade and bedrooms get cluttered if you don’t make and clean them. Your body will grow weak and flabby if you don’t stress it regularly with exercise. Your mind becomes fuzzy and confused if you don’t challenge it to think. Friendships and relationships must be maintained through regular communication. An empty bank account is what happens if you don’t go to work and earn money. Poverty is what societies get if they do nothing productive.

Entropy is not a “force” per se, like gravity. It’s just what happens if energy isn’t put into the system. Think of a sandcastle: there are a nearly infinite number of ways that grains of sand can be configured into an amorphous blob that resembles nothing in particular, but with just the right amount of water mixed with the sand there are a limited number of ways that the grains can be congealed into structures that resemble castles. What happens if the sandcastle is not maintained? Wind and waves and dogs and children erode it back into a featureless glob. There are simply far more ways for sand to be unstructured than structured. Life consists of building sandcastles and maintaining them.

This also explains why failures in life are so much more common than successes: there are simply more ways to fail than there are to succeed. And the higher you aim, the more obstacles will stand between you and your goal, with entropy pushing back against you along the way. Remember that the next time you fail. Like sandcastles, failure is normal, success unusual.
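The counting behind “more ways to fail than to succeed” can be made explicit with a toy stand-in for the sandcastle; a shuffled deck of cards is my analogy, not the essay’s:

```python
# One arrangement of a 52-card deck is "perfectly ordered";
# the other 52! - 1 arrangements are not.
import math

total = math.factorial(52)      # all possible orderings (~8.07e67)
ordered = 1                     # the single sorted ordering
print(f"fraction of ordered states: {ordered / total:.2e}")  # ~1.24e-68
# Random shuffling (entropy) essentially never lands on the ordered
# state; order must be imposed by spending energy, i.e., sorting.
```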

Lesson 2. To Thine Own Self Be True

In William Shakespeare’s play Hamlet, the character Polonius says:

“This above all: to thine own self be true,

And it must follow, as the night the day.

Thou canst not then be false to any man.”

To thine own self be true. What does this mean, exactly? Let’s begin with what philosophers call the Law of Identity: A is A, which means that each thing is identical with itself. The 15th century philosopher Nicholas of Cusa explained it this way: “there cannot be several things exactly the same, for in that case there would not be several things, but the same thing itself.”

Being true to yourself means recognizing and acknowledging that A is A, that you are you and not someone else. To try to be something that you are not, or to pretend to be someone else, is a violation of the Law of Identity: A cannot be non-A.

A is A means discovering who you are, your temperament and personality, your intelligence and abilities, your needs and wants, your loves and interests, what you believe and stand for, where you want to go and how you want to get there, and what matters most to you. Thine own self is your A, which cannot also be non-A. The attempt to make A into non-A has caused countless problems, failures, and heartaches in people’s lives.

How do you figure out who you are? By testing yourself, by trying new things, by meeting new people, by exploring, traveling, and reading, by trying different jobs and considering different careers. In time you will discover that most things you try, you will not be good at, but out of all those failures will emerge a handful of things that you are good at, a few people whom you are drawn to, and slowly the real you will emerge and thine own true self will come into focus.

Lesson 3. Be Antifragile

If the purpose of life is to survive and flourish in the teeth of entropy pushing back against everything you do, then you need to be antifragile, a word coined by the risk analyst Nassim Nicholas Taleb in his 2012 book of that title, Antifragile: Things That Gain from Disorder, on how to live in a world that is unpredictable and chaotic, and how to thrive during times of stress and even disaster.

Antifragile means growing and prospering from randomness, uncertainty, opacity, and disorder, and benefitting from a variety of shocks. Here’s how my psychologist friend and colleague Jonathan Haidt applies the concept of antifragility to raising children:

Bone is anti-fragile. If you treat it gently, it will get brittle and break. Bone actually needs to get banged around to toughen up. And so do children … they need to have a lot of unsupervised time, to get in over their heads and get themselves out.

For example, peanut allergies were once extremely rare. A mid-1990s study found that only 4 out of 1,000 children under the age of eight had a peanut allergy. A 2008 study by the same researchers, however, found that the rate had more than tripled, to 14 per 1,000. Why? Because parents and teachers had protected children from exposure to peanuts. The lesson is clear: immune systems become antifragile through exposure to environmental stressors, and so do our minds and bodies through the stressors of daily life.

One solution to this problem may be found in an old saying: “Prepare the child for the road, not the road for the child.” Other idioms capture the principle behind the lesson of antifragility: “What doesn’t kill you makes you stronger,” Nietzsche famously said. “Tough times don’t last but tough people do,” my mother often told me. Here is what I wrote to one of my students when she was going through a particularly difficult time:

No matter who you reach out to, ultimately it will come down to you and how you respond to your issues. There’s only so much other people can do. In the end, you have to help yourself. Whatever has happened in your life, you can’t do anything about that now as it is in the past and is out of your control. What is in your control is how you respond to it, whatever the “it” is, starting by deciding today that you are not going to let yourself be a victim any longer. It has to stop.

Ultimately only you can make it stop. Psychologists, family, and friends can only do so much. You must dig deep inside yourself and call up reserves you didn’t know you had, and from there rebuild your life, day by day, hour by hour, until it no longer is holding you back from realizing your full potential. What does not kill you makes you stronger. Whatever happened, it didn’t kill you. You are alive. You are engaged in the world. You are working on assignments. You will grow stronger with every accomplishment.

The current craze of overprotecting students from anything that makes them uncomfortable, including ideas that may challenge them, is making them weaker, not stronger, fragile, not antifragile.

Lesson 4. Be Self-Disciplined Because Action is Character

As the name implies, self-discipline comes from within. You are the architect of your life. You are responsible for what you do. So do it. How? Change your behavior and your cognition will follow. Change your habits and your thoughts will follow.

Everyone is looking for a hack, an easy way around the self-discipline problem. There is no hack and no way around it. External motivators, like rewarding yourself for changing your habits, will not last; the motivation must eventually come from within. Internal motivation is the key to self-discipline.

You want to stop eating sugar and unhealthy food? Stop eating sugar and unhealthy food! Where? Here. When? Now. Self-discipline happens here and now. Stop eating bad food and start eating good food…here and now. Just do it.

Toward the end of his life the novelist F. Scott Fitzgerald wrote that “action is character,” by which he meant that what you do is who you are. Cognitive psychologists call this “embodied cognition”, in which action becomes character. My friend the science writer Amy Alkon wrote a book about this, colorfully titled Unfuckology: A Field Guide to Living with Guts and Confidence, with a chapter title that perfectly captures this principle: “The Mind is Bigger Than the Brain.” Here’s how Amy explains the principle in her humorous way:

Embodied cognition research shows that who you are is not just a product of your brain. It’s also in your breathing, your gut, the way you stand, the way you speak, and, while you’re speaking, whether you make eye contact or dart your eyes like you’re about to bolt under a car like a cat.

By acting and behaving a new way, you push out of your mind the old ways of being that you want to change. You are what you do. So act the way you want to feel. Be the person you want to be by acting like that person. As the Buddha counseled:

Your worst enemy cannot harm you as much as your own thoughts, unguarded. But once mastered, no one can help you as much.

Lesson 5. Don’t be a Victim

In their 2018 book The Rise of Victimhood Culture: Microaggressions, Safe Spaces, and the New Culture Wars, the sociologists Bradley Campbell and Jason Manning document how Western society has transitioned from an honor culture to a dignity culture and now is shifting into a victimhood culture.

In a culture of honor, each person has to earn honor and, unable to tolerate a slight, takes action himself. The big advance in Western society was to let the law handle serious offenses and ignore the inevitable minor ones—what sociologists call the culture of dignity, which reigned in the 20th century. It allows diversity to flourish because different people can live near each other without killing each other. As such, a culture of dignity leads to autonomy, independence, self-reliance, confidence, courage, and strength of character.

The past quarter century, however, has seen the rise of a victimhood culture, where people are hypersensitive to slights as in the honor culture, but they don’t take care of it themselves. Instead they appeal to a third party to punish for them. A culture of victimhood leads people to divide the world into good and bad classes—victims and oppressors. As such, a culture of victimhood makes one weak, dependent, timid, afraid, and lacking courage and character.

Yes, any of us can be victims, but how you handle it matters. In a victimhood culture the primary way to gain status is either to be a victim or to condemn alleged perpetrators of offenses against victims, leading to an accelerating search for both. An Oxford student explained what happened to her after she joined a campus feminist group named Cuntry Living and started reading its literature on misogyny and patriarchy:

Along with all of this, my view of women changed. I stopped thinking about empowerment and started to see women as vulnerable, mistreated victims. I came to see women as physically fragile, delicate, butterfly-like creatures struggling in the cruel net of patriarchy. I began to see male entitlement everywhere.

As a result she became fearful and timid, afraid even to go out to socialize:

Feminism had not empowered me to take on the world—it had not made me stronger, fiercer or tougher. Even leaving the house became a minefield. What if a man whistled at me? What if someone looked me up and down? How was I supposed to deal with that? This fearmongering had turned me into a timid, stay-at-home, emotionally fragile bore.

Here is an antifragile way to deal with misogyny and patriarchy, from the model and pro-nuclear energy activist Isabelle Boemke, who answered one online harasser with a blunt Spanish retort. If your Spanish is rusty, a biblical metonymy may be found in the command to “go forth and multiply” (with your mother).

So stop with the safe spaces, trigger warnings, microaggressions, and especially the deplatforming and cancelation of speakers who may cause students to rethink their beliefs—you know, what colleges and universities were designed to do. It is turning young adults into fragile snowflakes instead of antifragile warriors.

Lesson 6. Don’t Eat the Marshmallow

When video of Admiral William H. McRaven’s 2014 commencement address at the University of Texas at Austin was posted online, the speech went viral. Millions of viewers will remember the core message summed up in his memorable line: “If you want to change the world, start off by making your bed.” The Navy SEAL veteran explained the psychology behind such a simple task:

If you make your bed every morning you will have accomplished the first task of the day. It will give you a small sense of pride and it will encourage you to do another task and another and another. By the end of the day, that one task completed will have turned into many tasks completed. Making your bed will also reinforce the fact that little things in life matter. If you can’t do the little things right, you will never do the big things right. And, if by chance you have a miserable day, you will come home to a bed that is made—that you made—and a made bed gives you encouragement that tomorrow will be better.

Admiral McRaven’s “life lessons” in his speech are, in fact, variations on a theme explored by the legendary psychologist Walter Mischel in his 2014 book The Marshmallow Test. The key to being a successful Navy SEAL—or anything else in life—is summed up in the book’s subtitle, Mastering Self-Control. Mischel begins by describing how, in the late 1960s, he and his colleagues devised a straightforward experiment to measure self-control at the Bing Nursery School at Stanford University.

In its simplest form, children between the ages of 4 and 6 were given a choice between one marshmallow now or two marshmallows if they waited 15 minutes. Some kids ate the marshmallow right away, but most engaged in unintentionally hilarious attempts to overcome temptation. They averted their gaze, covered their eyes, squirmed in their seats, or sang to themselves. They made grimacing faces, tugged at their ponytails, picked up the marshmallow and pretended to take a bite. They sniffed it, pushed it away from them, covered it up. If paired with a partner, they engaged in dialogue about how they could work together to reach the goal of doubling their pleasure.

In 2006, Mischel and his colleagues published a follow-up study in the prestigious journal Psychological Science, tracking down the people they had tested some 40 years before and examining the kind of adults they had grown into. They found that the children who were able to delay gratification had higher SAT scores entering college, higher grade-point averages at the end of college, and higher incomes after college. Perhaps not surprisingly, they also tended to have a lower body-mass index; that is, they were less likely to have a weight problem.

So, not eating the marshmallow is good for both your body and your mind. And all of life is a series of marshmallow tests.

Lesson 7. Directing Your Future Self

In an episode of the hit animated television series The Simpsons, Marge warns her husband that he might regret the drinking binge he’s about to go on, to which Homer replies: “That’s a problem for future Homer. Man, I don’t envy that guy.”

All of us, in fact, have future selves. Or, more accurately, there is no fixed self, but rather an ever-changing self, and the fact that we can project ourselves into the future means we can not only anticipate how our future selves might act but also take measures today to alter how they behave.

In the field of behavioral economics this problem of the future self is called future discounting, or myopic (nearsighted) discounting, and research shows that most of us discount the future too steeply, for example, electing to spend too much now instead of saving some for later. People are notoriously bad at long-term investing, as well as selecting smart retirement plans. The reason is that in the world we evolved in, and in all of human history until recently, life was, in the words of the political theorist Thomas Hobbes, “nasty, brutish, and short.” Why save for a fabulous 75th birthday party when the odds were high that you’d be dead by 50?
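To see what discounting the future “too steeply” means in numbers, here is a minimal sketch contrasting two textbook discount curves; the functional forms are standard in behavioral economics, but the specific rate parameters are my assumptions for illustration:

```python
# Present value of $100 received `years` from now, under two models.
def exponential(value, years, rate=0.05):
    """Time-consistent exponential discounting."""
    return value / (1 + rate) ** years

def hyperbolic(value, years, k=0.5):
    """Steep, myopic hyperbolic discounting."""
    return value / (1 + k * years)

for t in (1, 5, 20):
    print(f"t={t:2d}y  exponential: {exponential(100, t):6.2f}  "
          f"hyperbolic: {hyperbolic(100, t):6.2f}")
# The hyperbolic discounter undervalues distant rewards far more
# steeply, which is why saving for retirement loses out to spending
# today even when the future payoff is much larger.
```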

For most of our ancestors, a bird in the hand was worth two in the bush. In that world, it was better to eat one marshmallow now rather than risk the promised two marshmallows later that might be purloined or otherwise lost. A bumper sticker captures the temptation psychology: “Life is short. Eat dessert first.”

In today’s world, however, there is a good chance you will live a long life, so there is some justification for figuring out how to delay gratification, save for the future, plan for retirement, and expect your future self to be around for a while.

The key here is projecting your current self into the future, asking yourself now what you want to happen then, and setting up conditions today that you know will take effect later, when your future self may not be trusted to do the right thing.

That is, you don’t want to be Homer and say of your future self “man, I don’t envy that guy.”

Lesson 8. Be Your Own Financial Advisor

The comedian Woody Allen once joked, “It is better to be rich than it is to be poor…if only for financial reasons.” Well, yes, it is, and those financial reasons are not trivial.

Money may not be able to buy you love, happiness, or meaningfulness, but it sure can make life more comfortable and, more importantly, it can increase your opportunities for finding love, happiness and meaningfulness. How?

First, if you’re living on the margin—that is, your income barely covers your expenses and you have next to nothing left over for additional consumption or investment—your opportunities for doing anything else, from vacations to hobbies to retirement, are reduced to next to nothing.

Second, money buys you time, and that time can be put to use to make more money, as well as enjoy life by enriching it with additional opportunities for both business and pleasure.

Third, money buys a better life: better food, better clothes, better homes, better education, better transportation, better travel, better recreation, and better retirement.

How do you make money? Investments in real estate or the stock market (or both). I recommend a book called The Gone Fishin’ Portfolio by a financial advisor named Alex Green, who subsequently became a friend of mine. What Alex demonstrates is that no one can consistently beat the market. You may hear about people who do—for example, fund managers like Bill Miller, who in 2006 was declared by CNNMoney.com to be “The Greatest Money Manager of our Time” because he beat the S&P 500 stock index 15 years in a row.

But as my science writer friend Leonard Mlodinow calculated in his book The Drunkard’s Walk: How Randomness Rules Our Lives, there are over 6,000 fund managers in the U.S., and if you do a simple coin-flip calculation of the odds that someone in that cohort would beat the S&P 500 15 years in a row, it turns out to be 0.75, or 3 out of 4. As Len says, the CNNMoney headline should have read “Expected 15-Year Run Finally Occurs: Bill Miller Lucky Beneficiary.” And, wouldn’t you know it, in the two years after Miller’s 15-year streak, the story read: “the market handily pulverized him.”
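Mlodinow’s 0.75 figure depends on modeling choices the passage doesn’t spell out, such as how many managers there are and over how many years a streak could begin, so here is a hedged Monte Carlo sketch of the style of calculation rather than a reproduction of his result; the cohort size, career length, and 50/50 beat-the-index odds are all assumptions:

```python
# Chance that at least one coin-flipping fund manager racks up a
# 15-year winning streak somewhere in a 40-year career (assumed
# numbers; shrinking the window or the cohort shrinks the answer).
import random

def p_streak_one_manager(years=40, streak=15, trials=200_000):
    """Monte Carlo estimate for a single manager who beats the
    index each year with probability 0.5."""
    hits = 0
    for _ in range(trials):
        run = 0
        for _year in range(years):
            run = run + 1 if random.random() < 0.5 else 0
            if run >= streak:
                hits += 1
                break
    return hits / trials

p_one = p_streak_one_manager()
n_managers = 6000
p_any = 1 - (1 - p_one) ** n_managers  # chance that SOMEONE streaks
print(f"per manager: {p_one:.5f}   across {n_managers}: {p_any:.2f}")
```

Whatever the exact inputs, the qualitative lesson holds: in a large cohort, one long streak by pure luck is expected, not miraculous.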

When Alex Green says to “go fishin” what he means is that you should not try to be the next Bill Miller. Why? Because only after the fact can we pick out the winners. Instead, you should pick stocks in companies with a solid track record—or, even better, invest in mutual funds tied to, for example, the S&P 500—and then, well, go fishing; that is, leave your investments alone. For example, Green calculates that if you invested $10,000 in 1990 in a mutual fund tied to the entire S&P 500 and then went fishing, 20 years later you would have $90,000, not counting dividend reinvestment, which would push you well over the $100,000 figure.

By contrast, if you tried to be actively involved in trading—buying and selling stocks and trying to anticipate what the market would do—you risk missing the biggest increases in that 20-year block. For example, if you miss just the 5 best days in those 20 years, your $90,000 account would plummet to $45,000. If you miss the 10 best days you’d end up with around $35,000. If you miss the best 25 days your $10,000 investment would bring you only $19,000. And if you miss the best 50 days…you’d actually lose money.
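The buy-and-hold arithmetic can be sanity-checked with nothing but the essay’s own figures (the missed-best-days numbers would require daily return data, so they are not reproduced here); a minimal sketch:

```python
# $10,000 growing to $90,000 over 20 years implies an annualized
# return of 9**(1/20) - 1, about 11.6% (dividends excluded, as in
# the essay's example).
principal, final, years = 10_000, 90_000, 20
cagr = (final / principal) ** (1 / years) - 1
print(f"implied annualized return: {cagr:.1%}")   # ~11.6%

# Forward check: compound the principal at that rate year by year.
balance = principal
for _ in range(years):
    balance *= 1 + cagr
print(f"after {years} years: ${balance:,.0f}")    # ~$90,000
```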

Anyone can compute for you how much stocks have returned to investors in the past. No one can do that for the future. In the case of the S&P 500, since the 1920s it has returned an annualized average of around 10%. The returns for the NASDAQ, which is heavily loaded with tech stocks that have done so well over the past 20 years, are significantly higher. Whichever fund you invest in, however, you should expect that your returns will not be significantly higher or lower than the long-term average, which over any long block of time is a two-digit positive number. Here’s how Alex Green explains it:

History clearly demonstrates that no other asset class returns more than stocks over the long haul. Once you understand this—and accept the steep odds against timing the market—you’ve made the first step toward adopting an investment strategy that can generate high returns with an acceptable level of risk.

Here’s a chart showing the value of different asset classes over the very long run.

Lesson 9. Build Strong Social Networks

Diet and exercise are very important tools for living a long, healthy, and high-quality life, but, believe it or not, there’s something else you can do that produces even better results, and it doesn’t require getting up at Zero Dark Thirty, doing push-ups, or eating kale. All you have to do is be sociable. Here are some comparisons of things you can do to lower your mortality risk, based on the latest studies in longevity by scientists around the world:

  • Exercise lowers mortality risk by 33%. A happy marriage lowers it by 49%.
  • Eating 6 or more servings per day of fruits and vegetables lowers mortality risk by 26%. Having a large social network lowers it by 45%.
  • Eating 3 servings a day of whole grains lowers mortality risk by 23%. Feeling you have others you can count on for support lowers it by 35%.
  • Eating a Mediterranean diet lowers mortality risk by 21%. Living with someone lowers it by 32%.

These numbers, and their implications for what you can do to improve your life, were compiled by the science journalist Marta Zaraska and published in her book Growing Young: How Friendship, Optimism, and Kindness Can Help You Live to 100. Here are some of her suggestions of simple things you can do, all backed by scientific research:

  • Engage in more physical contact with others: kiss your partner more often, hold hands with your kids, hug your friends, rub each other’s back, look others in the eyes.
  • Prioritize your romantic relationship and really commit to it. Read books and articles on how to be a better partner. Avoid contempt, criticism, stonewalling, and defensiveness. Talk with your partner about good things that happen in your daily life. Try new and fun things together and have some fun.
  • Invest in your friendships. Spend more time together, disclose your secrets, and don’t be afraid to ask for favors. When you’re with your partner or friends and family, put your phone away and focus on them.
  • Be more extraverted: greet the staff in a store, call a friend whom you haven’t talked to in a while, or try a new restaurant, bar, or café where you will meet new people.

In Zaraska’s words, here’s the bottom line:

It’s time we recognize that improving our social lives and cultivating our minds can be at least as important for health and longevity as are diet and exercise. When you grow as a person, chances are, you will also grow young. To Michael Pollan’s famous statement, “Eat food. Not too much. Mostly plants,” I would add: “Be social, care for others, enjoy life.”

Lesson 10. Find Your Meaning and Purpose in Life

What, specifically, should you do to find meaning and purpose in life? Philosophers, theologians, and sages from spiritual traditions have been writing about this topic for millennia, and recently psychologists have undertaken scientific studies of people and what they do to find meaning and purpose in life. Here are some of their findings.

1. Love and family. The bonding and attachment to other people increases one’s circle of sentiments and a corresponding sense of purpose to care about others as much as, if not more than, oneself. A core principle of leading a meaningful life is to make it more than just about yourself.

2. Meaningful work and career. Having a passion for work and a long-term career gives most people a drive to achieve goals beyond the needs of themselves and their immediate family that lifts all of us to a higher plane, and society toward greater prosperity and moral progress. Having a reason to get up and around in the morning, and having a place to go where one is needed, is a lasting purposeful goal.

3. Social and community involvement. We are not isolated individuals but social beings with a drive to participate in the process of determining how best we should live together, for the benefit of ourselves, our families, our communities, and our societies. This is not just voting but, for example, being actively engaged in the political process; it is not just a matter of joining a club or society, but caring about its goals and the actions of the other members working toward the same goals. Get out and participate!

4. Challenges and goals. Most of us need tests and trials and things at which to aim, both ordinary, such as the physical challenge of sports and recreation and the mental challenge of games and intellectual pursuits, as well as extraordinary, such as striving for abstract principles like truth, justice, and freedom, and struggling through obstacles in the way of realizing them.

5. Transcendency and spirituality. Possibly unique to our species is the capacity for aesthetic appreciation, spiritual reflection, and transcendent contemplation through a variety of expressions such as art, music, dance, exercise, sports, meditation, prayer, quiet contemplation, and religious reverie, connecting us to that which is outside of ourselves, and generating a sense of awe and wonder at the vastness of humanity, nature, the world, and the cosmos. The idea that we live in a universe that is 13.8 billion years old, and on a planet that is but one among trillions of planets in our galaxy alone, itself one of hundreds of billions of other galaxies, in a universe that is possibly just one in a multiverse of universes, is so staggering a thought as to leave one speechless in reverence for the vastness of it all.

I will end this reverie on lessons for life with an inspiring poem that completely changed how I looked at my life when I first encountered it. It’s called Invictus, written in 1875 by William Ernest Henley, and it is particularly poignant because Henley wrote it from a hospital bed while being treated for the tubercular disease that had already cost him a leg:

Out of the night that covers me,

Black as the pit from pole to pole,

I thank whatever gods may be

For my unconquerable soul.

In the fell clutch of circumstance

I have not winced nor cried aloud.

Under the bludgeonings of chance

My head is bloody, but unbowed.

Beyond this place of wrath and tears

Looms but the Horror of the shade,

And yet the menace of the years

Finds and shall find me unafraid.

It matters not how strait the gate,

How charged with punishments the scroll,

I am the master of my fate:

I am the captain of my soul.


###

Michael Shermer is the Publisher of Skeptic magazine, the host of The Michael Shermer Show, and a Presidential Fellow at Chapman University. His many books include Why People Believe Weird Things, The Science of Good and Evil, The Believing Brain, The Moral Arc, and Heavens on Earth. His new book is Conspiracy: Why the Rational Believe the Irrational.