“Watch your backfire” (Backfire Effect)

When the 2012 college football season was in high gear, speculation abounded about which teams would play in what passed for the national championship in those days that we wandered in the pre-playoff wilderness. One poster asserted that, if unbeaten, Kansas State and Notre Dame would play for the title. Enter an SEC advocate, whom we’ll call Billy Bo Jim Bob. No way, he insisted. For the SEC was so high, so mighty, so revered, that a one-loss SEC team’s mere presence in that majestic conference would sway the committee to select it over two unbeaten teams. I responded that in 2004 there were five unbeaten teams, including Auburn, and that the Tigers were bypassed for two non-SEC teams. Billy Bo responded, “That proves my point.”

By no means is this mindset limited to sports-crazed Southerners. Billy Bo’s retort was a manifestation of the Backfire Effect. This is when deep convictions meet contradictory proof, resulting not in a new viewpoint, but in a hardening of beliefs.

The most well-known example of recent years is Barack Obama’s birthplace. The newspaper birth announcement, ironically, was discovered by an early Birther who was hoping that a lack of announcement would bolster his position. Instead of accepting this evidence, Birthers promulgated the preposterous notion that Obama had been born in Kenya, his relatives in Hawaii had received news of this, had applied for a Certificate of Live Birth for someone born overseas, the state had granted this request, this information was forwarded to the newspapers, and the announcement ran, all in 11 days.

Birthers then attributed the release of the Long Form to the imminent release of the Birther Bible, Jerome Corsi’s book Where’s the Birth Certificate? They gathered in their online inculcation chambers, making a big deal about layers and whatever else. When their most cherished idea was disproved, they considered this more evidence for their position.

Imagine your mail includes an unexpected bill you can’t pay. Or you’re on a hike when a cougar appears in the clearing. Or, for maximum effect, you get the unexpected bill AND there’s a cougar nearby. These are bad elements, and they require a response. That same evolutionary wiring may be the reason behind the Backfire Effect. You feel threatened and need to react.

Indeed, we often pay more attention to ideas that upset us. Psychologists Peter Ditto and David Lopez conducted a study in which subjects put a drop of saliva onto a strip. Half the subjects were told if the strip turned green within 20 seconds, it indicated an enzyme disorder. Most in this group waited 20 seconds, then put the strip down and walked away. Just one in six retested to make sure. The other half was told that green meant no disorder. Subjects in this group stood at the strip for far longer than 20 seconds, and over half retested themselves. So good news just passes through us, but potentially bad news can get us stewing.

The Backfire Effect has always existed, but the Internet has made it much easier to indulge. There are sites where nuclear power and the Hiroshima bombing are treated as hoaxes. If regular Birthers are too moderate for you, there are sites that insist Obama is not the president. No matter what reality one is hoping to flee, there are sites that offer comfort.

Whether or not the Backfire Effect kicks in is based not on the amount or type of evidence, nor on how credible the person finds it. It’s based on how important the belief is to the person. The most common misconception about journalism is that reporters write the headlines, when this is actually done by editors. It greatly surprises people when I tell them this, but no one decrees it an unfounded, immoral, reporter-bashing, editor-shill conspiracy. By contrast, Galileo’s discoveries threatened the Church, the State, and the population’s understanding of their world, and insinuated that they were not the center of the universe.

I sometimes encounter claims that the United States was founded on the Bible. I respond by mentioning the Constitutional prohibitions against establishing a religion and imposing a religious test for public office. I further point out that the First Commandment mandates the worship of Yahweh, whereas the First Amendment guarantees the right to worship any god or none at all. While offering no evidence, the others insist their point is still valid, saying “Faith was very important to the Founding Fathers,” or “The Constitution is our legal foundation, but the Bible is our spiritual foundation.”

One of my favorite mantras is “Never be afraid to have your views or beliefs challenged. If they are correct they will withstand the challenge. If they are wrong, you will be enlightened.” As far as putting this into practice, however, there is little advice I can offer because the Backfire Effect is so strong. But here are some ideas on how you can bring someone to accept unpleasant evidence:

  • Let them know that they and their beliefs are separate. They are not one and the same.
  • Frame the disagreement as a collaboration, not a conflict. You should both be after the truth.
  • Learn what logical fallacies are and how to avoid them. Some right proper, jim-dandy posts about that on this blog.

The how and where are also important. It is probably only the slightest exaggeration to write that no one in Internet history has changed an anonymous person’s deeply held belief online. Facebook messaging and e-mails with someone you know are a little more promising, but in person is best, especially if the person likes and trusts you.

No one wants to look stupid. Rather than shoving proof in their face, encourage them to look for such-and-such online or in a book, then ask them later what they thought of it. Without sacrificing accuracy, show them how it might benefit them or their family. Where it’s true, let them know it doesn’t have to be all or nothing. For instance, I could encourage a creationist friend to accept evidence for the age of the universe without treading into how the universe got here.

“Memory Lame” (Unreliability of memories)

I sometimes forget where I put my glasses, so I look for them and eventually find them on my head. Most people have had such experiences. And the few who haven’t could probably be made to believe that they have. Dr. Steven Novella has said of our recollections, “You have a distorted and constructed memory of a distorted and constructed perception, both of which are subservient to whatever narrative your brain is operating under.”

That is one reason why 70 percent of participants in a study conducted by Dr. Julia Shaw confessed to a crime that never occurred. Some even offered details of this non-event. Shaw swayed the subjects by mixing facts with misinformation over three hours of friendly conversation. So confessing to crimes never committed can involve more than plea-bargaining. Suspects under duress and torture are even more vulnerable.

I have countless memories of playing Wiffle Ball growing up, but wonder now how accurate they are. The sport was my favorite outlet for suspended adolescence and I played into my 30s, recording some of the latter days on film. Even two days later, my memories of things said and plays made clashed with what I was watching on tape. False memories such as this can be a distortion of something that happened, a combination of past events, or something invented.

Some are called “source memory errors,” in which the event is remembered, but the particulars confused. For instance, I may remember being picked off first base by Jerry, when it was really his brother Steve who nailed me. Or I may confuse this with a scene from the Bad News Bears.

In extreme cases, there is source amnesia. A woman, in good faith, wrongly accused psychologist Donald Thompson of raping her. The doctor was cleared because at the time of the assault, he had been on live television. It was eventually deduced that the victim had been watching the show when she was assaulted, and blocked most of it out, but associated the doctor with the attack.

I have about a dozen memories of the place I lived at ages 2 and 3: Getting candy from my uncle, a bicycle-built-for-two, and most gloriously, winning Pin the Tail on the Donkey at my third birthday party. Memories this distant are rare because the left inferior prefrontal lobe, required for long-term memory, is underdeveloped in toddlers. Furthermore, memories this ancient are usually fragmented.

Fragmented memories don’t end with adolescence. Author Martin Conway documented the case of a woman who became upset when encountering bricks or paths. It turned out she had been raped as a child on a brick path. Returning to the scene of the crime upset her, though she failed to recall the attack. False and fragmented memories are especially worrisome when they are used by prosecutors and therapists. An unscrupulous individual can encourage a patient or witness to dig deeper for memories that aren’t there, or use an incident to suggest something further happened.

False memories are sometimes the result of one anticipating that something will happen, then remembering it as if it did. I was a basketball manager in high school, stuck watching the boring JV game at the recreation center. The game was running long, and I really wanted to leave to catch the start of the varsity game. While I watched from a distance, another manager asked the JV coach if we could leave. I figured the coach would want us to go, since keeping the varsity statistics took precedence. He nodded his head yes, and off I flew. He later asked why I had left, I told him this story, and he insisted he had shaken his head no. Indeed, he had. I had wanted and anticipated an affirmative answer so badly that that’s what I ‘saw.’ This anticipation can be even more influential if leading questions are used, if misinformation comes from a trusted source, or if social and peer pressures are in play. UFO abductions began being perpetrated by gray aliens after the creatures appeared in a 1975 television program.

The McMartin preschool case unraveled when children began reporting that their tormenters were flying or that they included Chuck Norris. But in cases where the supposed memory is realistic, the injustice goes unnoticed.

Another key point is that memories are much more likely to be recovered after contact with a familiar object, place, or aroma, rather than in a therapist’s office or police interrogation room. And if a smell or sight rekindles a memory, it will flow naturally, whereas a detective or therapist may help the person fill in details that are imagined, or encourage them to omit others. Children are especially vulnerable to suggestion and leading questions. When children say they have no memory of something, it is unethical to prod them further.

I was in third hour English class when I heard about the first Space Shuttle disaster. My neighbor on his riding lawnmower told me about the Ronald Reagan assassination attempt. That’s how I remember these things, anyway. Cognitive psychologist Ulric Neisser had persons write down where they were when a momentous event occurred, then asked them the same question years later. Memory had a spotty performance in Neisser’s research, with some subjects even denying they had written the entry.

Phrasing can be crucial. When asked how tall a person was, test subjects estimated a whopping 10 inches more than those who were asked how short the person was. In another study conducted by psychologists Elizabeth Loftus and John Palmer, subjects were shown videos of a car crash that occurred at either 20, 30, or 40 miles per hour. Subjects were then asked to estimate the speed. They guessed the speed not based on the rate of travel, but on which verb described the incident. If the word was “contact,” most respondents said 20 mph. If the verb was “collided,” 30 mph was the most frequent answer, while “smashed” yielded mostly answers of 40 mph.

At least the car was traveling. But later in the experiment, researchers asked the participants if they had seen any broken glass, when there wasn’t any. Again, depending on the verb used, respondents were more likely to report seeing broken glass, a completely invented memory. Likewise, when subjects were asked if they had seen “a stop sign,” they usually said no. But there was a sharp increase when subjects were asked if they had seen “the stop sign,” insinuating there was one. These instances show the vulnerability of someone whose memory is being challenged, especially by someone in a position of authority. And don’t you forget it.

“Fallacious assault” (Critical thinking)

It’s been a while since we’ve shone the spotlight on critical thinking, although the following are mostly examples of what critical thinking is not. Along those lines, I’ve tried to pin down a precise, easy-to-understand definition of critical thinking, but one has eluded me.

But when I envision critical thinking, it’s most often in the form of someone using deductive reasoning in a series of arguments, free of logical fallacies, to reach a valid conclusion. For instance: W. Wayne writes a blog. Blogs are Internet products. Therefore, W. Wayne’s writing is on the Internet. Of course, people seldom talk that elementary in conversation. In the real world, arguments can get complex, tangential, and clouded with emotion, so logical fallacies are more difficult to detect.

I will use recent online examples of logical fallacies, identify what each is called, and highlight the error in thinking.

U.S. soccer star Landon Donovan, who was surprisingly cut from the 2014 World Cup team, wrote a post-Cup column criticizing decisions made by Coach Jürgen Klinsmann. A few posters labeled Donovan a “bitter man,” which was an AD HOMINEM. These are arguments that address the person making the point rather than the point being made. Donovan’s level of bitterness was unrelated to how valid his criticism of Klinsmann’s coaching was.

This particular type of ad hominem is a GENETIC FALLACY, in which the argument is rejected because of its source rather than its merits. Rather than coming up with points against the assertion, the interlocutor will criticize the opponent’s incentive or bias. A Friend posted about medical inaccuracies of Dr. Oz, causing her Friend to write sarcastically, “Yeah, Big Pharma wouldn’t have any interest in destroying the credibility of someone who’s costing them money.” The information hadn’t actually come from Big Pharma, but even if it had, whatever interest that source had in ruining Dr. Oz, or how much money he was costing them, would have had no bearing on the point’s validity.

But what about the adage, “Consider the source”? Does this have value? That depends. If a flash comes across my news feed that Answers in Genesis has disproven evolution, I would know this comes from a silly antiscientific source with a terrible track record and keep scrolling. However, if I clicked the link and responded, I would need to answer with science, facts, and possibly a challenge to submit the findings to a peer-reviewed journal. Just dismissing AIG as being Young Earth Creationists or deliberately stupid would be inadequate.

Another form of ad hominem is the TU QUOQUE, which means “you too.” It points out the speaker’s hypocrisy, or counters a charge with a charge, and is not an attempt at refutation. Joshua Feuerstein made a minor Internet splash last year with an anti-evolution/anti-Big Bang rant. We will look at this rant and include responses to it, some of which were fallacies and some of which were solid retorts.

Early on, the goateed, backwards baseball cap wearing Feuerstein shouts, “Evolution was never observed!”

In his response, patheos.com blogger J.T. Eberhard committed a tu quoque by writing, “If being observed directly is your criteria, god should be thrown out immediately along with any stories of him creating the universe.”

This is true, but it is not a valid counterargument. Eberhard later provided one of those by writing, “Evolution has been observed. The most recent is the controlled experiment in evolution with the lizard species Podarcis sicula, in which the species developed a Cecal Valve, a new feature not present in the ancestral population. The old population of Podarcis sicula was still around and breeding, yet they had branched off to create a new animal with adapted behaviors and features.” Here, Eberhard addresses Feuerstein’s claim and provides strong counterevidence.

Still in a frenzy, an incredulous Feuerstein later spits out, “You want me to believe that out of some accidental cosmic bang was created one cell and that somewhere along the line we all magically developed different will and different traits?!”

Eberhard commits another tu quoque by writing, “If you’re going to be defending a book which asserts the existence of a talking snake, a man being created from dirt, a woman being created from a rib, a man walking on water and rising from the dead, maybe it’s best not to disdainfully accuse your opponents of magical thinking.”

This points out flaws in Biblical literalism and highlights Feuerstein’s hypocrisy, but it does not refute the point. What Eberhard wrote next, however, did: “We have observed mechanisms that produce increased functionality over time. Evolution is driven by the same key forces that generate new order everywhere in our universe without the need for any appeal to god. They are mutation, reproduction, and selection. If you have these three catalysts in place, order and often improved functionality are the end result.”

Incidentally, Feuerstein had created a STRAW MAN by misrepresenting what astrophysicists and biologists teach. A straw man is when a person either completely makes up or greatly distorts an opponent’s position for the purpose of attacking it.

A simple Wikipedia query would have let Feuerstein know that the Big Bang Theory teaches that out of the explosion came subatomic particles which congealed into hydrogen. Clouds of hydrogen large enough to produce significant gravitational pull then collapsed inward to form stars, which fueled processes that created other elements. Contrary to Feuerstein’s cosmology, scientists do not teach a primitive cell came from the Big Bang and that this cell magically grew into others.

Also contrary to Feuerstein’s assertion, biologists do not teach that evolution is inexplicable magic. They teach that evolution is the change in inherited characteristics of biological populations over time, and is driven by random mutation, natural selection, and adaptation.

Moving on, we come to the FALSE DILEMMA, where an artificially limited number of choices are given as the only possible reason something exists. Jennifer LeClaire of Charisma Online, wondering why so many people were leaving the faith, wrote, “Is the church doing something wrong? Or is the culture wooing once-saved Christians to the godless side? Or both?” It is a false dilemma to say that only these factors, or combination thereof, could be the driving force. When I realized religion clashed with science, I dumped religion. Dumping science didn’t seem like a very practical option. I did not have a bad experience in church, nor did any person or organization recruit me to heathenism. I reached my conclusions through a third way of observation, reflection, questioning, searching, and life experiences. False dilemmas are usually created so the person has an easy way to arrive at and “prove” a point. In LeClaire’s case, she deduced that lukewarm preachers and evangelical atheists were the culprits. She cited examples of both, then was able to tie this into a nice bow because she had begun her post by eliminating competing options.

Neither lukewarm preachers nor heathen recruiters have been able to sway the musical artist Lecrae. Arguing for God’s existence, he trotted out Pascal’s Wager and announced, “If I’m wrong about God then I wasted my life. If you’re wrong about God then you wasted your eternity.” This is the ARGUMENT FROM CONSEQUENCES, where possible unfavorable conclusions are presented in lieu of providing evidence for the position.

The greatest congregation of logical fallacies I’ve come across recently was in a column by Mark Baisley at townhall.com. He crammed four of them into a column arguing that the laws of physics preclude spontaneous emergence of life. Rather than presenting original physics arguments, however, Baisley writes, “For single celled creatures to materialize, reproduce, and survive defies statistical randomness even in the inviting habitat of planet Earth. And the likelihood that the elements would self-generate into intelligent life lies outside of the probability distribution.”

However, Baisley’s inability to comprehend how life could develop is not evidence of that non-development. This is the logical fallacy of making arguments based on PERSONAL INCREDULITY. Put another way, there’s a reason he’s not a scientist.

He also commits the APPEAL TO IRRELEVANT AUTHORITY. He cited the Roman architect Vitruvius as having recognized the beauty of human form and ascribing it to God’s majesty. Vitruvius did intelligently design some nice works of his own. But his status as an ancient, revered artist gives him no special insight into how Man developed.

Later, Baisley cites “optimal conditions for structures and human biology” to argue that this “implies an ambient existence of intelligent design.” This is the APPEAL TO IGNORANCE. He asserts that because physics can’t prove how we got here, this means an intelligent designer did it. He also applies CIRCULAR REASONING by writing, “Man is so far advanced that it had to have been designed.” It is circular reasoning because he has not established that God is the only way man could reach an advanced state. His premise and conclusion are the same. It could also be argued that this was BEGGING THE QUESTION, whereby one assumes what one claims to be proving. Baisley asserts a god exists because of the structure of human biology, the structure of which proves intelligent design.

I hadn’t intended to beat up on the religious; it’s just that they’ve been especially fallacious on my news feed of late. But let’s give them a rest and close with a secular example.

Kyle Smith of the New York Post highlighted a Freakonomics blog that presented the possibility that a McDonald’s double cheeseburger might be the “cheapest, most nutritious, and bountiful food that has ever existed in human history.” It was noted the mass-produced sandwich has 390 calories, 23 grams of protein, seven percent of daily fiber, and 20 percent of daily calcium. Readers were challenged to find another inexpensive food that matched those criteria. When some offered boiled lentils, Smith MOVED THE GOALPOSTS by writing, “Now go open a restaurant called McBoiled Lentils and see how many customers line up.” He asked for a specific type of food, was presented with it, then changed the rules to also mandate that it be commercially viable. He further deduced that the lentil lovers were “huffy back-to-the-earth types, class snobs, locavore foodies, and militant anti-corporate types.” If I’ve done a decent job of instruction, you recognize these attacks as ad hominem.

“Peer review review” (Peer review process)

Most of my posts deal with phenomena I don’t believe in, such as Yeti, homeopathic medicine, and crystal balls. Today we will focus on something I believe in strongly: The peer review and publication process as it applies to science.

The process, in brief: 1. A scientist conducts an experiment. 2. The scientist writes about it and submits the paper for review by other subject matter experts. 3. The paper is possibly revised after comments from reviewers. 4. The paper is published, if warranted. While the process of making it into a peer-reviewed journal ends there, attempts at replication and falsification continue, and may even lead to the later publication of a peer-reviewed paper that reaches the opposite conclusion.

Peer review is a key part of the Scientific Method and the process helps reduce the chance that papers which rely on faulty conclusions or shoddy research methods will be published. Having the work dissected by a handful of Ph.D.s, then read by many more subject matter experts, is an attempt to validate the science.

Consider why this is important. If a geologist suspects he has found holes in climate change theory, he should submit his work for peer review, not e-mail his findings to Sen. James Inhofe. If a microbiologist thinks she has cured the common cold, this should be announced to a journal’s review board, not hawked in newspaper advertisements. If the Institute for Creation Research claims to have a satellite photo of God speaking the world into existence 5,000 years ago, this evidence should be submitted to a peer-reviewed astronomy journal, not posted to the ICR website. Indeed, the surest sign of pseudoscience is when the peer-review process is bypassed in favor of going straight to a populace selected for their bias or blissful ignorance.

A last-second victory by your team may cause you to have a mostly sleepless night. Eating guacamole can cause one to overdo it and end up feeling stuffed. So even the most wonderful events can have drawbacks, and that brings us to potential deficiencies in the peer review and publication process.

Confirmation bias can occur even in a field whose role largely consists of combating it. With contentious issues like String Theory or the Multiverse, prejudice can seep in and the reviewer may give unmerited credit or undeserved scorn to a viewpoint. This is where editors really earn their money, though they could acquiesce to bias as well. There is also the possibility of personal favoritism or disdain directed at a researcher, which is why some journals prefer that researchers remain anonymous during the review process. While science is cold and detached, those executing it are still susceptible to human frailties.

From this regrettable foible, we move to outright fraud. Unscrupulous researchers have been known to review their own work under an assumed name, then praise it mightily. In the most extreme case yet uncovered, mathematics journal editor Mohamed El Naschie reviewed several articles of his own before publishing them. Although more effort is now made to verify the existence and credentials of reviewers, there are more than 100 articles published in peer-reviewed journals that were, in fact, not peer-reviewed.

Another issue is vanity publishing, where a journal will print anything for a price. This has been standard practice in the novella and poetry business for years, and the tactic has been adapted for scientific publishing. To illustrate how easily the system can be taken advantage of, skeptics have gotten articles published that consisted of lorem ipsum or long strings of gibberish, said to be authored by Simpsons characters. While these were not, of course, peer-reviewed journals, the fact that they are advertised as such harms the reputation of the process.

These journals with technical-sounding names that bypass peer review are the bane of science and a lifeline for alternative medicine gurus, anti-vaccine activists, and creationists. They can be aided by lackeys in mainstream media who fail to properly vet news of miracle cures and government-pharmaceutical industry cover-ups. Even the Wall Street Journal allowed a creationist to write a column endorsing the idea.

Despite these shortcomings, the peer review process remains the most objective and preeminent way to assess scientific work. And while laudable, peer review is only a preparatory step to help ensure research methods are satisfactory and that the Scientific Method is followed. Flaws could still slip by well-meaning, skilled reviewers, but the goal is to keep the process transparent. Peer review and publication mean a paper’s conclusion is plausible, but they do not by themselves validate the theory. Replication and experimentation should continue.

The importance of the process has even been recognized by the legal community. In Daubert v. Merrell Dow Pharmaceuticals and subsequent cases, the courts have ruled that research must be sound, independently reviewed, and use unbiased methodology before it can be considered science. This is crucial for some types of cases, where the validity of a potential cure that a hospital refused to administer may be the focal point of a malpractice suit.

Like mediums who refuse the James Randi challenge or Therapeutic Touch practitioners who decline to participate in double blind studies, self-described researchers who never engage in peer review have a litany of excuses for not doing so. Creation.com goes on for a few thousand words ostensibly explaining why it won’t submit, though it never answers the question. It meanders on about supposed flaws in the peer review publication process and about the majesty of God’s creation, but the closest it comes to explaining why it won’t submit is saying that science writers and editors are biased against them. Creationists’ complete lack of merit, education, and competence in biology and cosmology are deemed nonfactors.

Among the reasons creationists would not be published in Science, Nature, or a lesser journal is because their field is not falsifiable. There is no way to test the alleged framework of the process, and it lacks any predictive quality.

Hence, creationists have created their own print products and deemed them peer-reviewed journals. Persons who ignore all contrary evidence and cite the Bible as science peruse tripe from their brethren and then publish it. From a linguistic standpoint, this could probably be called a peer-reviewed journal. But it is not a scientific peer-reviewed journal. The same applies to alternative medicine publications and periodicals dedicated to psychic awareness.

Answers in Genesis posits that the complexity of life is proof of their position, and that it defies belief that the nervous system could have come from anything but God. The circular reasoning of the first point and the logical fallacy of relying on personal incredulity to bolster the second point have been addressed in other posts. But here, the premise is that since creationism cannot be falsified, there would be no reason to examine the evidence, or to consider it for peer review.

As it so happens, this blog is not a peer-reviewed journal, so I can examine the evidence. The proofs for creation put forth by Answers in Genesis are “miracles,” the “marvelous reflection” of that creation, and that “things looked designed.” Those are the arguments of those who contend heathen hatred and bias are the only reasons their ideas are considered inferior to those of Hawking, Sagan, and Einstein.

“Thoughts on thinking” (Critical thinking)

OK, boys, girls, and any ancient aliens stopping by Earth for a return visit, time for another critical thinking spotlight, starting with a few more formal fallacies. We are concerned with the argument’s form, not its content. It is possible to use correct premises, reach a correct conclusion, and still commit a logical fallacy.

The classic form of argument goes major premise-minor premise-conclusion. Used correctly, it looks like this:

No men are women.
Lyle is a man.
Therefore, Lyle is not a woman.

Here is an example of it being used incorrectly, in the form of the fallacy of exclusive premises:

No men are women.
Some women are not electricians.
Therefore, some electricians are not men.

Every line here is correct, but the logic remains flawed. There is nothing in either premise that supports the conclusion. The distinction between men and women has no relevance to what percentage of electricians are men. A key point is that two negative premises can never yield a valid conclusion.

Building a point in this way requires limiting the terms to three. In the four-term fallacy, an extraneous element is introduced, usually by way of equivocation. This is using the same word, but with a different meaning both times, such as this:

Nothing is more important than good health.
A corndog is better than nothing.
Therefore, a corndog is better than good health.

In the major premise, “nothing” is used to indicate the premium value of good health. In the minor premise, it is used to establish that a corndog’s value is more than zero.

A related fallacy, the quantifier shift, occurs when quantifiers are wrongly transposed. For example:

“Everybody has something to believe in. Therefore, there is something that everybody believes in.” It is true that every person believes in some type of idea, but not true that there is a specific concept that everyone adheres to.

Next, consider the proof by example, in which one or two examples are presented as proof of a broader statement. Entire conspiracy theories are built on this faulty premise. It is also common in political and social discourse. You might see it in this form:

“Pol Pot and Joseph Stalin committed mass murder, so atheists are responsible for genocide.” In the interest of balance, it could also be, “94 percent of Nazi Germany was Christian; therefore followers of Jesus endorse the Holocaust.”

Now we’ll tackle some informal fallacies. These are arguments whose premises fail to support the conclusion. The problem usually lies in the content of the reasoning rather than in the argument’s logical structure.

One of the more common is Begging the Question, also known as Circular Reasoning. This happens when the speaker attempts to prove something that is included in the initial premise of an argument. Put another way, a proposition which requires proof is assumed without offering this proof.

As Aristotle said, “Begging or assuming the point at issue consists of failing to demonstrate the required proposition.” That pretty much says the same thing as above, but I wanted to reference a Greek philosopher to seem more impressive. Anyway, here is some question-begging:

“Children’s memories of previous lives confirm the existence of past lives because there would be no other source for these memories.”

The conclusion is that past lives exist. However, the premise starts with the same assumption. Saying the memories could have no other source than a past life is assuming past lives exist. The speaker has to argue for this, not be conceded the point.

Another example would be, “President Reagan was a great communicator because he had the knack of talking effectively.” Great communicator and talking effectively are synonymous. Using one to support the other is circular reasoning. To support the claim, the speaker should say something like, “Ronald Reagan could articulate complex ideas in simple terms, which is one of the reasons he was a great communicator.”

Now we’ll consider the hasty generalization, in which one reaches a conclusion based on insufficient evidence. “I know a guy who has driven drunk nine times and never hurt anyone, so it’s safe.” It is unreasonable and dangerous to conclude drunk driving is safe, based on the experiences of one person out of six billion.

Also, be wary of survey results. They could be based on sample size that is too small. Or they could be selectively chosen from one out of 100 surveys that had desirable results. Or the result could be cherry picked, where other findings in the same survey are ignored for their inconvenience.

Phrasing is also crucial. This was demonstrated when caller ID was about to become commercially available. Surveys revealed either overwhelming support or overwhelming opposition, depending solely on if the question was, “Would you like to have the number of the person calling you displayed,” or “Would you like to have your number displayed when calling someone?”

We will close with the Red Herring, which is one of the easiest fallacies to spot and involves skirting the issue. Debates about whether President George W. Bush had blundered by invading Iraq often devolved into, “We need to support the troops,” which was irrelevant to the question. The Red Herring often displays a lack of responsibility, as in, “I got a ticket for fishing out of season. Don’t the cops have real criminals to focus on?” Or, “I’m not getting enough people to my skeptic blog because of these people believing in ghosts and ESP.”

“Probabilistic missiles” (Critical thinking)

It’s been fun spooking ghostbusters and searching for proof that Loch Ness Monster hunters exist. But I’m taking a respite from those pursuits to focus a few posts on encouraging critical thinking.

There are probably at least 100 types of logical fallacies, so I will focus on specific types in different posts as a way of organizing them. Fallacies fall into two main categories: Formal and informal. A formal fallacy is wrong because of a flaw in the argument’s logical structure.

Let’s say one is making a deductive argument, which is one based on a series of premises that reach a conclusion. It is possible for each premise to be correct and still arrive at an incorrect conclusion. For example:

1. My children are happy when I take them to Toys R Us.
2. My children are happy.
3. Therefore, I took them to Toys R Us.

Although the conclusion may be true, it could be false.

Seldom would such an elementary example be presented, but we can learn to detect it in more subtle forms. A politician might say his opponent believes in high taxes and government involvement in most aspects of life. Then he would add this is how it is done in communist countries, and let audience members infer that the opponent is a communist sympathizer.

Going back to the toys argument, here it is in valid form:

1. If I take my children to Toys R Us, they will be happy.
2. I took my children to Toys R Us.
3. Therefore, my children are happy. (By the way, does anyone know how to make a backwards R in computer type? It would be useful when referencing this store or Korn).

It is also possible for a conclusion to be false despite being supported by correct premises:

1. If Bill Clinton was British prime minister, he would be a head of state.
2. Bill Clinton was a head of state.
3. Therefore, Bill Clinton was British prime minister.

Again, it would be rare that such an obviously wrong example would be foisted. (That’s my past tense verb for the day, foisted). But a trained ear can spot a less glaring instance. Creationist Ray Comfort points to order in the universe as proof of God, without bothering to demonstrate that a god is the only way order could be attained. This is the formal, propositional fallacy of Affirming the Consequent, which will be handled more in-depth during a subsequent post. The focus this time is probabilistic fallacies. These occur when a listener wrongly takes something for granted because they deduce that it would probably be the case.

We’ll start with the Base Rate Fallacy. Here, if given both general and specific information, a person tends to focus on the latter. Let’s say all we know about George is that he wears black, has multiple piercings and tattoos, and listens to Deicide. Is he more likely to be a Christian or a Satanist?

Most people would underestimate the probability of his being a Christian, and overestimate the probability of his being a Satanist. Doing so requires downplaying the Base Rate: there are about 500 times more Christians in the world than Satanists. George is certainly more likely to be a Satanist than someone who favors blue, has no piercings or ink, and listens to Count Basie, but he is still more likely to be a Christian than a Satanist. It is also worth noting that only two possibilities were offered; George could be an atheist, agnostic, apatheist, Druid, or Pagan without being either.
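
A back-of-the-envelope Bayes calculation makes the point concrete. This is only a sketch: the 500-to-1 base rate comes from the paragraph above, while the chances of fitting George’s profile within each group are numbers I invented purely for illustration.

```python
# Rough Bayes sketch of the George example. The 500:1 base rate is from the
# post; the profile likelihoods are invented for illustration only.
christians_per_satanist = 500          # base rate: Christians per Satanist
p_profile_given_christian = 0.004      # assumed: 1 in 250 Christians fit George's profile
p_profile_given_satanist = 1.0         # assumed: every Satanist fits it

christian_matches = christians_per_satanist * p_profile_given_christian   # 2.0
satanist_matches = 1 * p_profile_given_satanist                           # 1.0

p_christian = christian_matches / (christian_matches + satanist_matches)
print(f"P(Christian | profile) = {p_christian:.2f}")   # 0.67: the base rate still wins
```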

We also see the Base Rate Fallacy when a person decides for safety reasons to travel by road or rail instead of air. An airplane is the safest method of travel, but since surviving an airplane crash is a far more remote possibility than making it through a bus or train wreck, persons fall prey to the fallacy.

The Conjunction Fallacy assumes that specific conditions are more probable than a single, general one. The most well-known example was offered by psychologists Daniel Kahneman and Amos Tversky, who presented the Linda Problem. In it, Linda is a 31-year-old, single, intelligent, philosophy major who has been active in liberal causes. Based on this information, which is more probable: That Linda is a bank teller; or, that Linda is a bank teller who is active in the feminist movement?

Most would guess the second choice. But the probability of two events occurring in conjunction is always less than or equal to the probability of just one occurring. Even though there is a tiny chance that Linda is a bank teller, and a good chance she is active in feminism, the chance that Linda is a bank teller feminist is less than her being just a teller.
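
The arithmetic behind the Linda Problem fits in a few lines. The probabilities below are invented; the point is that the conjunction can never beat the single condition, whatever numbers you plug in.

```python
# P(teller AND feminist) = P(teller) * P(feminist | teller); the second
# factor can never exceed 1, so the conjunction can never exceed P(teller).
p_teller = 0.02                   # assumed small chance Linda is a bank teller
p_feminist_given_teller = 0.95    # assumed high chance she is a feminist if she is a teller

p_teller_and_feminist = p_teller * p_feminist_given_teller
print(p_teller_and_feminist)                # 0.019
print(p_teller_and_feminist <= p_teller)    # True, for any choice of numbers
```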

Next, we have the Hot Hand Fallacy, and its opposite, the Gambler’s Fallacy. In the former, persons think that a run of events is likely to continue, whether it be a basketball player’s shots, consumer trends, or the stock market. By contrast, in the Gambler’s Fallacy a person thinks a trend is likely to end. Both ideas are wrong since random events cannot be predicted with certainty.

The Hot Hand Fallacy causes fans to think a basketball player is likely to hit or miss based on his performance in the previous few minutes. However, each shot is a separate occurrence, so the chance of a player making it is the same whether or not he sank the previous one. A study of four years’ worth of NBA field goal attempts bore this out.

Meanwhile, the Gambler’s Fallacy is the false belief that random numbers in a small sample will balance the way they do in large quantities. The best known example was at Monte Carlo in 1913, when the roulette ball landed on black 26 straight times. The casino made millions of francs off gamblers who figured red was surely coming up next. The chance of 26 straight black spins was about one in 67,108,864 (counting only red and black), but that is also the chance of any other specific red-black sequence over 26 spins; the all-black run only feels special.

When tossing a quarter, a run of five heads has just a one in 32 chance of occurring. But that’s before any tosses are made. After four straight heads, the chance of the next one coming up George Washington is one in two. This is reasonably common knowledge, but is harder to detect when the ideas become more abstract.
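
A quick simulation, under the usual fair-coin assumption, bears this out: among runs that open with four heads, the fifth toss still splits about 50/50.

```python
import random

# Simulate many five-toss sequences. Among those that open with four straight
# heads, the fifth toss is still an even split.
random.seed(1)
opened_with_four_heads = 0
fifth_toss_heads = 0
for _ in range(1_000_000):
    tosses = [random.random() < 0.5 for _ in range(5)]
    if all(tosses[:4]):
        opened_with_four_heads += 1
        fifth_toss_heads += tosses[4]

print(opened_with_four_heads / 1_000_000)          # ~1/16: chance of opening with four heads
print(fifth_toss_heads / opened_with_four_heads)   # ~0.5: the streak changes nothing
```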

The chance of any one person dying from a tightrope fall is very small. If a person did perish in such a manner, it would be the Gambler’s Fallacy for his brother to conclude, “I can tightrope walk with virtual impunity since the chance of two persons in the same family dying this way is infinitesimally small.” No one would reach such a ludicrous conclusion, but persons can employ logic just as faulty when dealing with more complex issues.

Finally, we will consider the multiple comparisons fallacy. Usually, this is the result of politicians, columnists, or advertisers using selective reporting to bolster a claim, rather than the result of a lack of logical thinking on the part of the listener.

Suppose 100 studies are done on the impact of wearing black slacks on contracting Alzheimer’s. Ninety-four of the studies show no impact. Three studies indicate wearers are twice as likely to have the disease, while the other three show they are half as likely. As a result, a clothing advertisement makes the boast, “Studies show black slack wearers less susceptible to Alzheimer’s.”
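
A simulation shows how easy the boast is to manufacture. The disease rate, group sizes, and the cutoffs for calling a study "positive" below are arbitrary choices for illustration; the point is that 100 studies of a nonexistent effect will still produce a few striking results in each direction purely by chance.

```python
import random

# Run 100 studies of an effect that does not exist: each compares Alzheimer's
# counts between 500 slack wearers and 500 controls drawn from the same
# distribution. A few studies still look striking, in both directions.
random.seed(7)
BASE_RATE = 0.10      # assumed background rate of the disease
GROUP_SIZE = 500      # assumed subjects per group

def one_study():
    wearers = sum(random.random() < BASE_RATE for _ in range(GROUP_SIZE))
    controls = sum(random.random() < BASE_RATE for _ in range(GROUP_SIZE))
    return wearers / controls

ratios = [one_study() for _ in range(100)]
print(sum(r >= 1.4 for r in ratios), "studies make slacks look harmful")
print(sum(r <= 0.72 for r in ratios), "studies make slacks look protective")
# An advertiser only has to quote the flattering line and ignore the rest.
```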

I’m wearing black slacks right now and don’t have Alzheimer’s, which sadly is the best transitional, closing sentence I can come up with.

“Falsehoods, fallacies, and falafels” (Critical thinking)

Time again to give fortune tellers and magic spritzer water a break and focus a post on critical thinking. We will use some real life examples to illustrate different logical fallacies.

In the mid-1980s, a serial rapist terrorized elderly women in a Pittsburgh suburb. After investigations, interviews, and leads failed to turn up anything, police chief Chris Kelly consulted a psychic. “What did we have to lose?” he asked. “We’d tried everything else.”

What they had to lose were time and resources. But it’s his second line I want to focus on. The failure of traditional police methods was irrelevant to whether consulting a psychic was a valid option. Instead of having the psychic come up with a suspect, why not just grab a random man on the street? To justify the random man, psychic, or any other new method, one needs evidence that it would work. The police chief in this case employed a non sequitur form of reasoning.

This is a common occurrence with alternative medicine patients. Lack of success from traditional methods is insufficient reason to treat one’s backache with tachyon water and jasmine crystals.

Non sequitur sightings are also frequent in the Intelligent Design world, where an organism’s complexity is presented as proof that God did it. But just because a scientific explanation hasn’t been found doesn’t mean one doesn’t exist. That would be like saying persons in the 18th Century who considered flight impossible were correct since it wasn’t being done at the time. Besides, if an eye or a mollusk’s protection is so complex as to necessitate a designer, whatever created it would have to be more complex than what it created, so the creator would have to have a creator, as would that creator, ad infinitum.

We’ll now move on to the false dichotomy, but stick with the God stuff. Skeptic Andre Kole related the story of meeting a Christian magician who showed him a tract he had written entitled “Jesus: Magician or God?” Kole explained this was a false dichotomy. Jesus could have been neither a magician nor a god. Perhaps he was a great leader with excellent organizing skills, distinctions requiring no supernatural explanation. Or maybe he was embellished, fraudulent, loony, manufactured, or misquoted.

Kole read the tract and credited the author with making a good case that Jesus was not a magician. But it certainly didn’t follow that Jesus was therefore a deity. The frustrated author then asked why so many biblical prophecies had come true, thus employing a pair of logical fallacies: Moving the Goalposts and Appealing to Ignorance.

We move now to the regressive fallacy, which is the failure to take into account natural fluctuations. Stock market prices, golf scores, and chronic back pain inevitably go up and down. A person seeking relief from a throbbing elbow through chiropractics, magnetic belts, or chi-infused falafels is likely to do so when the pain is at its worst. If the pain lessens, the method will be credited, but it may be due to the natural fluctuations. This demonstrates the value of controlled double blind tests under strictly defined conditions. The plural of anecdote is not data. We need scientific tests, not testimonials, to gauge a medicine’s effectiveness.

We’ve beaten up on alternative medicine and creationists enough, so let’s address a logical fallacy common among conspiracy theorists. Proportionality bias is the belief that extreme events must have extreme, probably sinister, causes. It is a telling feature that there are seldom conspiracy theories surrounding something that failed. After substantial digging, I could only find a couple of sites that argued for a conspiracy surrounding the Ronald Reagan assassination attempt. And these gained no traction and were of little interest among the conspiracy minded. Even one of the more extreme sites, beforeitsnews.com, though sympathetic to the idea, noted that “there is no smoking gun.” (I guess John Hinckley’s didn’t count). But if Hinckley had succeeded, imagine the zeal with which conspiracy theorists would pounce on the idea that an obsessive, greatly disturbed lone gunman did the deed.

Then we have recency bias, which is placing more emphasis on what has happened lately and thinking it will continue. Many an amateur investor has fallen prey to this one, your blogger included. It’s easier to think current momentum will continue than to bother analyzing trends. Yet one year before the advent of the Internet, personal computer, or cell phone, few persons would have foreseen these devices.

Our brains have evolved to react more strongly and quickly to threats and fear than to flattery and soothing. This makes us vulnerable to negativity bias. Condoleezza Rice took advantage of this when she glossed over the total lack of evidence for weapons of mass destruction and pronounced this ominous vision: “The smoking gun of evidence for WMDs in Iraq could come in the form of a mushroom cloud.”

“Truly Large Numbing” (Coincidence and misuse of numbers)

Listening to 75 songs playing at random in my CD player, I wondered how many different orders they could play in. Finding the answer would be easy. I needed merely to e-mail a mathematics Ph.D. I knew.

I don’t know what the number is called, but I recall that it was over 2.4 billion googol. That is a Truly Large Number, which is the best segue I can come up with for moving into the meat of this topic. We’ll look at the Law of Truly Large Numbers and see how it relates to the skeptic movement.

The Ph.D., who incidentally was also the best miniature golf player I’ve ever known (reckon he was using geometry), was a good professor, as he explained the answer. Any of the 75 tracks could be played first, any of the remaining 74 second, 73 third, and so on. So the answer was arrived at by multiplying 75 x 74 x 73 x 72, all the way down to x 2. What had been an astronomical number shrouded in mystery and awe became an astronomical number that I understood.
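
For anyone who wants to check the arithmetic, the count is just 75 factorial. A couple of lines of Python confirm it is a 110-digit number, a bit under 2.5 billion googol.

```python
import math

orderings = math.factorial(75)   # 75 x 74 x 73 x ... x 2 x 1
googol = 10 ** 100

print(len(str(orderings)))       # 110: the count has 110 digits
print(orderings / googol)        # roughly 2.48e9, i.e. about 2.5 billion googol
```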

I have plenty of company in my mathematics ignorance. If you’re in a room with 25 persons, many people figure the chance of any two of them sharing a birthday must be tiny, just a few percent. But it’s not one person being compared against the other 24. Every one of the 25 is compared against every other, which makes 300 pairs, and each pair is a fresh chance at a match. That bumps the odds that at least two of them share a birthday to better than 50 percent.
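
The exact figure is easy to compute: find the chance that all 25 birthdays are different, then subtract from one. A minimal sketch, ignoring leap years:

```python
# Chance that at least two of 25 people share a birthday: one minus the
# chance that all 25 birthdays are different (365 equally likely days).
p_all_different = 1.0
for already_in_room in range(25):
    p_all_different *= (365 - already_in_room) / 365

print(1 - p_all_different)   # ~0.57: better than even odds
```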

I am substantially out of my element when dealing with advanced mathematics, but I recognize when digits are being misused, and it’s a regular occurrence when it involves Truly Large Numbers. The most frequent abusers are mediums and fortune tellers, but conspiracy theorists and creationists also get in on the numerical chicanery.

How this can be exploited was shown by British illusionist and TV personality Derren Brown, who presented a system for winning horse racing bets. He started with thousands of volunteers and, after each race, followed only the winners. At the end, only the top performer was presented on TV, with the system seemingly vindicated.
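
A short simulation shows why the "system" needs no clairvoyance. The pool size, the six-horse races, and the number of rounds below are invented for illustration; survivorship alone leaves someone looking unbeatable.

```python
import random

# Thousands of people each back a horse at random in a six-horse race.
# Keep only the winners and repeat; someone survives every round by luck alone.
random.seed(42)
still_perfect = 7_000   # assumed starting pool of volunteers
for race in range(1, 6):
    still_perfect = sum(random.randrange(6) == 0 for _ in range(still_perfect))
    print(f"after race {race}: {still_perfect} people have picked every winner")
# Expectation is 7000 / 6**5, i.e. about one person with a "perfect" record.
```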

The Law of Truly Large Numbers states with a large enough sample, many odd coincidences are likely to occur. More simply, with billions of people doing hundreds of things a day, it would be the most amazing thing ever if nothing amazing happened. This can be overlooked because people tend to seek meaning in life and prefer order over randomness.

But drawing supernatural conclusions relies on the Appeal to Ignorance. When making this appeal, the proponent, probably in all caps, will insist an occurrence is so unlikely that there can be no other explanation. But mathematician and author John Allen Paulos clarifies that rarity isn’t evidence. Imagine someone (let’s make it Harpo Marx) shuffling and turning over 52 playing cards. Whatever order Harpo lays them in, there was a one in 52-factorial chance, a 68-digit number, of that being the sequence when he began. Paulos notes it would be nonsense to conclude that Harpo could not have dealt them that way because the sequence was so improbable.

Anyone could see the absurdity of that conclusion. But it’s different when dealing with occurrences that have personal meaning. Then, the tendency is to ascribe genuine power to what the palm reader told you, what you prayed about, or what the telephone psychic predicted. In these instances, one holds tight to the subjective validation, and cognitive dissonance won’t let it go.

Uri Geller put down his bent spoons long enough to come up with a long list of “stunning coincidences” regarding the Sept. 11 attacks that added up to 11. One example: “September 11th is the 254th day of the year: 2+5+4 = 11.” Skeptic leader Robert Carroll countered with a list that contained many noncoincidences that added up to something other than 11.

A person may dream of an airplane crash the night before a wreck happens. With six billion people having multiple dreams each night, someone is going to have a dream about an airplane crash every day. I’ve had over 100 myself. At some point, someone will probably also hit on the place and airline. To call this proof of clairvoyance would require ignoring the many more dreams that don’t come true, or the things that happen without being dreamt about. Also, dreams are often vague or ambiguous, allowing many interpretations.
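
Even a deliberately low estimate makes the point. The one-in-100,000 dream rate below is a guess for illustration, not a measured statistic.

```python
# Back-of-the-envelope count of airplane-crash dreams per night. The dream
# rate is an invented, deliberately conservative guess.
population = 6_000_000_000
p_crash_dream_per_night = 1 / 100_000   # assumed: 1 dreamer in 100,000 per night

expected_per_night = population * p_crash_dream_per_night
print(expected_per_night)               # 60,000 such dreams somewhere, every night
```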

Highlighting an event’s rarity or unlikelihood is also a hallmark of the creationist. One difference from the others is that it usually also involves the misapplication of science principles. One example, from darwinsimrefuted.com: “For a 300-molecule-long protein to form by total random chance would be a one in 10 to the 390th power occurrence.” Throw in a straw man or two and the creationist instantly vanquishes evolution and biology. The misrepresentation here is that everything in the process is random. In actuality, evolution is a very slow process with many incremental steps, overseen by natural selection. As to the Truly Large Number correlation, I defer to evolutionfaq.com: “In the prebiotic oceans of early Earth, there were billions of trials taking place simultaneously as the oceans, rich in amino acids, were continuously churned by the tidal forces of the moon and the harsh weather conditions of Earth. Considered in this more comprehensive and inclusive way, the true odds are revealed.”

This month, an Israeli soldier was shot, with the bullet hitting a grenade in his pocket. The bullet was blocked by the grenade, which failed to detonate. This made the social media rounds, with prayer cited as the reason. There was no mention of the thousands of persons slaughtered in the conflict in spite of all this prayer. The dud grenade was cited as proof of miracles, without reference to the millions of non-miracle working grenades over the past century.

Highly improbable does not mean impossible, as every winning lottery ticket demonstrates. It also doesn’t mean anything is at work beyond the Law of Truly Large Numbers.

“Critical edition” (Critical thinking)

The stated purpose of this blog is twofold: To examine claims of the supernatural and paranormal, and to promote critical thinking. Yet only two of my 34 posts thus far have dealt exclusively with the latter. Taking jabs at Tarot Cards, ghost hunts, and crystal healing has proven just too tempting.

But in the interest of balance, we are overdue for a critical thinking spotlight. We will go through some bad argument forms to see how to recognize and avoid them. Remember, this refers to the arguments’ structure and not the positions taken.

Also, a person’s intelligence has little to do with their critical thinking skills. A 140 IQ could be a plus toward thinking critically, but only if it’s used correctly. Jerome Corsi, a Harvard Ph.D., uses his vast intelligence to lead the Birther movement. Stanford alum Bryan Fischer argues that shaking iPod ear buds in a box will disprove evolution.

To keep from falling into these traps, learn to recognize logical fallacies. A frequent one is “Affirming the Consequent.” It can take a form like this: “If Madam X is psychic, she could correctly predict the future. Madam X correctly predicted that there would be an airplane crash in Asia this summer. Therefore, Madam X is psychic.”

Even if the first two statements are true, it can lead to a false conclusion. This is because the first two statements fail to take into account that there could be other factors in play. In this instance, persons can make correct predictions based on knowledge of the subject rather than paranormal abilities. Or, common in the prophecy field, someone may get a few hits and many more misses. Or Madam X could have researched airplane crash histories to determine which times and places are most likely to experience one. In fact, this very method was used in the 1980s by a skeptic, who correctly predicted a crash, then revealed his methods.

Closely related to affirming the consequent is “Confusion of the Inverse.” It is commonly heard that 95 percent of accidents occur within 10 miles of home. Deducing, then, that being far from home makes one safer would be confusing the inverse. Most accidents occur within 10 miles of home because that’s where people spend the majority of their time. Besides accidents, most meals, entertainment, and exercise also occur within 10 miles of home. But being far from home won’t leave one hungry, bored, or lazy.

Another example: “People who sit at the front of the classroom make more A’s than those who sit in the back. So to make better grades, sit in the front.” But sitting up front doesn’t cause the grades. “A” students usually want to be close to the teacher and visual aids, whereas more casual pupils prefer the protective back rows, where they can pass notes and doodle.
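A back-of-the-envelope calculation, using entirely invented figures, shows why the 95 percent statistic says nothing about the danger of driving far from home:

```python
# Confusion of the inverse, with invented numbers for illustration.
p_accident = 0.001              # chance of an accident on any given trip
p_near_home = 0.95              # share of trips that stay within 10 miles of home
p_near_given_accident = 0.95    # share of accidents that occur within 10 miles

# Bayes' theorem: P(accident | location) = P(location | accident) * P(accident) / P(location)
p_accident_near = p_near_given_accident * p_accident / p_near_home
p_accident_far = (1 - p_near_given_accident) * p_accident / (1 - p_near_home)

print(f"P(accident | near home)     = {p_accident_near:.4f}")
print(f"P(accident | far from home) = {p_accident_far:.4f}")
# Both come out to 0.001: the "95 percent of accidents" figure is fully
# explained by where the driving happens, not by any extra danger near home.
```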

Another similar logical fallacy is “Denying the Antecedent.” This follows the form, “If A, then B. Not A. Therefore, not B.” The conclusion is doomed from the beginning because it imposes a false constraint: it assumes A is the only thing that can bring about B.

An example would be: “If it is rush hour, the interstate will be packed. It is not rush hour. Therefore, traffic will be light.” Again, two correct statements might lead to an incorrect conclusion. An accident or, more excitingly, a sinkhole, might have traffic backed up.

Then we have special pleading. Here, a person needs to carve out an exception to one of their arguments in order for the overarching point to be made.

When discussing the existence of God with creationists, one line I frequently hear is that something cannot come from nothing. When this theological table is turned, and the creationist is asked what caused God or how he came from nothing, the usual replies are “He just always was,” or “He didn’t need to be created.” This, of course, contradicts the point the creationist made in the first place.

Next, we have tu quoque, which is literally “You too.” When a U.S. District Judge prohibited county commissioners in Carroll County, Md., from using exclusively Christian prayers before meetings, aforementioned theocrat Bryan Fischer claimed this was Christian persecution. I committed a tu quoque by chiming in with, “This outrage coming from a man who wants to ban the construction of mosques.” But Fischer’s hypocrisy was unrelated to the claim he made. There were good arguments against Fischer’s position, but I failed to make them.

Two other logical fallacies are composition and division. In composition, the implication is that because something is true of the parts, it must be true of the whole. But because atoms are unseen, it doesn’t follow that a desk composed of them would likewise be invisible. Similarly, if a person at a soccer match stands up to improve his view, it doesn’t follow that if all his fellow fans join in, everyone will see better.

The inverse is division, which draws the false conclusion that because something is true of the whole, it must be true of the parts. One can safely consume salt, but not sodium or chlorine individually.

I hope this has been enlightening, and I’ve enjoyed writing it. But now I’m ready again to start tackling exorcists and reincarnation.

“A night of heavy thinking” (Critical thinking)

Most people would agree that it’s better to be objective and corrected than to be biased and wrong. But since the journey from the latter to the former is painful, many people avoid it.

A couple of ideas might make the trip less unpleasant. First, try separating yourself from your beliefs. They are not one and the same. Second, think of disagreements as collaborations, not conflicts. If discussing an issue with someone, both of you should be trying to find the truth. That’s where critical thinking comes in. This is using your knowledge and intelligence to reach the most reasonable position, while overcoming roadblocks to rational thinking.

These roadblocks include selective memory, especially when it strengthens one’s belief. If Maria is convinced that full moons lead to crazy behavior, she will notice whenever the two coincide. But she may pay no attention to full moons without crazy behavior or to crazy behavior without full moons. For an objective approach, Maria should first analyze crime patterns over the past five years to see if there is any change in criminal behavior during a full moon. Next, she should remember that correlation does not imply causation and consider various explanations.
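As a sketch of what Maria’s first step might look like, assume she can get nightly incident counts and a list of full-moon dates; the data, dates, and numbers below are all hypothetical.

```python
import statistics

# Hypothetical data: incidents per night, keyed by date, plus a set of
# full-moon dates. A real check would pull police records and an
# astronomical almanac covering the same five-year span.
incidents_by_night = {
    "2023-01-06": 14, "2023-01-07": 11, "2023-01-08": 13,
    "2023-02-05": 12, "2023-02-06": 15, "2023-02-07": 10,
    # the real analysis would cover every night in the period
}
full_moon_nights = {"2023-01-06", "2023-02-05"}

full_moon = [n for d, n in incidents_by_night.items() if d in full_moon_nights]
other = [n for d, n in incidents_by_night.items() if d not in full_moon_nights]

print("Mean incidents, full-moon nights:", statistics.mean(full_moon))
print("Mean incidents, other nights:   ", statistics.mean(other))
# A fair comparison would also need a significance test and controls for
# weekends, seasons, and holidays before concluding anything about the moon.
```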

Another hindrance to overcome is false memories. I was baffled when I couldn’t find the ID card I put in the slot between my seats three minutes prior. When I found it, it was not in the slot, on the car floor, or under the seats. It was in my wallet. A false memory had been created, and this involved only me and a short passage of time.

This phenomenon manifests itself in a tragic manner when a parent, perhaps dealing with a lack of sleep and a change of routine, creates a false memory of having left their child at daycare instead of in the back seat. After the Challenger disaster in 1986, a group of students was surveyed about where they were when they heard about the explosion. Asked the same question 10 years later, a majority gave the wrong answer. Knowing that memories are often manufactured to fill gaps in our recollection is a handy tool in one’s critical thinking kit.

Without this understanding, the thinker is liable to rely on anecdotes from others to bolster their views. A person convinced that dowsing can be used to find water will give more credence than is justified to an uncle’s friend’s story that he could do it. Testimonies can be subjective, inaccurate, even invented. Thrilling tales of encountering the Yeti or of a cousin’s trip to an amazingly prescient fortune teller do not prove the existence of the beast or the accuracy of the crystal ball. Anecdotal evidence is often an oxymoron. Resist basing judgments on testimony, especially as claims become more extraordinary. Let’s base our conclusions on the most likely reality, not the most appealing.

Also, beware the appeal to emotions. While not always meant to engender a reaction, words such as patriot, children, traitor, God, flag, socialism, terrorist, and white privilege can be an attempt to heighten feelings and to bias the listener for or against a topic or person. Instead of the emotive words, focus on the reasoning and facts behind the claim. A similar tactic is the false dilemma, where the speaker frames the only choices as good-or-evil, right-or-wrong, and with-us-or-against-us. This is common during war or national emergencies. When confronted with these deliberately limited choices, seek opposing viewpoints which may reveal the existence of other alternatives. These can include being for both, against both, neutral to both, or partially for or against both.

Another tactic to look for is the ad hoc hypothesis, where a rationale is invented to keep a theory from being disproven. James Randi was interviewing three men who claimed to be able to detect a type of water with supernatural properties, and they had a sample with them. Randi put the magic water in a container, then filled two similar containers with water from a tap. He proposed that the three men be put in separate rooms while the containers were rearranged; each would then enter the room individually and determine which held the magic water. There would be only a one-in-27 chance that the trio would all pick the lucky liquid, so if they did, it would be a good sign the claim was accurate. At this point, the experiment abruptly ended. The men claimed the magic properties had emanated from their sample and infiltrated the others. Put no stock in claims such as these, since they cannot be independently tested.
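The one-in-27 figure falls straight out of the setup: each man independently picks one container out of three. A tiny check, for the curious:

```python
from fractions import Fraction

# Each claimant faces three containers, only one holding the original sample.
# Guessing blindly, each has a 1-in-3 chance, and the three guesses are independent.
p_one_correct = Fraction(1, 3)
p_all_three_correct = p_one_correct ** 3

print(p_all_three_correct)         # 1/27
print(float(p_all_three_correct))  # ~0.037, about a 3.7% chance by luck alone
```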

It is also crucial to have a rudimentary understanding of the Law of Truly Large Numbers. With a large enough sample, many seemingly stunning occurrences are actually likely to happen. Telepathy, astrology, and clairvoyance all play on ignorance of this idea. The Lincoln-Kennedy assassination coincidences are another well-known example. The list puts stock in the presidents being elected 100 years apart, but ignores that their births and deaths are separated by uninteresting spans of 108 and 98 years. With practice, the listener can begin to recognize when numbers are being used correctly and objectively, rather than incorrectly and with bias to support an argument.
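A rough calculation, with invented figures, shows why such coincidence lists are so easy to compile: trawl enough facts and a handful of “hits” is exactly what chance predicts.

```python
# Expected "eerie" coincidences when trawling many facts about two presidents.
# Both figures are invented for illustration.
facts_compared = 500    # names, dates, staff, successors, theaters, cars...
p_match = 0.02          # chance any one fact lines up in a quotable way

expected_hits = facts_compared * p_match
print(f"Coincidences expected by chance alone: {expected_hits:.0f}")
# Ten or so "amazing" matches out of hundreds of comparisons is what chance
# predicts -- and the hundreds of misses never make the list.
```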

Another barrier to overcome is the post hoc fallacy: falsely asserting that because one event happened around the same time as another, the two are related. Astrologer Valerie Livina wrote that a lunar eclipse caused deadly fires in Australia. Skeptic leader Dr. Robert Carroll replied, “How do we know that the fires didn’t cause the eclipse?” We don’t know that, of course, but there is no more reason to believe it than to believe the eclipse caused the fires. To think critically, employ Occam’s Razor: identify the possible causes and effects, beginning with the most likely.