Logical Fallacy Shoot the Moon

May 2nd, 2010 by Potato

Back in November, Steven Novella at Neurologica blog had this clever turn of phrase: “Nice straw man. I can see Maher wants to hit as many logical fallacies in one post as possible. Maybe he thinks it’s like shooting the moon in hearts – if you get them all, you win.”

As a scientist, I recognize that debate is an important part of life. Ideas are fire-hardened when tested and defended against (constructive) criticism. Anyone can be wrong, and it’s important to arrive at the truth (even if that means admitting you were wrong). After all, being wrong can cost you money or lead to poor treatment, while the truth, as the saying goes, will set us free. You need to test your assumptions, methods, and facts to make sure that you’re not fooling yourself, and that others are not fooling you.

Indeed, this is what the whole idea of peer review is all about: if 3 experts in the field can look at your work and not find anything so wrong with it as to prevent it from being published, it’s probably worth other people’s time to read. The criticisms they do come up with can be used to improve the work.

However, while intelligent debate can be useful, productive, and enlightening, there are a number of people out there who have no intention of engaging in constructive debate. They want to argue. They believe they are right and just want to bully other people into thinking they are right too. The truth matters little to them, and you cannot change their minds. They will often use the following logical fallacies to try to win* their case.

* – and note that winning is not the same as finding/proving the truth.

You’ll notice a theme in many of the logical fallacies: rather than dealing with the idea or the data directly, people will instead focus on issues surrounding the core debate, such as the person making the assertion.

Ad hominem: attacking the person delivering the point, rather than the point itself. For example, “Garth Turner is an arrogant blowhard who was totally wrong about Nortel and has been wrong about everything. Therefore the real estate market in Canada will not crash.” Well, all of that may be true (and, in partial defence of Garth, only by degrees), but it doesn’t address the point: the housing market in Canada may very well crash anyway. Shooting the messenger doesn’t make the message any less true. Granted, as humans we have evolved some sophisticated neural mechanisms telling us not to trust what untrustworthy people say, which is why ad hominem attacks can be effective in a war for the hearts (though not necessarily the minds) of the common people. But saying something bad about the person expressing an idea does not counter the idea, so you have to watch for that. If the best an opponent can come up with is attacking the messenger, the idea may have merit. However:

Bias is also something to watch out for. As Upton Sinclair put it, “it is difficult to get a man to understand something, when his salary depends upon his not understanding it.” Bias, on its own, is not necessarily a reason to give an idea up for dead. However, you’ve got to be especially careful with biased sources, and perhaps demand stronger proof from them. Your scepticism should be running at full tilt when reviewing data and arguments from biased sources; bias changes the balance of probabilities and how you weight evidence. Conversely, when people speak against their biases it may be a reason to perk up and pay attention, even if you don’t have much faith in the idea to begin with. Where things get particularly tricky is where pointing out biases blurs into ad hominem attack. For instance, global warming sceptics like to say that atmospheric scientists are biased because their funding agencies support the notion of global warming. But that’s a little specious, since there are a great many scientists out there funded by a wide variety of organizations: they can’t all have their livelihoods depend on toeing the line, and pretty much none of the government funding is contingent on finding particular results.

The argument from authority is a common one. Again, it works because we can’t all be experts on everything, and we’re brought up to respect the hierarchy. So when an authority figure says something is so, we may believe them — especially when that person is an expert in a field that we are not. Indeed, when we look to probabilities and the weighting of evidence, we may — rightly so, in most cases — rely more on evidence that comes from those with authority or respect in the relevant field. However, it’s not guaranteed that what an authority figure says is true; it’s not true just because an authority figure is saying it. So if that’s all you have to go on, it’s a little weak. This fallacy is sometimes twisted by those who are speaking against authority, by turning it into an ad hominem: “of course you’d say that, it’s what the man wants you to believe. You’re not going to believe in science just because a bunch of scientists tell you to, are you? They’re really just making arguments from authority.” It’s also manipulated by invoking a non-relevant authority: cf. all of the physicians and engineers who have signed petitions against global weirding, for instance.

We talk often about the “weight of evidence” — in an imperfect world you often have only a reasonable certainty of what the truth is, but can’t know it absolutely. Some, however, take this phrase to mean “he who shovels the most bullshit wins.” So you have to watch for irrelevancies thrown out just to make the pile look larger. This is especially useful for drawing someone down the rabbit hole: start with something true, even if it’s not relevant, and then continue with more specious arguments. Red herrings like this can also serve to distract from the real issue. One favourite distraction of global warming deniers is to throw out irrelevant facts such as “the Earth has had warming periods before” and “the amount of carbon dioxide released by humans is dwarfed by the amount released by natural processes each year.” It’s true that there have been warm periods in the distant past — considerably warmer than where we are now. There was also a period when the Earth was ruled by dinosaurs, and a time when no surface plant life existed. The Earth will be just fine whether or not global weirding occurs. The problem is that now we have humans and civilization and all our stuff, and we care very much about what happens to all of that — what happens if Hawaii is swallowed by the rising sea and we never find out what happens on Lost? Yes, life (even humans and cats) will probably survive if the current breadbaskets of the world turn to deserts and we have to start farming, or hunting squirrel, in what is now boreal forest… but the upheaval would be devastating. Likewise, natural processes do release orders of magnitude more CO2 every year than we do burning fossil fuels, but they also reabsorb that CO2. It’s part of a carbon cycle, whereas our emissions are partly one-way. The same goes for personal finance: your $60/month coffee habit may be dwarfed by your $2,000/month rent cheque, but if your rent is in balance with your salary, it’s the additional small net expenditures that will lead you into debt trouble…
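
To put rough numbers on that budget point, here is a minimal sketch (Python; all the figures besides the $60 coffee and $2,000 rent are invented for illustration): the gross flows tower over the coffee, but the coffee is what tips the net balance.

    # Hypothetical monthly budget; only the coffee and rent figures
    # come from the example above, the rest are invented.
    income = 2500           # after-tax salary, matched to the rent
    rent = 2000             # big, but in balance with the salary
    other_essentials = 450  # groceries, utilities, transit (hypothetical)
    coffee = 60             # the "small" habit

    net = income - rent - other_essentials - coffee
    print(f"net monthly cash flow: ${net}")  # -> $-10: the small habit tips it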

Changing the scale. This is a problem I’ve seen several times, though I don’t think it’s in the formal lists of logical fallacies. There is noise in the real world. You can have a theory that’s great at explaining a lot of things, such as Newtonian physics. However, it has a limited scope or scale where it applies: it does a poor job with atomic-scale phenomena, or those close to the speed of light. That doesn’t mean it isn’t correct or useful within the range of everyday experience. Likewise, people criticize global warming, exclaiming “how can we predict what will happen to the climate decades from now if we can’t even predict tomorrow’s weather!” — but of course, the global warming models aren’t designed to predict weather. That’s just noise to those models. The fact that the models that are designed to predict weather on the several-day timescale aren’t very good isn’t necessarily evidence that the climate models are junk. Likewise, people may question how an investor can know that over a 40-year timeframe they have a very good chance of making money when that same investor has no idea what the market will do the next day, year, or even decade. It can cut both ways: a theory or datum can be so narrow in its application, or require such astonishingly rare conditions to apply, that it is not really useful in the real world, and thus is not really worth arguing over. This can happen with, for example, paranormal or psychic phenomena that proponents claim are real, but which do not operate in the presence of sceptics’ negative energy. More generally, these tend to fall under the category of:

[Image: Flying Spaghetti Monster, from Wikipedia]

Non-falsifiable. In science, we generally do not consider explanations that are not falsifiable. “You can’t prove that it’s not true” may be considered by schoolyard braggarts and preachers to be a reason to believe something, but a theory that can’t be tested is not much of a theory (and if it can’t be tested, how much explanatory power does it have?). Note that there is a difference between theories that are difficult to falsify (for example, to really falsify the theory of global warming we’d just continue on our current path and see what happens — not really practical) and those that cannot be falsified (the many-tentacled and all-powerful Flying Spaghetti Monster and the ur-god Potato (from Whom we have been Blessed) do not like to be tested, and alter the results of your experiments, so it is impossible to prove or disprove Their existence).

Straw man: a popular method of shooting down an opponent’s point is to over-simplify it, and then discredit that superficially similar, over-simplified version instead, without really getting to the core of the actual theory. This is the straw-man attack, and the etymology should be obvious. Creationists like to use this to make evolution seem absurd, by saying that evolution implies dinosaurs having chickens come out of their eggs, or a monkey giving birth to a human. The Neurologica blog entry I linked earlier includes a straw man from the anti-vaccination side.

Common sense is, unfortunately, anything but common. There are many cases where our common sense betrays us: for example, before you learn how gravity works (and, if you’re not a scientifically-inclined person, after you leave school and forget), you’d think that a feather and a bowling ball would fall at different rates if dropped on the moon, since they obviously do so in everyday experience (with air resistance). Of course, that appeal to common sense would lead you astray. Likewise, in the long comment on global weirding at Netbug’s site, I debunked the common sense appeal of the deniers that “the sun is behind it all.” After all, they say, “the sun is the engine of the system”: it’s where all the heat energy comes from, and it’s many, many times bigger than the Earth. Unfortunately, the effects on our climate of changes in the energy coming from the sun are swamped by local effects, thanks to some large amplification factors. After all, Victoria, BC, and St. John’s, NL, are at nearly the same latitude and get roughly the same amount of energy from the sun, but their climates are vastly different. The common sense notion that the engine of the system makes all the difference doesn’t make sense!
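
A one-line worked equation shows why the feather and the bowling ball fall together (this is the standard textbook result, not something from the original post). Newton gives the gravitational force on a dropped object of mass m near a body of mass M as:

    \[ F = \frac{GMm}{r^2} \quad\Rightarrow\quad a = \frac{F}{m} = \frac{GM}{r^2} \]

The object’s own mass m cancels out, so on the Moon a feather and a bowling ball both accelerate at the same roughly 1.6 m/s², once there is no air resistance to separate them.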

One that is not among the lists of formal logical fallacies I’ve looked up, but which I find myself beating my head against constantly, is the issue of only looking at one side of the equation. This happens a lot with risk-vs-benefit decisions, whether it’s getting a flu shot or buying a hybrid car. Some people will focus solely on the risks or costs, and not the overall balance (or the context). This gets especially entertaining when you add unknown risks into the mix, because people become irrational about avoiding risks that are not completely known, especially if they also don’t have control over those risks. Yet the absolute level of risk is not an issue to them, since they willingly partake in demonstrably riskier behaviours regularly (such as crossing the street, driving a car, etc.).
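
As a toy illustration of weighing both sides, here is a minimal sketch (Python; every number in it is invented for the example, not a real flu or vaccine statistic). Tallying only the cost line is exactly the one-side-of-the-equation mistake:

    # Toy risk-vs-benefit tally. All numbers are hypothetical,
    # invented purely to illustrate the bookkeeping.
    p_flu_unvaccinated = 0.10   # hypothetical chance of catching the flu without the shot
    p_flu_vaccinated = 0.04     # hypothetical chance with the shot
    cost_of_flu = 500           # hypothetical cost in dollars (lost work, misery)
    p_side_effect = 0.001       # hypothetical chance of a bad reaction to the shot
    cost_of_side_effect = 1000  # hypothetical cost of that reaction
    cost_of_shot = 20           # hypothetical price of the shot itself

    expected_benefit = (p_flu_unvaccinated - p_flu_vaccinated) * cost_of_flu
    expected_cost = p_side_effect * cost_of_side_effect + cost_of_shot

    print(f"expected benefit: ${expected_benefit:.2f}")  # $30.00
    print(f"expected cost:    ${expected_cost:.2f}")     # $21.00

With these made-up numbers the shot comes out ahead; fixate on the cost line alone and you would reach the opposite, unsupported conclusion.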

Democracy is a fine way to get people to come to moderately agreeable compromises with each other. It is not, however, a way to arrive at the truth, especially since common sense isn’t so common. So appeals to popularity should not sway you: just because an idea is popular does not mean it is true. The reverse also holds: the presence of an appeal to popularity doesn’t make the idea false — the fact that a great many atmospheric scientists agree on the basics of global warming does not in and of itself serve as proof of global warming (but pointing out the appeal to popularity there is not a disproof, either).

Begging the question is a turn of phrase I never quite got my head around. However, I am quite familiar with tautologies and circular arguments, which is what this logical fallacy comes down to: the conclusion is assumed, openly or sneakily, somewhere in the premises, so the argument only ever proves what it started with.

False dichotomy: a favourite rhetorical device is to pretend there are only two options, that you’re either with us or against us. Except we all know that only a Sith deals in absolutes; there are usually compromise solutions and positions of middle ground.

While this list does not exhaust the many ways humans can abuse logic and debate, I will end with my very favourite: denying the antecedent. Formally, that’s concluding “not Q” from “if P then Q” and “not P”; informally, it’s fancy-talk for getting your assumptions wrong. I love looking at rules of thumb, since they can be so useful for everyday life. They all tend to come from somewhere, though, and they are only valid for certain sets of initial assumptions. For example, one rule of thumb is that it’s better to own your house than rent it, since your landlord is making a profit off of you. And for the most part that’s true; but it’s not always true. If the housing market gets out of control (and I have argued many times before that it has in Toronto) then your landlord, if he were to buy today, might not make a profit off of your rent. In which case, it would be better for you to rent. It’s very important to identify your assumptions; I can’t tell you how often that has been critical in science, especially when shortcuts and methods of estimating come into play (which are, after all, just fancy rules of thumb). Perhaps a more universal example would be the “rule of 72” for doubling times. It’s a rule of thumb that works fairly well over a range of annual returns, but if you a) can’t compound those returns (e.g., if you lucked into a bond with a high coupon one year, or bought stock at a market low, but in subsequent years couldn’t reinvest for the same rate of return) or b) are outside of that range, then it’s going to give you poor results.
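
To see where the rule of 72 holds up and where its assumptions break down, here is a quick sketch (Python; my own illustration, not from the post) comparing the approximation against the exact doubling time for a compounded annual return:

    import math

    def rule_of_72(rate_pct):
        """Rule-of-thumb years to double at a given annual return (in percent)."""
        return 72.0 / rate_pct

    def exact_doubling_time(rate_pct):
        """Exact years to double with annual compounding: solve (1 + r)^t = 2."""
        return math.log(2) / math.log(1 + rate_pct / 100.0)

    for r in (1, 2, 4, 8, 12, 24, 72):
        print(f"{r:>3}%: rule of 72 says {rule_of_72(r):5.1f} yr,"
              f" exact is {exact_doubling_time(r):5.1f} yr")

Around 8% the rule is almost exact (9.0 years either way), but at 1% it overestimates by a couple of years (72 versus about 69.7), and at 72% it promises a one-year double that actually takes closer to 1.3 years. Outside the moderate range, the rule’s assumptions no longer hold, which is exactly the denying-the-antecedent trap.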

In the spirit of shooting the moon, here is a checklist of all 13 logical fallacies covered above. If someone you’re debating with gets them all, then they “win”!

  • Ad hominem
  • Bias
  • Authority
  • True but not relevant/red herring
  • Non-falsifiable
  • Changing the scale/lost in the noise
  • Straw man
  • Common sense
  • One side of the equation/losing context
  • Appeals to popularity
  • Denying the antecedent/ignoring your assumptions
  • Begging the question/tautology
  • False dichotomy

3 Responses to “Logical Fallacy Shoot the Moon”

  1. Netbug Says:

    Nice post. 2 thoughts mildly on topic:

    1. This is what bothers me about the stigma associated with “flip-floppers”. I WANT someone who can change their mind and opinion with the emergence and presentation of new facts.

    2. This TED talk is also sort of on topic and I found very, well, not enlightening, because it’s just common-sense, but well presented, like your argument. :) http://www.ted.com/talks/michael_specter_the_danger_of_science_denial.html

    And why on earth is there this opinion among people that in order to have a positive relationship with someone you have to agree on everything? If we agreed on everything, what a boring world we would live in.

  2. Netbug Says:

    I also really like the illustrations.

  3. Potato Says:

    Oh, it looks like such a list was already produced by Doc Novella in an earlier post:
    http://www.theness.com/neurologicablog/?p=499

    Oh, well.