Fun with Logical Fallacies

We came across a cool book recently called Logically Fallacious: The Ultimate Collection of Over 300 Logical Fallacies, by a social psychologist named Bo Bennett. We were a bit skeptical at first — lists like that can lack thoughtfulness and synthesis — but then were hooked by a sentence in the introduction that brought the book near and dear to our hearts:

This book is a crash course, meant to catapult you into a world where you start to see things how they really are, not how you think they are.

We could use the same tag line for Farnam Street. (What was that thing about great artists stealing?)

Logically Fallacious is a fun little reference guide to bad thinking, but let’s try to highlight a few fallacies that seem to arise quite often without enough recognition. (To head off any objections at the pass: most of these are not strict logical fallacies in the technical sense, but rather examples of bad reasoning.)

Logical Fallacies

No True Scotsman

This one is a favorite. It arises when someone makes a broad, sweeping claim that a “real” or “true” so-and-so would only do X or would never do Y.

Example: “No true Scotsman would drink an ale like that!”

“I know dyed-in-the-wool Scotsmen who drink many such ales!”

“Well then he’s not a True Scotsman!”

Problem: The problem should be obvious: it’s a circular definition. A “True Scotsman” is defined as anyone who would not drink such ales, so anyone who does drink them is simply excluded from the category, and so on. It’s non-falsifiable. There’s a Puritanical aspect to this line of reasoning that almost always leads to circularity.

Genetic Fallacy

This doesn’t have to do with genetics per se so much as the genetic origin of an argument. The “genetic fallacy” is committed when you dismiss someone’s argument based solely on some aspect of their background or the motivation behind the claim.

Example: “Of course Joe’s arguing that unions are good for the world, he’s the head of the Local 147 Chapter!”

Problem: Whether or not Joe is the head of his local union chapter has nothing to do with whether unions are good or bad. It certainly may influence his argument, but it doesn’t invalidate his argument. You must approach the merits of the argument rather than the merits of Joe to figure out whether it’s true or not.

Failure to Elucidate

This is when someone tries to “explain” something slippery by redefining it in an equally nebulous way, instead of actually explaining it. Hearing something stated this way is usually a strong indicator that the person doesn’t know what they’re talking about.

Example: “The Secret works because of the vibration of sub-lingual frequencies.”

“What the heck are sub-lingual frequencies?”

“They’re waves of energy that exist below the level of our consciousness.”

“…”

Problem: The claimant thinks they have explained the thing in a satisfactory way, but they haven’t — they’ve simply offered another useless definition that does no work in explaining why the claim makes any sense. Too often the challenger will simply accept the follow-up, or worse, repeat it to others, without getting a satisfactory explanation. In a Feynman-like way, you must keep probing, and if the probes reveal more failures to elucidate, you can likely reject the claim, at least until real evidence is presented.

Causal Reductionism

This reflects closely on Nassim Taleb’s work and the concept of the Narrative Fallacy: an undue simplification of reality into a simple cause-and-effect chain.

Example: “Warren Buffett was successful because his dad was a Congressman. He had a leg up I don’t have!”

Problem: This form of argument is used quite frequently because the claimant wishes it were true or is otherwise comfortable with the narrative. It resolves reality into a neat little box, when actual reality is complicated. To address this particular example: extreme success on the level of a Buffett clearly has multiple causes acting in the same direction. His father’s political career is probably way down the list.

This fallacy is common in conspiracy theory-type arguments, where the proponent is convinced that because they have some inarguable facts — Howard Buffett was a congressman; being politically connected offers some advantages — their conclusion must also be correct. They ignore other explanations that are likely to be more correct, or refuse to admit that we don’t quite know the answer. Reductionism leads to a lot of wrong thinking — the antidote is learning to think more broadly and be skeptical of narratives.

Fallacy of Composition / Fallacy of Division

These two fallacies are two sides of the same coin. The first is thinking that if some part of a greater whole has certain properties, the whole must share those properties. The second is the reverse: thinking that because a whole is judged to have certain properties, its constituent parts must necessarily share them.

Examples: “Your brain is made of molecules, and molecules are not conscious, so your brain must not be the source of consciousness.”

“Wall Street is a dishonest place, and so my neighbor Steve, who works at Goldman Sachs, must be a crook.”

Problem: In the first example, taken directly from the book, we’re ignoring emergent properties: qualities that emerge from the combination of elements with more mundane innate qualities. (Like a great corporate culture.) In the second example, we make the mirrored mistake: we assume that each constituent part of the system must share the traits of the whole system (i.e., because Wall Street is a dishonest place, your neighbor must be dishonest), forgetting that greed may be emergent in the system itself, even arising from a group of otherwise fairly honest people.

***

Still Interested? Check out the whole book. It’s fun to pick up regularly and see which fallacies you can start recognizing all around you.

Crimes Against Logic: Exposing the Bogus Arguments of Politicians, Priests, Journalists, and Other Serial Offenders

Jamie Whyte

A lot of our day is spent trying to convince people of something. To do this we often make arguments about why our product or service is better or, more commonly, why our own opinion is right and theirs is wrong. But few of us understand the art of argumentation.

Crimes Against Logic: Exposing the Bogus Arguments of Politicians, Priests, Journalists, and Other Serial Offenders, a book by Jamie Whyte, “aims to help fill the gap left by the education system” in understanding the ways our reasoning can go wrong. It is “the logic equivalent of one of those troubleshooting guides in your car or computer manual.”

Errors in logic are not visible.

When a car breaks down, anyone can see that it has even if he knows nothing about how cars work. Reasoning is different. Unless you know how reasoning can go wrong, you can’t see that it has. The talking doesn’t stop, no steam emerges from the ears, the eyes don’t flash red.

Until Google invents a device that exposes our errors in reasoning, we need to rely on ourselves. And most of us don’t know much about the ways reasoning can go wrong. Whyte argues that we’ve become a nation of suckers.

Schools and universities pack their minds with invaluable pieces of information— about the nitrogen cycle, the causes of World War II, iambic pentameter, and trigonometry— but leave them incapable of identifying even basic errors of logic. Which makes for a nation of suckers, unable to resist the bogus reasoning of those who want something from them, such as votes or money or devotion.

Often, when we can’t tell good logic from bad we turn to cynicism, “discounting everything said by anyone in a position of power or influence.”

But cynicism is a poor defense, because it doesn’t help to tell good reasoning from bad. Believing nothing is just as silly as believing everything. Cynicism, like gullibility, is a symptom of underdeveloped critical faculties.

The Irrelevant Right

Jack has offered some opinion— that President Bush invaded Iraq to steal its oil, let’s say—with which his friend Jill disagrees. Jill offers some reasons why Jack’s opinion is wrong and after a few unsuccessful attempts at answering them, Jack petulantly retorts that he is entitled to his opinion.

The fallacy lies in Jack’s assumption that this retort is somehow a satisfactory reply to Jill’s objections, while, in fact, it is completely irrelevant.

Jack is just changing the subject to one of rights, not addressing the issue.

We consider our opinions to be sacred.

Many people seem to feel that their opinions are somehow sacred, so that everyone else is obliged to handle them with great care. When confronted with counterarguments, they do not pause and wonder if they might be wrong after all. They take offense.

So the next time someone says they have a right to their own opinion, mentally go back and see if they are addressing your argument or just changing the subject. If you really want to have fun, ask them what duties that right imposes on others.

Motives

When my sister was fifteen, she thought she had fat thighs. Occasionally, she would demand to know, “My thighs are fat, aren’t they?”

“No darling,” my parents would reply, “you have nice thighs; you’re a beautiful girl.”

Well, that confirmed it. “You’re just saying that!” was the constant refrain as my sister took our parents’ protestations to the contrary to confirm all her worst fears.

My sister was committing the Motive Fallacy. She thought that by exposing our parents’ motives for expressing an opinion— to make her feel better and shut her up— she had shown the opinion to be false. But she hadn’t. It is perfectly possible to have some interest in holding or expressing an opinion and for that opinion to be true. A man may stand to gain a great deal of peace and quiet from telling his wife that he loves her. But he may really love her nevertheless. It suits most to believe they are of better than average looks, and at least 44 percent of the 90 percent who believe this actually are. My sister’s legs were not fat. In other words, you don’t show someone’s opinion false just by showing that he has a motive for holding it.

This happens even when billions of dollars are at stake.

The Motive Fallacy is another way we end a debate. You don’t actually refute the other person’s position; you simply change the subject.

First, you are discussing some issue, such as whether my sister has fat thighs, and then, after the fallacy is committed, you find yourself talking about the motives of those involved in the discussion. Perhaps this is why the fallacy is so popular. It turns all discussions— be they about economic policy, religion, or thighs— into discussions about our alleged motives and inner drives.

Authority

The fallacy lies in confusing two quite different kinds of authority. There is the kind of authority your parents, football referees, and parking attendants have: the power to decide certain matters. For example, your parents have the power to decide when you will go to bed. Hence, in answer to the question “Why is 8:00 P.M. my bedtime?” the answer “Because I say so” is quite right; your parents are, quite literally, the authors of your bedtime. But it is not up to them whether or not Jesus was conceived without the help of sexual intercourse. Mary’s being a virgin at the time of Jesus’s birth is beyond the will of your parents, or indeed anybody else’s (with the possible exception of Jesus’s parents). So your father’s answer “Because I say so” is quite wrong when the question is “Why should I believe in the virgin birth?” The matter exceeds the scope of his parental authority.

Yet, there is another metaphorical sense of “authority” on which the answer “Because I say so” is sometimes reasonable, even when literal authority is absent, namely, the expert kind of authority. If someone is an expert on some subject (or an authority on the topic, as it is often put) then his opinion is likely to be true— or, at least, more likely to be true than the opinion of a non-expert. So, appealing to the opinion of such an authority— i.e., an expert— in support of your view is perfectly OK. It is indirect evidence for your opinion.

We can’t all be experts on everything. When laypeople sit around debating evolutionary biology, quantum physics, developmental economics, and the like, as the government’s reckless education policies mean they increasingly do, one of the best pieces of evidence likely to be put forward is simply “Because Nobel laureate Joe Bloggs says so.” …

The Authority Fallacy should now be clear. It occurs when the first literal type of authority, whereby someone has the power to make certain decisions, is confounded with the second metaphorical type, whereby someone is an expert and so likely to be right about some matter of fact.

Relating this to government and democracy, Whyte points out that the power of the people also comes with the ability to make wrong choices.

All democratic politicians agree that ultimate political authority lies with The People. On other matters they may disagree. One may think private schools an abomination, the other that the state should have no role in education. Each tries to convince the public that her view is right, knowing that popular opinion will decide the matter. But, “decide the matter” does not mean determine who is right. The People cannot do that; no one can by mere decision make a state monopoly on education superior to a private system, or vice versa. Public opinion decides the matter only insofar as it chooses which policy will be adopted. And the public is perfectly capable of choosing the inferior policy. If it were not, if popular opinion were invariably correct, then politicians would have no serious leadership role to play; government could be conducted by a combination of opinion pollsters and bureaucrats.

Spotting this fallacy is easy: simply ask yourself whether the source offered up as an authority is indeed an expert on the matter in question. If not, ask them to explicitly walk you through the argument.

Crimes Against Logic goes on to introduce you to other logical fallacies that you and others use every day. If you’re interested in improving your own arguments and spotting errors in the arguments of others, this is a good starting place.