Recognising Dodgy Arguments: How values, biases and dodgy arguments mislead us (and how to make better arguments ourselves)


Part 1: You don’t have time for this! Using mental shortcuts on complicated issues

These days we have greater access to more information than ever before on big issues of importance, like how our food is grown and made, the introduction of new technologies and the safety of the products we use every day. But with a limited amount of time and energy, we are forced to use mental shortcuts and make judgement calls.

Shortcut thinking can be very handy because:

• We can use our own experiences to form ideas and opinions.

• We can identify patterns in the world, and that helps us make predictions.

• Identifying with people who share our values makes us happy, helps us feel safe and offers the basis of a community.


Shortcut thinking can cause problems, however, when:

• Our personal experiences are limited, and though they are a good place to start, they are just the start.

• The patterns we identify may not be real, leading to false predictions.

• We mistake a false belief for a correct one because we like or trust it, or where it came from.

While our personal values and experiences can be helpful in making decisions, relying solely on shortcut thinking can give us confidence in inaccurate information or lead us to fall for dodgy arguments. Unfortunately, this is far easier to recognise in others than in ourselves.

This guide shows how to take a step back and judge the accuracy of the information you’re receiving. Is it logical? Is it well argued? And what do ‘logical’ and ‘well argued’ actually mean?

“I was elected to lead, not to read.” President Schwarzenegger, The Simpsons Movie

Continued in Part 2: What is logical?

Part 2: What is logical? The boring bits and the basics

Logic is 1 + 2 = 3.

We all understand what ‘1’ and ‘2’ mean, and when you add them together, we understand that the outcome will be ‘3’.

The relationship between the numbers is logical.

Here’s another example: “Magnets attract iron; this object is made from iron; therefore this object will be attracted to magnets.”

This argument provides two ideas: “Magnets attract iron” and “This object is made from iron.” These are called premises. Combined, they lead to the logical conclusion “This object will be attracted to magnets.”

Just as 1 plus 3 does not equal 2, you cannot move this information around and have the same outcome: “Magnets attract iron; this object is attracted to magnets; therefore this object is made from iron.”

While this is structured logically, it doesn’t have a logical conclusion: it ignores the possibility that metals other than iron may be attracted to magnets. To draw a logical conclusion you’d have to say:

“Only iron is attracted to magnets; this object is attracted to magnets; therefore this object is made from iron.”

This has a logical structure and a logical conclusion, but the premise that “Only iron is attracted to magnets” is false information used to draw the conclusion.

Understanding how premises combine to form logical conclusions is an efficient way to judge ideas. Loosely tying one idea to another, or using false or oversimplified facts, is a common way to make an argument sound logical when it isn’t.
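You can even check small arguments like these mechanically. Here’s a minimal Python sketch of the rule: an argument is valid only if the conclusion is true in every situation where all its premises are true. It tests both magnet arguments by trying every true/false combination (the helper names implies and is_valid are just our illustration):

from itertools import product

def implies(p, q):
    # "If p then q" is false only when p is true and q is false.
    return (not p) or q

def is_valid(premises, conclusion):
    # Valid means: the conclusion is true in every case where
    # all the premises are true. We simply try all four cases.
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False, (p, q)  # counterexample found
    return True, None

# p: "this object is made from iron"
# q: "this object is attracted to magnets"
magnets_attract_iron = lambda p, q: implies(p, q)

# "Magnets attract iron; this object is iron; therefore it is attracted."
print(is_valid([magnets_attract_iron, lambda p, q: p], lambda p, q: q))
# -> (True, None): valid

# "Magnets attract iron; this object is attracted; therefore it is iron."
print(is_valid([magnets_attract_iron, lambda p, q: q], lambda p, q: p))
# -> (False, (False, True)): invalid

The counterexample it finds, p false and q true, is exactly the case the dodgy version ignores: an object that is attracted to magnets but isn’t made from iron.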

“Logic is the beginning of wisdom, not the end.” Commander Spock, Star Trek

Continued in Part 3: Bad arguments or good lies

Part 3: Bad arguments or good lies. Pulling apart straw man arguments

You may have heard someone say an argument is a ‘straw man argument’. This describes an argument that misrepresents an idea, often through oversimplification, exaggeration or an association with an unrelated topic. Straw man arguments sound good, but on closer inspection they fall apart. They can be used deliberately to fool you, or accidentally because of poor reasoning skills. Consider the following:

Opinion: ‘Did you know the measles vaccine actually contains the measles virus? I wouldn’t have my kids jabbed with it, getting the vaccine means getting measles.’

The information here is oversimplified: the measles vaccine does contain the measles virus, but in a weakened form that grows slowly. By being exposed to the virus this way, your body can build up a resistance to it, so if you encounter the real virus you have a greater immunity to it. This missing information changes the argument entirely.

Another example is:

Opinion: ‘Our research and outcomes have been replicated independently in other laboratories and so the findings are conclusive.’

Counter opinion: ‘Your research was funded by industry so these findings are biased. What about these other laboratories? Are they privately owned too? This is just an example of private interests taking control of science.’

The counter argument seeks to discredit the initial findings based on who funded the research, not on the methods or conclusions of the research. While there may be a question of credibility, connecting inaccuracy and bias directly to funding without addressing the research itself is a straw man argument.

“People without brains do an awful lot of talking, don’t they?” The Straw Man, The Wizard of Oz

Continued in Part 4: Stereotypes, the fantastic time-saver.

Part 4: Stereotypes, the fantastic time-saver. How values and bias can guide our trust and blind us

Some arguments focus on the person rather than on what they’re saying. Determining who you trust and who you don’t can be an important first step in evaluating information, but there is a difference between who a person is and what they are saying. The analogy often used is ‘playing the player not the ball’.

This happens a lot in debates about science. The scientific process relies on evidence and seeks to draw likely conclusions. But it is human nature to use a mix of emotions and logic to form opinions. Emotions are important in determining our values, and can influence which ideas we accept or reject. These elements can work against each other when we emotionally reject an argument despite logic showing it to be true.

For example, GM food often becomes a debate about what is natural or unnatural. But what is natural is often determined by a person’s values. So, the argument becomes based on personal perspectives rather than the science.

Psychological studies show that, when examining information, we accept what supports our existing beliefs or values and tend to reject everything else, regardless of how scientifically robust it is.1 So when we think we’re reasoning, we’re often rationalising: reaching a decision based on what we’re most comfortable with, not a careful weighing-up of the alternatives.

Be careful that you’re not falling victim to the mental shortcut of simply agreeing with a position that aligns with your values. Look closely at arguments you agree with and see if you can come up with arguments against them. Check whether the logic behind the argument is sound, and investigate the basis of the ideas used to justify it. Are they valid? Or is false information being used to back up the argument? Agreeing or disagreeing with something because it aligns with your values doesn’t logically make it right or wrong. A close, open-minded analysis of exactly why you think something can be useful.

1 Binder, Scheufele, Brossard & Gunther (2011). Interpersonal amplification of risk? Citizen discussions and their influence on risk and benefit perceptions of a biological research facility. Risk Analysis.

“Trust starts with truth and ends with truth.” Santosh Kalwar, author

Continued in Part 5: It happened to me once!

Part 5: It happened to me once! The gambler’s fallacy and false patterns

[Illustration: a coin toss; heads and tails each have a 50 per cent chance.]

Toss a coin six times and record how it lands. Then make a prediction about which side will land face up next. Invite a friend to also make a prediction, without showing them the record of past results.

Q: Who has the best odds of picking the correct side?

A: You both do.

Regardless of previous results, there is a 50 per cent chance of getting heads or tails on the next coin toss. But it is hard not to think a pattern is forming. The gambler’s fallacy describes the belief that past events affect the outcome of a current event when they don’t.
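Don’t take our word for it; a quick simulation makes the point. This minimal Python sketch tosses a fair coin a million times and asks how often heads follows a streak of three heads:

import random

random.seed(1)  # fixed seed so the run is repeatable

# One million fair coin tosses.
tosses = [random.choice("HT") for _ in range(1_000_000)]

# Collect every toss that immediately follows three heads in a row.
after_streak = [tosses[i + 3] for i in range(len(tosses) - 3)
                if tosses[i:i + 3] == ["H", "H", "H"]]

share_heads = after_streak.count("H") / len(after_streak)
print(f"P(heads after three heads) is about {share_heads:.3f}")
# Expect a value near 0.5.

The share stays around 0.5 no matter how long the streak: the coin has no memory.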

It’s not always easy to know when a pattern is forming or what it means. You have a headache, so you take some aspirin, drink some water, and lie down. You wake refreshed, headache gone. Was it the aspirin? The water? The lie-down? A combination of the three? Or something else entirely? Ideally, you’d test the scenario again and again, changing the combination of water, aspirin and sleep to be sure.
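As a toy illustration of why changing the variables matters, here is a short Python sketch with a deliberately made-up rule: in this pretend world, only the lie-down relieves the headache. Running all eight combinations of the three variables makes that visible in a way a single trial never could:

from itertools import product

def headache_gone(aspirin, water, sleep):
    # Made-up rule for illustration only: in this toy world,
    # relief depends solely on the lie-down.
    return sleep

# The ideal test: try every combination of the three variables.
for aspirin, water, sleep in product([False, True], repeat=3):
    print(f"aspirin={aspirin!s:<5} water={water!s:<5} sleep={sleep!s:<5}"
          f" -> headache gone: {headache_gone(aspirin, water, sleep)}")

Relief lines up with the sleep column and nothing else, which no single aspirin-plus-water-plus-sleep trial could show.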

People recount personal experiences as evidence, and personal experiences can be convincing, yet they are very limited as proof. You have a headache, so you take some willow tree bark extract, drink some water, and lie down. You wake refreshed, headache gone. Does willow tree bark cure headaches?

Science involves finding ways to test and challenge patterns. You develop a hypothesis to explain a pattern; then you find ways to test whether your hypothesis is wrong. Only once other explanations have been whittled away can you have confidence that a pattern is real, and that your hypothesis explaining it is, in fact, scientific.

Willow tree bark can indeed relieve headaches. People found it worked consistently for centuries before chemists in the 19th century identified salicylic acid as the active component causing the effect. Salicylic acid was later modified by acetylation to make acetylsalicylic acid, which is better known as aspirin.

“Let’s burn down the observatory so this never happens again!” Moe Szyslak, The Simpsons

Continued in Part 6: It’s just a theory!

Part 6: It’s just a theory! (Like gravity)

“Scientists don’t really know what’s going on; they’re just making stuff up.”

Ever heard that said? It’s true. But to be more accurate you’d say:

Scientists don’t know what’s going on, but they’re taking measurements of the world, repeating them multiple times in multiple ways to ensure those measurements are accurate, developing theories about why the world is the way it is, testing those theories, and changing them to match the results they find until they’re confident the theory is sound. Then other scientists test and challenge the theories, while others use them as the basis of new predictions and theories about the world.

Science may seem unreliable because its theories about the world change, yet this is its strength. Each change is like a sharpening of focus, presenting a slightly clearer picture of how the world works. Many theories are well established, surviving as observations and successful predictions based on them accumulate. Yet even the best theory must be abandoned if facts and predictions fail to match it. The best way to know you’re right is to acknowledge you may be wrong.

The strength of science is that it always leaves the door open to new possibilities and better theories. However, this can be misused to make strong theories seem weak and weak theories seem strong. For example, the theory of anthropogenic climate change has been developed over more than 100 years. The idea that our global climate is warming as the result of human action may seem recent, but it rests on this long history of scientific work, and climate models continue to develop. Arguments against it often focus on the lack of 100 per cent certainty.

The nature of science means 100 per cent certainty can never be achieved. The best scientists can do, with testing, predictions and more testing, is tell us they’re reasonably certain a theory is accurate. We need to make decisions based on the best evidence available, which often means going with the weight of evidence.

“Science knows it doesn’t know everything, otherwise it would stop.” Dara Ó Briain, Irish comedian

Continued in Part 7: A precautionary tale

Part 7: A precautionary tale

Where there is concern over the safety of a new technology, those responsible for the technology are expected to show that potential risks are being managed appropriately. This is called the precautionary principle.

When it comes to the environment, the precautionary principle states that decision-makers should always take action to limit environmental damage, erring on the side of caution when evaluating proposals that may seriously or irreversibly impact on the environment.2

The precautionary principle is about weighing the consequences of acting cautiously against those of doing nothing. It’s possible to take the precautionary principle too far by stopping an activity until there is complete proof of its suitability and safety. This could result in greater risks occurring that could have been addressed had the activity not been stopped, and it is at odds with how evidence-based science works.

To apply the precautionary principle fully you need to weigh the benefits of a technology against the risk and impact of something going wrong. For example, take genetically modified crops created to reduce the need for pesticides.


Researchers go to great lengths to limit the risk of genes flowing into wild plants, using methods such as barrier crops or sterile crops. There is always a possibility of missing something, but further research and testing take time and funding. Meanwhile, pesticides with a known detrimental impact on the environment could be reduced. Risks will never be zero, but they can be put into perspective against other, more certain, consequences.

Remember, science is a way to build confidence in an idea rather than to obtain certainty. Waiting for absolute proof might sound reasonable, yet it is an impossible criterion to meet.

An overly stringent use of the precautionary principle can mean fewer technological solutions to existing problems. Technologies such as nano water filtration, nano solar energy and biotechnologies all come with health and safety risks, but uncertainty doesn’t mean we should stop researching them. Technologies come with risks and benefits, and we should all be part of the decisions about which risks and benefits we are willing to accept.

2 http://www.aph.gov.au/Library/pubs/rn/1997-98/98rn04.htm

“If every conceivable precaution is taken at first, one is often too discouraged to proceed at all.” Archer J. P. Martin, Nobel laureate

Continued in Part 8: An indefensible defence

Part 8: An indefensible defence

Human psychology is a peculiar thing. Experiments show that we first seek out information that supports our existing positions and dismiss or ignore information that goes against them, and that once we adopt a position on anything we are very reluctant to give it up, even in the face of solid evidence that contradicts it.3 Indeed, in many cases, being shown contrary evidence can make people defend their original position more strongly.4 In doing so we can get stuck with increasingly indefensible positions. It’s better to re-evaluate your position as new information is presented, and to hold back on assumptions when doing so. This allows you to gain new knowledge, and because you are seeking out and willing to consider opposing information, you can be confident that the opinions you hold are well placed.

So how do you challenge dodgy arguments?

> Try to keep your arguments free from your own emotive biases.

> Frame your argument so it aligns with the values of the person you are talking to.

> Be willing to consider and accept opposing information.

> Remember that your argument must have a logical structure (you won’t convince anyone that 2 + 3 = 1).

3 Druckman & Bolsen (2011). Framing, motivated reasoning, and opinions about emergent technologies. Journal of Communication.

4 Nyhan & Reifler (2010). When corrections fail: The persistence of political misperceptions. Political Behavior.

“To argue with a person who has renounced reason is like administering medicine to the dead.” Thomas Paine, author


Suggested reading

Blur: How to Know What’s True in the Age of Information Overload, by Bill Kovach and Tom Rosenstiel

Thinking, Fast and Slow, by Daniel Kahneman

The Debunking Handbook, by John Cook and Stephan Lewandowsky

Online resources

The Critical Thinking video series: www.youtube.com/technyouvids

The Critical Thinking education resource: https://education.technyou.edu.au/critical-thinking

TechNyou Information Outreach Service: www.technyou.edu.au
