Imagine coming across a piece of reliable information that contradicts everything you’ve ever believed about, say, global warming or the war on terror. It would likely prompt the question: if you were wrong about such an important issue, what else could you be wrong about? What’s more, if you’ve been wrong about a bunch of things, then perhaps you’re not quite as well-informed as you had previously believed.
Thoughts like these are jarring ones because they threaten our sense of self — making us feel stupid, empty, even worthless. Unsurprisingly then, most people’s willingness to open up to new information depends largely on how this information will challenge or coincide with their preconceived notions of what is good or bad, right or wrong, true or false.
According to a study by researchers at the University of Waterloo, called Self-Affirmation and Sensitivity to Argument Strength, when people are presented with corrective information that runs counter to their ideology, those who most strongly identify with the ideology will intensify their incorrect beliefs. And as such, the greater the challenge new information poses to a person’s self-worth, the less likely it is to have any impact at all on them.
If there's something positive to draw from these uncomfortable realizations of our purposeful ignorance, it's that if we take the time to better understand why and how people think and feel the way they do, these inherent biases can be successfully mitigated and controlled.
And with this aim in mind, what follows — keeping in mind that I have likely succumbed to a few of these during the writing of this piece, just as you will during the process of reading it — are eight of the most commonplace logical fallacies that misinform our minds every day.
1. Backfire effect: As mentioned above, the more a piece of information lowers self-worth, the more likely it is to be rejected outright. Therefore, new information can create such ideological insecurity that people will manufacture counterarguments to the point that they overcompensate and become more convinced of their original views. Hence, instead of convincing someone to question an invalidated belief, fresh information can actually ‘backfire’ by strengthening the grasp a refuted opinion has on an individual.
Monkey see, monkey do. Image Credit: danmachold/Flickr
2. Status quo bias: We tend to be apprehensive of change, and this often leads us to make choices motivated by the desire to keep things as familiar as possible. This is because for most people the current baseline is taken as a reference point, and any change from that baseline is perceived as a loss. Needless to say, preference for the status quo represents a core component of conservative ideology – militarism, austerity and environmental exploitation are all-too-familiar attempts to hold on to the status quo.
3. Confirmation fallacy: We love to agree with those who agree with us. We visit websites that re-express our political opinions, re-read literature that reaffirms our cultural upbringings, befriend people with likeminded attitudes and form cohesive social circles based around similar key viewpoints. At the same time, we practice a reactive reasoning in that we undervalue, scrutinise and dismiss arguments, figures, and people that challenge our entrenched worldviews — after all, we are our own biggest censors.
4. In-group fallacy: Similar to the confirmation fallacy, due to our innate desire to be socially accepted, we tend to favour the thoughts, ideals and sentiments of those with whom we most racially and culturally identify. And conversely, this means we are suspicious, fearful and ignorant of the preferences, wants, needs and values of groups and peoples that we have difficulty identifying with — this goes a long way toward explaining why racism remains so rampant in liberal-democratic countries.
5. False consensus bias: As we cannot really experience anything outside of our own consciousness, we tend to believe most people think like we do. In group settings, false consensus biases cause us to accept that the opinions, preferences and values of our own group reflect the larger population. And since groups tend to reach a consensus and avoid those who dispute it, they believe everyone thinks that way. This is the sort of groupthink that convinces political extremists they have widespread support.
Put a stop to groupthink by jumping off the bandwagon. Image Credit: caffeina/Flickr
6. Bandwagon Effect: Opinions and viewpoints spread infectiously among people, meaning we are very likely to adopt a belief merely because lots of other people believe it too. In other words, people are both socially insecure and cognitively lazy. We don’t want to think for ourselves, and we often assume that if someone else has already adopted something, it can’t be bad. Even though the popularity of an argument has little bearing on its validity, we disregard our own judgements in an attempt to assimilate.
7. Current moment fallacy: A cognitive tragedy of the commons, we have a hard time imagining ourselves in the future and altering our behaviours accordingly. As such, most opt for gratification now, saving discomfort for later. This lack of self-control, where most people would rather exchange serious troubles in the not-too-distant future for more trivial pleasures in the moment, exemplifies the impulsive decision-making that is responsible for the financial meltdown, political corruption and developments that harm the environment.
8. Blind spot bias: Ironically enough, if you read this article thinking that these biases don’t apply to you, you might suffer from this logical fallacy, which makes us think that while biases may apply to others, we are immune to them. This is because when we assess ourselves for irrationality, we look inward, searching through our thoughts and feelings for bias. But biases operate unconsciously, so while we have little trouble pointing out the biases in others, it is exceedingly difficult for us to take note of our own.
But why go through all this trouble to point out the logical fallacies that seem to be driving ignorance and close-mindedness in our society? Well, the political implications of this sort of self-reflexive psychoanalytic exercise should be pretty obvious…
In the past year alone, Canadians have borne witness to half a dozen Senate corruption scandals, a spying agency that’s quietly collecting and sharing our personal information, the actual destruction of priceless scientific archives and a relentless war on science and knowledge — all of which serve to demonstrate just how ideological our government has become.
So as we inch closer to the 2015 federal election, it is our responsibility as democratic citizens to take note of the ways these logical fallacies — and the dozens of others we succumb to — can misinform our minds, and those of our political leaders, each and every day. For if we work at becoming a more cognizant and well-informed citizenry, it will spill over into the polling station, and with any luck, onto Parliament Hill as well.
Title Image Credit: Andrew Mason/Wikimedia Commons