Why we believe what we believe
How our brains process information and how brands are making the most of it.
May 5, 2017
Post-truth, fake news, false amplifiers, silos, alternative facts, information operations - it's hard to know who and what to believe anymore.
But before we can understand the who and the what, it's important to understand the why.
I was really looking forward to playing golf last Sunday morning. Golf being an outdoor activity, I opened my weather app to check the forecast for the day: rain and strong winds. Deflated, yet undeterred, I proceeded to scour the web until I found a website that predicted rain and winds but only in the late afternoon. This website suddenly became my most trusted source for weather. But why?
I went on a search for the truth about truth and found answers from Associate Professor Steven Roodenrys, a psychology lecturer at UOW, and Professor Charles Areni, UOW's Executive Dean of Business.
BREAKING NEWS: Sauces reveal leak in murder investigation.
We're lazy and time poor
There's a concept in psychology called dual process theory. It's based on the view that humans have two distinct systems of cognitive processing.
System one operates more or less automatically without reflection and is often referred to as our intuition. This system, as Professor Roodenrys explains, is often all that we need to get the job done.
"It's quick, easy and doesn't use much energy," he says. "The suggestion is we've evolved in this way to avoid heavy thinking. If you're in an environment where there's little food, you want to use as little energy as possible."
System two - reasoning - requires more rational, effortful thinking to process all the relevant information we have in order to make a decision or form an opinion.
"Some people are more reluctant to engage this second system than others," Professor Roodenrys says, in what seems to be a polite way of saying that some of us don't use our brain's reasoning powers as often as we should.
This reliance on system one - while making perfect sense for our early ancestors - can lead to problems when used to navigate the hundreds of articles on our social feeds every day. Taking the time and effort to separate fact from fiction on every post would be exhausting. That's why we often rely on our intuition, with varying degrees of success.
It's a cognitive loophole that Professor Areni says marketers exploit by developing a brand. "The most basic thing that brands do is they simplify decision making," he says.
A simple grocery shop can present the brain with hundreds of decisions to make. More often than not, we turn to system one to speed up the process.
"Brands are important because much of the differentiation between two products is the brand," Professor Areni says. "Given the constraints on time and information processing and given the wide variety of choice, something has got to give. Making a decision on the basis of a brand is a way to solve those problems, especially if you have had a favourable experience in the past."
We don't like to feel bad or uncomfortable
We all hold a whole bunch of different beliefs - from religion and politics to whether pineapple belongs on a pizza (it doesn't). However, if we went through and systematically catalogued them all, we might find that one belief directly contradicts another.
"If you were to think about these beliefs at the same time it would create an uneasiness - a cognitive dissonance," Professor Roodenrys says. Additionally, an event may happen that creates a new conflict that will need addressing. For example, you may support Donald Trump, but then see a video of him eating Hawaiian pizza, which you strongly despise.
Having two contradictory beliefs can make us feel so uncomfortable that we change our beliefs to rectify the contradiction. "We seek answers to reduce this dissonance," Professor Roodenrys says.
You may jump off the Trump bandwagon or give pineapple pizza another go. Alternatively, you'll construct reasoning to explain away the dissonance, such as telling yourself that Trump was merely being polite to a local chef, or that his taste in pizza toppings has little to do with his policies.
It may be a trivial example but it shows how our beliefs can be formed and altered just to stop us having to deal with discomfort. Often, we find the best way to reduce the possibility of cognitive dissonance is to limit the information we expose ourselves to, like not reading the nutrition label on a new cereal when we're trying to eat healthy.
"Consumers like to make decisions they can feel good about," Professor Areni says. "But feeling good about a purchase decision often involves ignoring information or not processing all of the information."
We like facts that align with our views
Confirmation bias. You've probably heard of it. It's the theory that we look for evidence that is consistent with our own view and avoid or disregard information that is going to challenge it. That type of evidence provokes anxiety and, as humans, we understandably prefer to feel good about ourselves (or our upcoming games of golf).
"Often we will simply filter out, or not pay attention to something that doesn't fit with our beliefs," Professor Roodenrys says. "It reinforces the idea that we're right and it makes us feel comfortable that we're part of a group that feels the same way."
While over the past 20 years we've seen a proliferation of media sources, the platforms we're using to digest them are often only feeding our biases.
"The algorithms that social media outlets use pay attention to what you've read and suggest similar things," Professor Roodenrys says. "For most people, to find an opinion that contradicts their beliefs on social media, they'll have to go and do very deliberate searches."
These self-curated silos, however, are nothing new. "While social media silos are perhaps smaller and the walls perhaps thicker, there's always been selected media exposure depending on one's preferences," Professor Areni says. "So when people say a Trump statement was clearly proven wrong, it depends on who's listening."
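To make that feedback loop concrete, here is a toy sketch in Python. It is not any platform's actual algorithm; the topics, posts and ranking rule are invented purely for illustration, but it shows how ranking a feed on nothing more than what someone has already read keeps dissenting posts out of view.

```python
# Toy illustration of the "more of the same" loop described above.
# NOT a real recommendation system: topics, posts and the counting
# rule are made up to show how reading history can narrow a feed.

from collections import Counter

def recommend(read_history, candidate_posts, top_n=2):
    """Rank candidate posts by how often their topic appears in the reading history."""
    topic_counts = Counter(post["topic"] for post in read_history)
    ranked = sorted(
        candidate_posts,
        key=lambda post: topic_counts.get(post["topic"], 0),
        reverse=True,
    )
    return ranked[:top_n]

# A reader who has only ever clicked on one side of a debate...
history = [{"topic": "anti-pineapple"}] * 5 + [{"topic": "golf"}] * 2

candidates = [
    {"title": "Ten reasons pineapple ruins pizza", "topic": "anti-pineapple"},
    {"title": "In defence of the Hawaiian", "topic": "pro-pineapple"},
    {"title": "Weekend golf forecast", "topic": "golf"},
]

for post in recommend(history, candidates):
    print(post["title"])

# The dissenting "pro-pineapple" post never makes the cut unless the
# reader goes looking for it deliberately - the silo effect in miniature.
```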
If something does break through the silo walls, just presenting the facts often isn't enough to change a person's mind. It can even cause a backfire effect, making them more set in their ways, something The Oatmeal does a great job of explaining in its comic on the backfire effect.
Confirmation bias is part of the reason Professor Areni believes Volkswagen will eventually overcome the 2015 global scandal over faked emissions tests. He says Volkswagen is a very strong brand with a large, loyal following. Consumers will see this bad behaviour as out of character and most will forgive or quickly forget this indiscretion, as it's easier than changing their beliefs.
"Somebody that has a conviction is great at selective perception," Professor Areni says. "In fact, one of the basic features of all humans is that when we form an expectation we're very reluctant to give that thing up."
Facts are facts, but our biases tell us otherwise.
BREAKING NEWS: Firefighters work to unscramble inner city disaster
We dislike uncertainty and ambiguity
Sometimes, believing is about relieving the anxiety that comes with the unknown. Whether we're driven by the discomfort of not knowing or simply an inherent curiosity, Professor Roodenrys says people vary in the amount of ambiguity they can tolerate.
"Most people will feel a certain amount of anxiety about uncertain situations," he says. "We would rather know what's happening or at least think we know. Even if something isn't true, if it satisfies the criteria of allowing us to think we understand, then people will often take that information on as fact."
Believing in something is easier than not knowing.
This desire for familiarity is something brands use to their advantage when launching a new product because marketers know we're going to choose a brand we already trust.
"Having a strong brand allows you to launch into new product categories because you're not asking consumers to accept a completely new product," Professor Areni says. "The consumer knows the brand, it's just the brand's being applied in a new product category - as long as you retain the essence of a brand, that stickiness."
People believed the Trump brand wouldn't have enough of this stickiness to extend to politics, but he was able to use the brand essence of Trump as the straight-shooting host of The Apprentice and extend it all the way to the White House.
We believe what we're told
The news we're bombarded with and the conceptual knowledge at our fingertips are overwhelming compared to what our ancestors faced just a few hundred years ago.
"We have an inbuilt bias to believe what we are told because if you don't, you're probably not going to survive in a hunter-gatherer environment," Professor Roodenrys says. "If you ignore advice that a noise is a particular animal and it's dangerous, you're more likely to die."
Without that element of trust and belief in what other people tell us we wouldn't survive as well as a species. Collaboration and the sharing of information have been major contributors to the success of the human race.
Unfortunately, as Professor Roodenrys says, "There hasn't been enough time or evolutionary pressure to change our cognitive systems." As a result, we're not that great at picking up when someone is lying to us. We tend to take a new piece of information on board as part of our perceived knowledge. Some of us will even share a news post without reading the article.
Fake Facebook accounts are taking advantage of this inbuilt trust to sow the seeds of misinformation with fake news in the hope of influencing elections. As of April 13, Facebook had taken action against more than 30,000 fake accounts in the lead-up to the election in France.
"If legitimate voices are being drowned out by fake accounts, authentic conversation becomes very difficult," Facebook says in its on information operations.
So who, and what, should we believe?
For those who want the whole truth and nothing but, Professor Roodenrys says we simply need to invest more energy and look beyond the headline.
"Our natural processes can lead us astray," he says. "We need to think clearly and scientifically. Taking a scientific approach to things is a way of combatting the biases we have."
And help is on its way. The first International Fact-Checking Day, appropriately held the day after April Fools' Day 2017, gave people trivia quizzes and lesson plans to help them navigate the increasingly murky waters of the news. ABC's Fact Check has returned to analyse the accuracy of claims politicians, lobby groups and organisations make in the media. Wikipedia founder Jimmy Wales has just launched Wikitribune, a crowd-funded news platform that aims to "fix the news" with a community of volunteers and paid journalists submitting and fact-checking news that is free of advertising or a paywall.
Smart people will always do dumb things, but if we take just a moment to base important decisions and actions on rational thinking, there will be fewer of us playing golf in the pouring rain. Again.
Breaking News photos by Paul Jones