With all the perks of modern life, we rely more and more on technology and less on our own skills. We are losing the skills that helped humanity get where we are. We are getting lazy. And it starts with such simple things as not being able to walk properly anymore and ends with the inability to use common sense and critical thinking in everyday life.
We are too lazy to think for ourselves, and we gullibly believe everything we hear as long as it comes from someone we trust in our narrow circle of acquaintances. We don’t try to seek the source of the information we are getting. It is enough that “a friend” on our favorite social network shared something shocking, and we immediately believe it and even spread it to other people. We don’t take the time to consider whether the information we receive is even possible or makes any sense whatsoever. As long as it fits our view of the world, it must be true. And so fake news and alternative truths spread like wildfire.
Thinking about what we hear, about our answers, ideas, views, and attitudes is hard. Most of us are cognitively lazy. We prefer to stick with our old views because it is easier than dealing with new ones. We also like consistency and predictability. Rethinking a long-held belief makes the world less predictable. Who can we trust if we can’t even trust ourselves, and if what we thought a week ago may no longer be true? Rethinking something we have internalized feels like losing a portion of who we are. It threatens our identity. You would consider someone who uses a twenty-year-old mobile phone, without a touch screen or the ability to connect to the internet and download apps, old-fashioned. You would find it weird. Yet you are comfortable holding twenty-year-old views.
Adam Grant illustrates this with the famous experiment with a frog. You have probably heard that when you drop a frog into a pot of boiling water, it immediately jumps out. In contrast, if you drop it into cold water and gradually increase the temperature, it will not react to the slowly changing environment until it is too late and it boils. It is a nice metaphor meant to illustrate that we respond quickly to sudden danger but are likely to accept a slowly creeping change without realizing what’s going on.
The problem with the story is that it is not true. When you drop a frog into boiling water, it gets so severely burned that it dies. If you gradually increase the temperature, the moment it gets uncomfortable, it will leap out.
When you first hear the story, it makes some sort of sense, and it is a nice story to tell. So you don’t question whether it is true. Once we accept an idea or a view, we see no reason to examine it again. It is not the frogs who are left to boil. Their instincts will keep them alive. It is us who are in trouble for not reevaluating what we believe we know about the world.
After I read Grant’s take on the frogs, and in the spirit of rethinking, I tried to find some information on experiments that would validate that the allegory is indeed a myth. And I couldn’t. No modern scientist would run such an experiment on poor amphibians. And so, we have a hundred-year-old experiment that may be misunderstood versus the purely theoretical opinion of modern-day scientists. Is there a particular speed of warming the water that works and another that doesn’t? So it seems that in the end, we have just opinions and educated guesses—what a mess.
An interesting story is better than a truthful one
The problem with opinions is that we prefer feeling right over being right. It is not necessarily crucial to our self-esteem to be right as long as we feel that we are right. We don’t need any data to back it up.
Sociologist Murray Davis came up with a theory that ideas survive and spread not because they are true but because they are interesting. No one is interested in talking about boring facts we already know. It is the unusual, the exotic, the controversial, the idea that denies something we thought was true, that gets spread. However, such ideas can only question weakly held beliefs. If an idea questions something we deeply care about, a belief that forms our identity, we shut it down.
The ideas we spread most enthusiastically on social media are those that are controversial and polarizing, yet they support our worldview. In these cases, we are completely happy to be an unsuspecting parrot repeating whatever we hear as long as it makes us feel right. We don’t even try to look at it with a skeptical eye and consider whether it could be just a hoax.
Skepticism and critical thinking are important. They help you navigate today’s complex world and ensure you are not taken advantage of by every crook and fake news peddler. However, for critical thinking to be effective, it needs to be based on a logical framework that respects data and evidence. The moment you decouple your decision from the evidence, you are not thinking critically. You are becoming a slave to your beliefs and biases.
To test whether you are still in the realm of critical thinking, consider an opinion you disagree with and ask yourself a simple question: “What data, facts, and evidence would convince me to change my opinion on the topic?” If the answer is that no evidence would convince you to change your mind, then you are not thinking critically. You have left the realm of thinking and entered the realm of blind belief.
In Think Again, Adam Grant proposes these habits of rethinking:
Don’t take any information at face value. Treat everything as a hypothesis that needs to be confirmed by research. When someone shares with you something that feels controversial, do your own search for the source of that information before you spread it to other people.
Define your identity in terms of values and not opinions. Understanding your core values and who you are will make it safe for you to change your views without endangering your identity.
Fight your assumptions and biases by looking for information that would invalidate your beliefs. Either you find such information and can adjust your beliefs, or you don’t, which lets you be comfortable in the knowledge that you are not deluding yourself.
Learn to question the “how” rather than the “why” of people’s opinions. When people talk about the “why,” it often leads them to entrench in their positions. When they have to think about “how,” they may realize that their extreme view may not be that realistic and are more willing to rethink it. How would you make it a reality? How did you come to that opinion? These are good questions to start with.
For example, during the last elections in the Czech Republic, some politicians, to win votes, tried to scare the public by spreading the hoax that the opposing party would require each household to accommodate some immigrants. They played the external threat card, as that usually works to win votes. Many people started to repeat this ridiculous assertion. Yet, all that was needed was to ask, “how exactly can this even be done in a modern democracy?” And the answer would be obvious, “it can’t.”
Don’t let your feelings cloud your judgment. I would add this one on top of Grant’s list. In Thinking, Fast and Slow, Daniel Kahneman describes an experiment in which you can draw a marble from one of two bags. The first bag contains one red marble and nine black ones. The second contains eight red marbles and ninety-two black ones. The goal is to draw a red marble. Which bag do you choose? The one that contains one red marble, or the one that contains eight of them? Most people choose the second bag. It somehow feels like a better bet because it contains eight red marbles. Yet the math says otherwise. You have a 10% chance of drawing a red marble from the first bag (1 out of 10) and only an 8% chance from the second (8 out of 100).
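The arithmetic behind the marble example is easy to check for yourself. Here is a minimal sketch (the bag contents are taken from the example above; the variable and function names are my own):

```python
# Compare the two bags from Kahneman's marble example.
bag_1 = {"red": 1, "black": 9}    # 1 red out of 10 marbles
bag_2 = {"red": 8, "black": 92}   # 8 red out of 100 marbles

def p_red(bag):
    """Probability of drawing a red marble from the given bag."""
    return bag["red"] / sum(bag.values())

print(f"Bag 1: {p_red(bag_1):.0%}")  # 10%
print(f"Bag 2: {p_red(bag_2):.0%}")  # 8%
```

The intuition latches onto the absolute count of red marbles (eight versus one) and ignores the denominator, which is exactly why the second bag feels like the better bet.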
Similarly, your mind can be influenced by how the results of vaccinations are presented to you. If I tell you that the chance of a specific vaccine’s adverse effects leading to death is 0.001%, your mind says it’s essentially zero, and you are comfortable getting the vaccine. If I tell you that it leads to death in 1 out of 100,000 cases, you will think twice about getting it. The statistical chance is the same, but the second framing somehow feels more dangerous. It creates an image of one real person who dies. The way you present statistics informs how they are understood. When you are presented with data, try to look at it from different perspectives, understand where the data comes from, and don’t leap to the first conclusion. Rethink whether the data could mean something else.
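That the two framings describe the identical risk is a one-line calculation (a small sketch using the numbers from the paragraph above):

```python
import math

# The same mortality risk expressed two ways.
percent_form = 0.001 / 100    # "0.001%" converted to a probability
frequency_form = 1 / 100_000  # "1 in 100,000" as a probability

# Identical risk, different framing (isclose guards against float rounding).
print(math.isclose(percent_form, frequency_form))  # True
```

The numbers are interchangeable; only the mental image they trigger differs.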
Don’t act now, think first
Avoid the urge to act without thinking. Unexpected events or sneaky salespeople often push us to succumb to false urgency: “act now, or miss the opportunity.” It forces us to think less critically and can easily lead to bad decisions. It is very rare that matters are truly as urgent as other people claim. Prudence and getting some data first are, as a general rule, the better policy. A false sense of urgency can not only keep you focused on the wrong things but can, in the long term, destroy your credibility. Remember the story of the boy who cried wolf?
You need to make fact-seeking part of your daily life. Build the habits of critical thinking and checking the data before you rush to conclusions. Stop dramatizing your worldview and see the world for what it is. Not perfect, but getting better.
What is your take on the topic? Do you use common sense and critical thinking when presented with new information? Or do you just trust your sources and spread whatever you hear? Who is to blame for the spread of fake news and hoaxes on social media? How would you stop it? Or should it be stopped at all?
Photo: JillWellington / Pixabay.com
Follow me on Twitter: @GeekyLeader