We See Things the Way We Are, Not the Way They Are

Phelix Juma
Oct 25, 2020

In an attempt to save on delivery costs, Sendy can look at data points about the road network to determine the optimal path to a delivery location. The path can be based on various features such as distance, traffic conditions and road conditions, among many other things. All these features can be fed to an AI model to extract the optimal path that minimizes both the cost of delivery and the time taken to deliver the items. Now imagine you are in the savannah forest and you see a lion fast approaching. You need to find the fastest path through the forest that will lead you to safety in the shortest time possible. You quickly scan the forest, do your math and realize that the optimal path to your safety is… oh shit, the lion pounces on you and that's the end of you. You cry in pain as the lion devours you alive.
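
The routing idea above can be sketched as a weighted shortest-path search. Here is a minimal sketch using Dijkstra's algorithm; the road network, node names and edge costs below are all hypothetical, with each cost standing in for some blend of distance, traffic and road condition:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a weighted road network.

    graph maps each node to a list of (neighbor, cost) pairs, where cost
    is some blend of distance, traffic, road condition, etc.
    """
    queue = [(0.0, start, [start])]  # (cost so far, current node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical road network: edge costs blend distance and traffic.
roads = {
    "depot":    [("junction", 4.0), ("highway", 2.0)],
    "highway":  [("junction", 1.0), ("customer", 7.0)],
    "junction": [("customer", 3.0)],
}

cost, path = shortest_path(roads, "depot", "customer")
```

The priority queue always expands the cheapest route found so far, so the first time the goal is popped, the returned cost and path are optimal for the given edge weights.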

That is what happened to our forefathers who tried to be rational in the forests. When a lion is fast approaching, you do not have time to think rationally about how you will survive the attack, because you will be dead before you even start the calculations. The forefathers who survived are those who acted unconsciously; the ones who saw the lion and immediately started running or climbing a tree without giving it a thought. And so the survival tactics encoded in our DNA are those that favored the irrational mind: the mind that made decisions fast and with the least effort.

Amos Tversky and Daniel Kahneman, two of my favorite behavioral economists, called these mental shortcuts heuristics: rules of thumb that allow us to make decisions fast and with the least effort. For this work Kahneman won the Nobel Prize in Economic Sciences (Tversky had unfortunately passed away by the time of the award). Mental heuristics form the basis of cognitive biases, a set of conditions or factors that greatly distort our decision-making processes. While humans rely heavily on heuristics for decision making, these mental shortcuts at times leave us polarized on various topics, with different people picking a side depending on their cognitive biases.

Consider the simple Physics concept of charge polarization as shown below:

Charge Polarization on a pair of conducting spheres: Illustration of how new information polarizes us based on our pre-existing cognitive biases

Initially, the two spheres in contact are neutral; the positive and negative charges within them balance out and are not discernible (part a). When we bring a charged rod close to interact with them, the spheres become polarized: free electrons migrate through the pair, leaving one sphere with an excess of electrons and the other with a deficit (part b). If the spheres are separated while the rod is still in place (part c) and the rod is then removed, each sphere retains its net charge, as shown in part d, with one positively charged and the other negatively charged.

Human beings behave in a similar manner. We all have cognitive biases based on the information we have acquired over our lifetimes. These biases may not be visible, and some of us do not even know they exist (and actually reject their existence if others point it out). When we receive new information, e.g. we read something in the news, watch the Trump vs Biden debate or see forest fires in the Amazon, our reaction and/or interpretation of the news depends not on the news but on our existing biases. The external news, in most cases, acts as an exciter, just like the charged rod in the figure above. The news (i.e. the charged rod) interacts with our innate cognitive biases, activating and polarizing them. In the end, as shown in part d, what we finally make of the news depends not on the news itself but on our biases. Just as the two spheres were influenced by the same charged rod yet ended up with opposite charges, one negative and one positive, so do humans diverge. The news only acts to publicize our biases, with one group supporting it and another opposing it.

The biggest mistake you can make is to assume that you are objective in your thinking and that you have no cognitive bias. We all have biases; the question is: how do we deal with them?

That takes me to my favorite question to my friends: "Why is it that we all watched the same debate between Trump and Biden, but Trump supporters concluded that Trump outperformed Biden while Biden supporters concluded that Biden outperformed Trump?" Closer home: why is it that after the BBI report, those who opposed it before, the Hustler Team, continue to oppose it, while those who supported it before continue to support it, yet we are all reading the same report? Why is it that whenever a forest fire happens, climate change fanatics are quick to blame climate change despite the numerous other factors that fuel a natural forest fire?

Cognitive biases make truth relative. The decision to support or oppose an event depends not on the event or the "facts" presented but on our innate cognitive biases.

If you are a climate change fanatic, every forest fire, big or small, will be attributed to climate change and nothing else, because you place too much significance on that evidence. If you are a Trump supporter, anything Trump says will be genius and anything Biden says will be trash. If you are a Ruto supporter, then the BBI is the worst document ever prepared in the history of Kenya. Hence the famed adage:

We see things the way we are, not the way they are.

Consider the mind map below:

Interaction between information and cognitive bias. Notice how the final conclusion is influenced more by the bias than by the information itself

Humans generally share a set of major cognitive biases; ten of the most common are listed later in this article. When we receive any new information, the information undergoes a "convolution transformation" with the bias. What this means is that, for the same piece of information, the resulting interpretation depends on the bias we already hold, which is used to convolve the information. Since convolving the same signal with two distinct filters yields distinct outputs, different people will draw completely different conclusions from the same piece of information because they each hold different biases. It is not surprising that people with the same bias draw the same conclusion from the same information, and that is why people always end up polarized around the major issues in the world: matters of religion, politics, climate change, police shootings/brutality and abortion, among many other things.

In deep learning's convolutional neural networks, this kind of transformer is called a convolution filter. Not to make a direct comparison (they are different concepts), but cognitive biases transform information through a similar filtering operation; what I will call "information filters."
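
The analogy can be made concrete with a toy sketch: the same input signal convolved with two different filters yields two different outputs, just as the same information passed through two different biases yields two different interpretations. The filter values below are hypothetical, chosen purely for illustration:

```python
def convolve(signal, kernel):
    """Valid-mode 1-D convolution: slide the flipped kernel over the signal."""
    k = list(reversed(kernel))
    n = len(signal) - len(kernel) + 1
    return [
        sum(signal[i + j] * k[j] for j in range(len(kernel)))
        for i in range(n)
    ]

# The same piece of "information"...
information = [1.0, 2.0, 3.0, 2.0, 1.0]

# ...filtered through two different "biases" produces two interpretations.
bias_a = [0.5, 1.0, 0.5]  # smooths and amplifies the bulk of the signal
bias_b = [1.0, -1.0]      # responds only to changes in the signal

interpretation_a = convolve(information, bias_a)
interpretation_b = convolve(information, bias_b)
```

One filter reads the signal as a large, smooth bump; the other sees only its rises and falls. Same input, distinct conclusions, exactly because the filters are distinct.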

In information filtering, an individual unknowingly ignores certain aspects of information, places undue significance on certain parts of the information or unknowingly searches for specific pieces of evidence to either support or discredit the information with the aim of having their cognitive biases take precedence.

As an example, if my bias is that the BBI is a ploy against Ruto and I am a Ruto supporter, I will read the document thinking I am being objective while, in reality, my information filters will ensure that I look for the parts of the document that support my existing beliefs and place very high significance on them, while ignoring and downplaying the parts that seem to oppose my beliefs. In the end, reading the document (even several times, as some so-called professors are claiming on Twitter) will not change the beliefs I had before reading it; it will simply reinforce them. The same is true of a Trump-Biden debate, where Trump supporters are there to "confirm" that Trump is the best president and Biden the worst, and vice versa for Biden supporters. Trump supporters will not listen to the debate objectively; they will ignore anything negative Biden says about Trump and place undue importance on the things Trump says against Biden. That is confirmation bias: a phenomenon that makes us seek out the parts of the information that confirm and/or support our existing beliefs. It is the same reason why Googling your symptoms will always show that you are indeed positive for whatever ailment you are searching for; the problem is not Google, the problem is you; your confirmation bias is the problem.

So, how do we regain our objectivity when dealing with issues? How do we ensure that we don't fall prey to cognitive biases? The first step is to become aware of our existing biases. Denying the influence of our cognitive biases and thinking that we are being objective makes us wonder why others aren't seeing things the way we are, and sometimes to trash-talk them, labeling them "stupid", "racist" or other derogatory names. Are you more likely to accept a person from your own tribe when conducting job interviews? The only way to eliminate tribal hiring is to first acknowledge that we have that problem so that we can fight it. If I know that my support for Biden may cloud my view of Trump, I need to take a step back and account for all my biases before making the final decision, because chances are high that I am hugely wrong about Trump.

The first step to fighting the influence of cognitive biases on our decision-making process is to acknowledge and appreciate their existence

To understand if we harbor these biases, we need to know what they are so that we can fight them and get back to objective discussions, objective reasoning and objective decision-making process:

1. Confirmation Bias is the tendency to listen more often to, and favor, information that confirms the beliefs we already hold.

2. Hindsight Bias is the tendency of people to see events, even random ones, as more predictable than they actually are.

3. Anchoring Bias is where we tend to be overly influenced by the first piece of information we hear (or what we were introduced to first; typical of religion).

4. Misinformation Effect is where our memories of particular events become heavily influenced by things that happened after the actual event itself. What we think we remember may not be the actual thing that happened.

5. In Actor-Observer Bias, how we see things depends on whether we are the actor or the observer: when we are the actor, we attribute events to external influences (we don't like blaming ourselves), and when we are the observer, we attribute them to internal causes (we like blaming others). "I was driving carefully but the other driver was reckless, hence the accident."

6. In the False Consensus Effect, we tend to overestimate how much other people agree with our own beliefs, behaviors, attitudes and values, making us incorrectly think everyone agrees with us and overvalue our own opinions.

7. In the Halo Effect, our initial impression of a person influences what we think of them overall, e.g. assuming pretty teachers are smarter than plain ones, or that a beautiful job applicant is smarter and hence more deserving of the job. Loosely put, the halo effect is a physical attractiveness stereotype.

8. In Self-Serving Bias, people tend to give themselves credit for successes but lay the blame for failures on outside causes.

9. In the Availability Heuristic, we tend to estimate the probability of something happening based on how many examples readily come to mind. It is the reason why those who haven't seen a close associate contract or die of Covid-19 downplay its health risks, or why climate change fanatics place so much weight on climate change as a cause of natural forest fires.

10. Lastly, in Optimism Bias, we overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. It's the reason we all downplay Covid-19 and think that we cannot die of it, only others can. It's the reason new lovers get mad when told "mtaachana tu" ("you will just break up"), because they are optimists and break-ups only happen to others, not to them.

Cognitive biases are inevitable. In some cases they serve our primal instincts, e.g. boosting our self-esteem or helping us make decisions fast and with the least effort, among the many other advantages that mental heuristics offer. However, in most circumstances that require rational thinking, cognitive biases are our greatest enemies. Let us acknowledge our biases and fight to eliminate their influence on our decision-making processes by applying a correction factor for them before making the final decision.

We see things the way we are, not the way they are. But what if we could turn things around? What if we could correct for our biases and see things the way they are, so that truth can cease to be relative?
