Availability Bias: The thinking error that shapes our decisions

Let's imagine a scenario. Ashfaq, an employee who has been at your office for a few years, has applied for a promotion. He has a good record; on the performance list he is among the top few in his department. But in his first year he made a mistake that cost the company dearly: he accidentally deleted all the files from the company's server, and that incident is still fresh in everyone's mind. What will your decision be? Will you give him the promotion?

The odds are that Ashfaq's chances of promotion are slim, because the mistake from his first year will make you dwell on the trouble it caused. As a result, his contributions to the company will be discounted. In the end, you will judge him not on his overall performance but on that single mistake. Behind this effect lies a cognitive error called 'availability bias', which lets a vivid, easily recalled memory dominate a decision.

In other words, we base our decisions on whatever information or memories come to mind fastest and most easily, and so the decision loses its neutrality.

Availability bias can lead to poor decision making, because the memories that come to mind most readily are often incomplete. We end up processing low-quality information that frequently does not match the real world, and that leads to bad outcomes.

Cognitive errors can lead us to bad decisions; Image Source: Wingify on Growth

In many academic and professional settings, availability bias lies behind poor decisions. If we rely only on memory while analysing data, the damage can be severe wherever careful, reasoned judgment is needed. Economics is one such area: in our personal finances alone there are countless small, wrong decisions in which availability bias hits us hard. Yet the same bias can work in favour of media and business decisions, precisely because most people are affected by it, and media and business organisations do use it to their advantage.

Our brain follows many mental shortcuts that help us make quick decisions, and availability bias does make the decision-making process faster. But it misleads us when we judge how likely something is, because our memory is not a faithful model of the world on which we can base predictions. In fact, most people do not think statistically. Let's take an example.

Imagine you are on a plane. How likely do you think it is to crash? The safety of your flight depends on many different factors, and it is nearly impossible for a human brain to weigh them all. Without training in statistics, your brain takes a different route to the answer. It recalls the dreadful news of past plane crashes, along with pictures of wrecked aircraft, because such news leaves a deep impression. That makes you overestimate the probability of such an accident. This is how availability bias works.
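The statistical route to the same question is plain base-rate arithmetic: divide the number of accidents by the number of flights. Here is a minimal sketch in Python; the counts are made-up placeholders for illustration, not real aviation data:

```python
# Base-rate estimate of a rare event: accidents divided by total trials.
# Both counts below are hypothetical placeholders, not real statistics.
fatal_accidents = 5            # hypothetical accidents in some period
total_flights = 40_000_000     # hypothetical flights in the same period

base_rate = fatal_accidents / total_flights
print(f"Estimated probability per flight: {base_rate:.2e}")

# Availability bias skips this arithmetic entirely and substitutes whatever
# comes to mind first: a few vivid news stories, which feel far more
# frequent than the numbers suggest.
```

The point is not the particular numbers but the procedure: a base rate comes from counting all flights, not just the memorable ones.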

Image Source: Wu Zhijian Blog

Some memories are stored in our brain automatically, while others require extra attention and effort. The two main reasons a memory is stored easily are that the event happens often, or that it leaves a lasting impression on us.

Events that happen frequently usually work together with the other shortcuts our brain uses to make sense of the world. Amos Tversky and Daniel Kahneman, two pioneers of behavioural psychology, ran a study on this in 1973. Participants were asked whether the letter 'k' appears more often as the first letter of English words (kitchen, kangaroo, kale, etc.) or as the third letter (ask, cake, bike, etc.). In reality, words with 'k' as the third letter are roughly twice as common as words that begin with 'k'. Yet about 70 percent of participants answered that words beginning with 'k' are more numerous. The reason is that words starting with 'k' are easier to bring to mind, and because they are easier to recall, they seem more frequent.
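You can test the comparison yourself with a few lines of code. Below is a minimal sketch in Python; it assumes a plain-text word list, one word per line, at /usr/share/dict/words (a path common on Unix systems, used here only as an example; any word list will do):

```python
# Count how often 'k' appears as the first letter versus the third letter.
# The word list path is an assumption; substitute any one-word-per-line file.
first_k = 0
third_k = 0

with open("/usr/share/dict/words", encoding="utf-8") as wordlist:
    for line in wordlist:
        word = line.strip().lower()
        if len(word) >= 3:
            if word[0] == "k":
                first_k += 1
            if word[2] == "k":
                third_k += 1

print(f"'k' as first letter: {first_k}")
print(f"'k' as third letter: {third_k}")
```

The exact ratio depends on the word list you use, but the comparison is the same one Tversky and Kahneman's participants were asked to make from memory alone.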

Daniel Kahneman and Amos Tversky; Image Source: Penguin Random House

Kahneman and Tversky also studied events that leave a lasting impression. In a 1983 study, half of the participants were asked how likely a major flood somewhere in North America was. The other half were asked how likely a massive flood caused by an earthquake in California was. Statistically, a flood in California caused by an earthquake must be less probable than a flood anywhere on the North American continent, because the first is only a special case of the second. Participants, however, rated the California earthquake flood as more likely. The explanation is that California is an earthquake-prone state, and earthquakes and floods sit close together in the imagination, so people can form a vivid mental picture of the two happening together. The North American continent, by contrast, is so vast and vague in our minds that no such clear picture forms.
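The statistical point fits in one line. If F stands for "a massive flood somewhere in North America" and E for "that flood is caused by an earthquake in California" (the labels F and E are just shorthand introduced here), then the second scenario is a special case of the first, and a conjunction can never be more probable than either of its parts:

$$P(F \cap E) \le P(F)$$

No matter how vivid the earthquake scenario feels, adding a condition can only shrink the set of possible outcomes.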

We make many decisions every day, and the media, our emotions, and our memories influence them more than rational thinking does. Sufficient knowledge of these errors can act as a barrier against such misjudgment. Availability bias is born out of the many shortcuts the brain uses to process information, so at every step of decision making we need to keep the brain alert to this error.

Flooding in California is a rare occurrence; Image Source: The New York Times

Kahneman and Tversky describe two different modes of thinking in our brain. In his book Thinking, Fast and Slow, Kahneman calls them System 1 and System 2. System 1 decides quickly and works automatically, and most people rely on it most of the time. But to avoid cognitive errors, System 2 is needed, because System 2 is deliberate and rational. It is not automatic, though: thinking this way takes effort, and human beings are usually lazy. System 1 jumps to shortcuts and gets things wrong; thinking through System 2 lets people see that the conclusions of their quick thinking are misleading.

One of the most effective strategies for avoiding availability bias is 'red teaming'. In the red-teaming process, one member of the group is chosen to challenge the majority opinion. The chosen person's own opinion does not matter; their job is to find the flaws in everyone else's reasoning. The method is very effective for big, important decisions. With sufficient knowledge of behavioural science the bias is easy to spot, and when a rational person is assigned to hunt for the mistakes, the chances of reaching a neutral decision improve greatly.

A 1993 study examined how consumer markets and the media shape our everyday lives, and the role availability bias plays in that. Participants were asked whether drug use in the United States was increasing or decreasing. Most answered that it was increasing, but the National Household Survey on Drug Abuse reported otherwise. When a topic is covered constantly in the media, the impression it creates can differ from reality, and this shapes people's knowledge and behaviour. Your judgment becomes biased by where you get your information, and your common sense is formed by it. So we need to guard against these errors of thought if we want to make neutral decisions.

Source of the lead image: https://www.verywellmind.com/availability-heuristic-2794824
