
Covid-19 vaccine: decisions under uncertainty

Would you accept the Covid-19 AstraZeneca vaccine if you had to make that choice today? 

You may have had this kind of conversation recently with your friends and family and may have found out that… the answer is far from obvious!

Today, the risk of thrombosis (and perhaps of other fatal conditions) clearly exists, but we lack the data to estimate its exact probability.

On the other hand, we already know that, on average, about 1% of people who contract the original variant of Covid-19 will die from it.

According to recent surveys, 71% of French citizens say they would refuse the AstraZeneca vaccine (meaning they would rather not be vaccinated at all if this were the only option offered).

Behavioral economics can shed some light on such behaviors.

Both situations are clearly uncertain, but the level of uncertainty is higher in the first situation where the probability of death is still unknown. In behavioral economics, the first situation corresponds to a situation of ambiguity while the second is a situation of risk.

Most people are not comfortable with risk, but even less so with ambiguity! They are prone to an ambiguity or familiarity bias.

Let’s play two little games to try and see the difference between these two concepts!

First game:

You have two bowls in front of you, each containing 10 balls, either red or green, and you can draw from only one. Red balls make you gain $100, while green balls make you lose $100.

The first bowl contains 5 red balls and 5 green balls. You don’t know the proportion of red and green balls in the second one. Which bowl do you choose to draw from? 
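To see why preferring the known bowl is a bias rather than shrewdness, here is a hypothetical sketch (not from the original text) computing the expected gain of each bowl. If we know nothing about the second bowl, a natural assumption is a uniform prior over all eleven possible compositions, and under that assumption both bowls have the same expected value:

```python
# Hypothetical sketch of the first game: +$100 per red ball, -$100 per green.
# The "ambiguous" bowl is modeled with a uniform prior over its composition,
# which is an assumption, not something the game itself specifies.

def bowl_expected_value(n_red, n_total=10):
    """Expected gain of one draw from a bowl with n_red red balls out of n_total."""
    p_red = n_red / n_total
    return p_red * 100 + (1 - p_red) * (-100)

# Known bowl: 5 red, 5 green.
known_ev = bowl_expected_value(5)

# Ambiguous bowl: average the expected value over every possible
# composition, from 0 red balls up to 10.
ambiguous_ev = sum(bowl_expected_value(k) for k in range(11)) / 11

print(known_ev, ambiguous_ev)  # → 0.0 0.0
```

Since both expected values are zero under this symmetric assumption, a purely expectation-driven decision-maker would be indifferent between the bowls; the widespread preference for the known one is what reveals ambiguity aversion.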

Second game:

You still have two bowls in front of you. The left one contains 5 red balls and 5 green balls, while the right one only contains one light green ball. Red and green have the same properties as before, while light green makes you gain nothing. Which bowl do you choose to draw from?

In the first game, and this may be true for you too, most people choose to draw a ball from the first bowl, as they prefer to know precisely which risk they are taking. In cognitive science, this behavior is called ambiguity aversion. This bias corresponds to the idea that we prefer known risks over unknown ones, fitting the famous proverb: "better the devil you know than the devil you don't". This is precisely the current situation, where some patients have to choose between the AstraZeneca vaccine, with an unknown probability of death, and the risk of contracting Covid-19, with a better-known probability of death.

The second game illustrates a different bias, which we tend to mix up with ambiguity aversion: namely risk aversion. Confronted with two options, one risky and the other certain, we tend to prefer not to take any risk – which in this example means taking the light green ball, even though both options offer exactly the same gain on average.
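The claim that both options "offer exactly the same gain on average" can be checked with a small simulation. The sketch below (an illustration assumed from the game's description, not part of the original text) draws repeatedly from each bowl and compares the average payoffs:

```python
import random

# Hypothetical sketch of the second game: the left bowl is risky
# (win or lose $100 with equal chance), the right bowl holds a single
# light green ball that pays $0 for certain.

def draw_left():
    """One draw from the 5-red / 5-green bowl: +$100 or -$100."""
    return random.choice([100] * 5 + [-100] * 5)

def draw_right():
    """The single light green ball: a certain $0."""
    return 0

random.seed(0)
n = 100_000
left_avg = sum(draw_left() for _ in range(n)) / n
right_avg = sum(draw_right() for _ in range(n)) / n

# Both averages converge on $0: the risky bowl's mean hovers near zero,
# while the certain bowl is exactly zero every time.
print(left_avg, right_avg)
```

A risk-averse player still picks the certain $0 despite this equality, which is exactly what distinguishes risk aversion from a simple expected-value comparison.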

Those two biases are not necessarily linked: some people may be ill at ease with ambiguity yet indifferent to risk, or the other way round!

The first game, illustrating ambiguity aversion, corresponds to what is known as the Ellsberg paradox. Introduced by Daniel Ellsberg in 1961, it was designed to question the rationality assumption that classical economists had taken for granted for two centuries. The paradox demonstrated in a new way that we are not always rational in our decision-making. And this discovery had a huge impact, since it later gave birth to a whole new academic field within finance, one that integrates cognitive biases and emotions into models of decision-making to better understand and anticipate it… Behavioral finance was born.

To learn more on this:

Introduction to behavioral economics

Cognitive biases and risk decision-making 

The Allais Paradox or the limits of rationality
