Nuala Walsh

Irrational Decision-Making: Predictable during Uncertainty…?


As we ridicule the absurdity of human behaviour being reduced to toilet-roll hunting (while simultaneously popping an extra roll into our own shopping baskets!), it may be useful to recall some irrational but highly predictable biases to which our minds become more susceptible during times of uncertainty.

While awareness will not necessarily insulate us from the perils of irrational decision-making, it may help us self-regulate our actions and behaviour.

How We Decide Matters

As widely discussed in the behavioural science literature, we make decisions in two ways (Kahneman, 2011): System 1 thinking is quick and instinctive (e.g. turning right or left when driving), while System 2 thinking is slower and more deliberative (e.g. choosing between a vacation in Rome or Rio). In times of panic, emotion or ego-depletion, we have little time for what may feel like the luxury of System 2 and tend to default to System 1.

Five Key Biases Influencing Our Decisions & Responses

Psychologists and economists suggest various explanations for our predictably irrational behaviour, which can be amplified in situations of extreme uncertainty.

1. Social Herding:

Our propensity to follow others is well known and well reasoned, yet often ignored. Others' choices provide a useful source of information and lower our cognitive effort. We also tend to believe that 'others know more than we do' and rely on this signal to guide our decisions. During uncertainty, this carries a force-multiplier effect and is amplified in the search for the right direction. Equally, in seeking comfort, decision-making can be reinforced by the belief that 'others think like me and agree with me'. This is the False Consensus Effect, which can propel people to scramble frantically for toilet rolls on the assumption that others will do the same. Fear of missing out also contributes to this herding. Prospect theory rightly suggests that the pain of losses looms far greater than the pleasure of equivalent gains, and we are predisposed to avoid that pain, even if the choice we make is not in our rational self-interest.
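
To make that loss-aversion asymmetry concrete, here is a minimal sketch of the prospect-theory value function, assuming the median parameter estimates (alpha ≈ 0.88, loss-aversion lambda ≈ 2.25) reported by Tversky and Kahneman (1992); the function name and dollar figures are illustrative rather than drawn from this article.

```python
# Minimal sketch of the prospect-theory value function, assuming the
# median parameter estimates from Tversky & Kahneman (1992):
# alpha ~ 0.88 (diminishing sensitivity), lam ~ 2.25 (loss aversion).

def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain or loss x, measured from a reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# Losses loom larger than equivalent gains:
print(round(prospect_value(100), 1))   # 57.5   (felt value of a $100 gain)
print(round(prospect_value(-100), 1))  # -129.5 (felt value of a $100 loss)
```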

Behavioural economists like to explain similarly irrational stock-market behaviour by pointing to speculative bubbles such as Tulipmania: in 1637, the exotic Semper Augustus tulip sold for up to six times the average annual salary of a Dutch citizen. Knowing our human disposition to crowd is helpful in stressful times.

2. Availability Bias:

Information shapes how we see the world. Availability bias occurs when we over-rely on easily recalled, accessible information, an effect often related to salience and recency. The danger is that we ignore what is unfamiliar, not easily imagined, or of low probability (e.g. shark attacks). Equally, we treat whatever readily comes to mind as indicative of the future and assume the momentum will continue. This is not always the case.

During uncertainty and anxiety, thinking is compromised, horizons are distorted and biases are amplified. Intentional avoidance increases when negative information contradicts our beliefs or feels uncomfortable. The sheer volume of information is also debilitating: too much becomes wallpaper and messages get lost. Availability bias can be mitigated when counterfactual examples gain greater exposure, which in turn aids recall.

3. Overconfidence & Illusion of Invulnerability:

Described as one of the most 'pervasive biases in calibration' (Koriat, Lichtenstein, & Fischhoff, 1980), overconfidence is the unjustified belief in the supremacy, validity and accuracy of one's own ideas, usually without adequate foundation. We see this a lot in business! It occurs when people are certain that particular outcomes, positive or negative, will happen. Most people are overconfident, regardless of age, race or class.

Three types of overconfidence are identified in the scientific literature (Moore & Healy, 2008): (i) overestimation of ability or success rates (e.g. earnings forecasts); (ii) overprecision of belief accuracy (e.g. traders' stock-picking); (iii) overplacement of performance, or the 'better-than-average' effect (e.g. surviving illnesses). During uncertainty, each of these becomes more entrenched, an observed dilemma for those in positions of authority.

Biases are inter-related, driven by motivational and cognitive reasoning throughout the decision process. Overconfidence is also closely linked to the illusion of control, where people engage in unconscious denial to safeguard themselves against a negative outcome. Underestimating downside risk or overestimating upside risk creates excessive risk-taking, which impacts security, sustainability and well-being.

4. Probability Neglect:

We also tend to unconsciously disregard probability and overfocus on the negativity of outcomes. Again, this is amplified when making decisions under uncertainty, high emotion or stress, as outlined by Sunstein (2003) using the case of insurance against terrorism. Small risks can be completely neglected or significantly overrated, leading to ripple effects. This helps explain excessive reactions to low-probability risks, e.g. the shark attack.

In an experiment, Rottenstreich & Hsee (2001) explored the sensitivity of responses to probability under high- and low-risk conditions. They asked one group of people how much they would pay to avoid a 1% chance of a 'quick, painful but not too dangerous electric shock'; another group was asked about their willingness to pay to avoid a 99% chance of the same shock. To avoid the 1% chance, people were willing to pay an average of $7. Yet to avoid the 99% chance, they were only willing to pay $10. People simply did not register the magnitude of the difference between a 1% and a 99% probability. This happens more often in stressful or deeply uncertain situations.
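
As a back-of-the-envelope check on how insensitive those answers were, the short sketch below compares the study's reported willingness-to-pay figures with what a purely expected-value rule would imply; the variable names are illustrative and not taken from the study itself.

```python
# Illustrative arithmetic on the Rottenstreich & Hsee (2001) shock study.
# Under a pure expected-value rule, willingness to pay (WTP) should scale
# linearly with probability: a 99% risk should command 99x the WTP of a 1% risk.

wtp_low, wtp_high = 7.0, 10.0  # reported average WTP at 1% and 99%
p_low, p_high = 0.01, 0.99

implied_ratio = p_high / p_low        # 99.0: what linearity predicts
observed_ratio = wtp_high / wtp_low   # ~1.43: what people actually paid

print(f"Expected-value ratio: {implied_ratio:.0f}x")
print(f"Observed WTP ratio:   {observed_ratio:.2f}x")
# A 99-fold change in probability moved payments by less than 1.5x:
# classic probability neglect when outcomes are emotionally charged.
```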

5. Confirmation Bias:

This is the prevalent decision bias we all know about yet still fall victim to. It occurs when we seek out views that support our existing position, a pattern commonly seen in sports fandom and fanaticism. In seeking truth or direction amid uncertainty, our need for security amplifies this bias. As context matters and minds rarely change, especially on high-conviction judgements, reasons are unconsciously filtered to justify our System 1 intuitions and are more easily retrieved than counterfactuals. Egocentrism also dominates as we trust intuition and experience, dismissing any counter-explanations.

Relatedly, anchoring can then occur as people fail to adjust from their deep-conviction ideas. Add in the endowment effect, whereby we significantly overweight our own ideas, and our judgement becomes further impaired. Of course, commitments escalate when made publicly, making it difficult to shift positions.

History shows that all crises pass. With emotional responses inevitable, signalling calm matters, as others use colleagues, friends and leaders as their behavioural reference point and social cue. Leveraging social norms and nudges for good, and using community identity to reinforce desired behaviours, matters too. Plan for the worst but transmit hope for the best. Hope is contagious and achieves great things if mobilised for the greater good.

These biases are all highly predictable, dynamic and inter-related. Most of us will become more susceptible to them in our decisions and behavioural responses during times of uncertainty. Whether they are irrational depends on a subjective assessment of the cost-benefit equation. While awareness will not necessarily insulate us from flawed or suboptimal decision-making, it will certainly help us make the best decisions possible for the good of ourselves, our families, our clients and our society.

© MindEquity Ltd. Business, Brand & Behaviour Consulting.

Contact: Nuala@mindequity.co.uk

Sources:

Kahneman, D. (2011). Thinking, fast and slow. Macmillan.

Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6(2), 107.

Moore, D. A., & Healy, P. J. (2008). The trouble with overconfidence. Psychological Review, 115(2), 502.

Rottenstreich, Y., & Hsee, C. K. (2001). Money, kisses, and electric shocks: On the affective psychology of risk. Psychological Science, 12(3), 185–190.

Sunstein, C. R. (2003). Terrorism and probability neglect. Journal of Risk and Uncertainty, 26(2–3), 121–136.
