Strategy and Dunning-Kruger

How can we make strategy discussions more about strategic thinking and real discourse rather than innovation theatre and programmatic analysis?

The following is an insightful passage from Adam Grant in a 2021 HBR piece:

The legend of Steve Jobs is that he transformed our lives with the strength of his convictions. The key to his greatness, the story goes, was his ability to bend the world to his vision. The reality is that much of Apple’s success came from his team’s pushing him to rethink his positions. If Jobs hadn’t surrounded himself with people who knew how to change his mind, he might not have changed the world.

For years Jobs insisted he would never make a phone. After his team finally persuaded him to reconsider, he banned outside apps; it took another year to get him to reverse that stance. Within nine months the App Store had a billion downloads, and a decade later the iPhone had generated more than $1 trillion in revenue.

Almost every leader has studied the genius of Jobs, but surprisingly few have studied the genius of those who managed to influence him. As an organizational psychologist, I’ve spent time with a number of people who succeeded in motivating him to think again, and I’ve analyzed the science behind their techniques. The bad news is that plenty of leaders are so sure of themselves that they reject worthy opinions and ideas from others and refuse to abandon their own bad ones. The good news is that it is possible to get even the most overconfident, stubborn, narcissistic, and disagreeable people to open their minds.

He goes on to give a few examples of techniques that can be used to change mindsets and behaviour, including asking someone to explain how something works, letting that person take the reins (planting the seeds of ideas but letting them develop), finding a way to praise the narcissist, and disagreeing with the disagreeable while continuing to push your ideas. In another HBR article, from 2018, Tony Schwartz suggests that ultimately, personal transformation requires the courage to challenge one’s current comfort zone and to tolerate that discomfort without overreacting. He proposes a series of provocative questions that leaders and their teams can build a practice around asking themselves:

  • “What am I not seeing?”
  • “What else is true?”
  • “What is my responsibility in this situation?”
  • “How is my perspective being influenced by my fears?”

In the book Thinking, Fast and Slow (Daniel Kahneman), as well as the book Noise (by Kahneman and colleagues), the authors highlight the challenges of cognitive bias and noise in the decision-making process. Biases such as confirmation bias, anchoring, and the framing effect have a significant impact on decisions. Noise, by contrast, is random scatter in choices and evaluations: people may make different decisions or evaluate the same situation differently from one day to the next, and experts in a field may assess the same problem or case wildly differently, so bias is not the only source of error. These books highlight the very real need for some type of awareness training to uncover bias and sources of noise, plus the need for ‘rules’ and guidelines (such as checklists) to minimize their impact. A simple example is the use of checklists in medicine and aviation, which have been shown to substantially reduce errors from both bias and noise.
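The distinction between bias and noise can be made concrete with a minimal simulation (the numbers and scenario here are purely illustrative, not from the books): bias is a systematic offset shared by all judgments, while noise is random scatter around the target.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 100  # the hypothetical "correct" judgment for a given case


def simulate_judgments(n, bias, noise_sd):
    """Simulate n judgments of the same case.

    bias     -- systematic offset applied to every judgment (e.g. anchoring)
    noise_sd -- standard deviation of random scatter between judgments
    """
    return [TRUE_VALUE + bias + random.gauss(0, noise_sd) for _ in range(n)]


# Biased but quiet: consistently off-target in the same direction.
biased_quiet = simulate_judgments(1000, bias=15, noise_sd=2)

# Unbiased but noisy: right on average, but individual judgments scatter widely.
unbiased_noisy = simulate_judgments(1000, bias=0, noise_sd=15)

print(statistics.mean(biased_quiet))    # ~115: systematic error, far from 100
print(statistics.stdev(biased_quiet))   # ~2: judgments agree with each other
print(statistics.mean(unbiased_noisy))  # ~100: no systematic error
print(statistics.stdev(unbiased_noisy)) # ~15: judgments disagree widely
```

Both groups make large errors on individual cases, which is the point the books press: averaging or debiasing alone fixes only the first group, while checklists and structured guidelines also help rein in the scatter of the second.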

A useful lens for understanding why so many senior leaders believe they know it all, even when it is obvious to many others in the organization that the reality the leaders see is not the reality others see, is the Dunning-Kruger effect.

The Dunning-Kruger Effect (Source: Training Peaks)

Despite the concept being subject to some scrutiny, the ideas are useful. According to Psychology Today, the Dunning-Kruger effect is a cognitive bias in which people wrongly overestimate their knowledge or ability in a specific area. This tends to occur because a lack of self-awareness prevents them from accurately assessing their own skills:

  • Those with limited knowledge in a domain suffer a dual burden: not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it
  • Those who are the least skilled are the most likely to overestimate their abilities
  • Even smart people can be affected, because having intelligence is not the same thing as learning and honing a specific skill. Many individuals mistakenly believe their experience in one area transfers to another
  • Confidence is so highly valued that people would rather pretend to be smart or skilled than look inadequate
  • To avoid falling prey to the Dunning-Kruger effect, people can honestly and routinely question their knowledge base and the conclusions they draw, rather than blindly accepting them
  • Individuals can also escape the trap by seeking out others whose expertise covers their own blind spots, such as turning to a colleague or friend for advice or constructive criticism. Continuing to study a specific subject will also bring one’s capacity into clearer focus

This clearly has some relevance to the archetypal Western leader, brought up on a diet of individualism and meritocracy. The effect may be even more pronounced in countries such as the US, UK and Australia, which, according to the research of Geert Hofstede, are among the most individualistic nations on earth. Individualistic countries tend to place a high value on achieving goals, where success is attributed to ‘I’ (my own capabilities) and to one’s own efforts (merit): a person gets what they deserve. Hence those who have reached the C-suite attribute much of that success to themselves, whilst believing that what got them there will continue to serve them well, despite evidence to the contrary. Confidence is good; misplaced confidence verging on arrogance is not.