Notes from the coaching room
There is a rich seam of leadership writing about how humans think. Most of it amounts to a recipe for challenging ourselves, and sometimes others, about why we think what we think and whether our thinking can stand up to our own analysis. In coaching conversations this is often difficult at first: deep-rooted patterns of thinking suddenly feel precarious, and we wonder how solid our foundations really are. But some simple techniques can soon be acquired, and they can become powerful tools to help us succeed.
The Nobel Prize-winning psychologist Daniel Kahneman, in his 2011 book ‘Thinking, Fast and Slow’, encapsulates the purpose, power and perils of humankind’s relationship with habit. Drawing on evolutionary psychology and neuroscience, Kahneman illustrates how ‘fast’ thinking, relying on established patterns of thought, is a hard-wired feature of humankind: a means to conserve energy, manage risk and get things done. Taking time to deliberate over, say, whether to stop at a red light is probably a poor use of energy and time, and possibly a dangerous one.
While safely relying on acquired habit at a red light is objectively smart, it becomes less so in more complex situations, where the hard-wired reliance on habit turns into a potentially risky hindrance and ‘slow’ thinking becomes the method of choice. That may seem self-evident for, say, the decision to buy a home, yet Kahneman shows that the pull of habit is strong enough that our instinctive thinking remains heavily habitual even when a different response is required. Even buying a house can be jeopardised if we fail to ask whether we are seeking comfort in what we know, when more challenging thinking could deliver a better home for our needs. Fast thinking is, by its nature, built on assumptions and prone to bias, and we need to challenge those assumptions deliberately if we are to make reliably good decisions in novel or complex situations.
The organisational thinker Chris Argyris identified the same pattern in his Theories of Action. In a long study of high-performing executives in global firms, Argyris found a consistent gap between their ‘espoused theories’ – how they say they act and how things should be – and their ‘theories-in-use’ – how they actually behave – and he identified a set of individual assumptions that drive the difference: a goal of maximising personal ‘winning’ and minimising ‘losing’; a belief that negative feelings must be suppressed; and a desire to appear as rational as possible. Argyris didn’t actually coin the phrase, but this work has become known as explaining ‘what makes smart people do dumb things’, and it led to a well-known technique for changing the pattern: double-loop learning.
‘Critical thinking’ has become the catch-all term for turning all this thinking about thinking into a set of practical techniques.
The first and most common approach is to identify individual thinking styles and, critically, to highlight how they differ between individuals in teams. Myers-Briggs is the longest established in a long line of increasingly sophisticated tools for developing self-knowledge about our thinking preferences. In ‘The Art of Thinking’, Harrison and Bramson gave names to what we often see in ourselves and others: the creative but argumentative ‘synthesist’; the values-driven ‘idealist’, who frustrates others by struggling to get things done; the driven ‘pragmatist’, who can be liable to cut corners; the detailed ‘analyst’, who lacks conclusive direction. These and other thinking styles have been a helpful entry point into challenging our own and others’ assumptions about situations and decisions.
Equally helpful has been the identification of ‘thinking errors’: common habits that lead to bias in decision-making. A few of them are set out below:
- Filtering - only seeing data that supports your bias and discounting the rest
- Polarising - only seeing extremes and ignoring middle ground
- Overgeneralising - assuming that things are ‘always like this’
- Jumping to conclusions - expressing a belief without the data to support it
- Catastrophising - expecting the worst
- Personalisation - assuming that things are always someone’s fault
- Emotional reasoning - over-reliance on emotion rather than data
- Fallacy of change - believing that changing someone else is necessary for success
- Always being right - ignoring the potential for personal error
- Heaven’s reward fallacy - expecting to suffer and be rewarded for it later
Whether it’s double-loop learning, making time for slow thinking, or identifying personality types and preferences, the lesson from critical thinking techniques is that we are liable to be blinded by our habits and instincts when faced with decisions, and may well overlook better options. If making the right call is important (and when isn’t it?), taking some time to challenge our thinking makes great sense.
Things for modern leaders to consider:
- Critical thinking is a tough topic because it requires leaders to question themselves and their thinking habits, which are, almost by definition, individual blind spots. This lends itself particularly well to working with a coach. There are practical things to do, though, which can move it along. What are your assumptions? Why have you made them? Are they relevant?
- What parts of your thinking are based on what’s happened before? Is it a reliable guide? What is different about this experience from the last time you saw something similar? What is the same? Does that make you question the value of the experience this time around?
- What habits have you acquired that are in the way of making good decisions? What about the team, or the organisation? What can you do to make a change?
Xytal is a leading British consultancy developing leadership in the health sector. Notes from the coaching room is drawn from our real experience of the issues faced by many leaders in improving their practice. If you found this article helpful, contact one of our team to learn more.