Thinking, Fast and Slow [Speed Summary]
- Title: Thinking, Fast and Slow
- Author: Daniel Kahneman
- Publisher: Allen Lane
- Publication: 2011
How persuasive is your marketing?
If you think about persuasion in marketing purely in terms of rational argument, you’re missing out on the true power of persuasion: appealing to people’s intuition, not their reason.
Thinking, Fast and Slow is an accessible overview of the Nobel Prize-winning insights of psychologists Daniel Kahneman and Amos Tversky into how human intuition works.
The basic idea is simple – there are two routes to persuasion, based on two basic modes of thinking.
- “System 1” (OS 1) thinking is intuitive thinking – fast, automatic and emotional – and based on simple mental rules of thumb (“heuristics”) and thinking biases (cognitive biases) that result in impressions, feelings and inclinations.
- “System 2” (OS 2) thinking is rational thinking – slow, deliberate and systematic – and based on considered evaluation that results in logical conclusions.
The problem with much marketing is that it tries to appeal to “System 2” thinking with rational ‘persuasion attempts’ when most consumers, most of the time, use System 1 intuitive thinking to judge, decide and act (mostly because System 1 thinking is easier and faster).
The opportunity for marketing therefore is to focus on System 1 intuitive thinking, and persuade through intuitive appeal as well as rational argument.
To do this, marketers need to understand the basic social, cognitive and emotional factors that underpin System 1 intuitive thinking.
Intuitive thinking is not magical; something ‘feels right’ when it fits a mental rule of thumb (a heuristic) or the way our mind is wired (a cognitive bias). For example, we are intuitively drawn to brands, quite independently of any rational appeal, based on their
- Salience (fame)
- Sentiment they evoke (feeling)
- Signals they communicate (form)
System 1 (OS 1) Heuristics
- Affect Heuristic – we intuitively think that if the decision feels good, it’s the right decision (basing decisions on emotional reaction rather than a calculation of risks and benefits)
- Anchoring Heuristic – we intuitively rely on the first piece of information we encounter (the anchor) when making a decision – even when it is irrelevant
- Availability Heuristic – we intuitively think the things we remember easily are more likely to happen again and are more important (attributed importance is based on the ease with which they are retrieved from memory, which is largely determined by the extent of coverage in the media)
- Representativeness Heuristic – we intuitively think that different events that seem similar to us have a similar likelihood of occurrence – when often they don’t
- Commitment Heuristic – we intuitively think that if we’ve already invested in a decision, we should continue to do so (AKA the “sunk cost fallacy” – people justify increased investment in a decision based on the cumulative prior investment, even when the cost, starting today, of continuing outweighs the expected benefit).
System 1 (OS 1) Biases
- Belief Bias – our thinking is biased by how believable we personally find a conclusion
- Confirmation Bias – our thinking is biased towards interpreting information in a way that confirms preconceptions
- Optimism Bias – our thinking is biased towards being over-optimistic, overestimating favorable and pleasing outcomes
- Hindsight Bias – our thinking is biased by the illusion that past events were as predictable at the time they happened as they are now.
- Framing Effect – our thinking is biased by how information is presented (90% fat-free feels better than 10% fat)
- Loss Aversion – our thinking is biased by an aversion to loss – losses loom larger than equivalent gains, so eliminating the risk of losing feels more important than improving the chance of winning (prospect theory; see the sketch after this list).
- Narrative Fallacy – our thinking is biased by the assumption that good stories are true stories
- Regression Fallacy – our thinking is biased by not taking into account the chance component of events
- Planning Fallacy – our thinking tends to overestimate benefits and underestimate costs, making us more likely to engage in risky behaviour
- Halo Effect – our thinking is biased by existing judgements about a person – if we judge them positively in one respect, we’re likely to assume they’ll be positive in another
- The Law of Small Numbers – our thinking is biased by generalising from the particular – we make the assumption that a small sample is representative of a much larger population.
- WYSIATI – our thinking is biased by the assumption that What You See Is All There Is – so we discount or ignore what we don’t know
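
For readers who want a sense of the arithmetic behind loss aversion, the sketch below is a standard textbook formulation of the prospect-theory value function; the parameter estimates are the commonly cited Tversky–Kahneman figures, not values reported in this summary.

```latex
% Prospect-theory value function (standard formulation; parameter
% estimates are the commonly cited Tversky-Kahneman figures)
v(x) =
\begin{cases}
  x^{\alpha}             & x \ge 0 \quad \text{(gains)} \\
  -\lambda\,(-x)^{\beta} & x < 0   \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
```

With λ ≈ 2.25, a $100 loss is felt roughly twice as strongly as a $100 gain (v(100) ≈ 57 versus v(−100) ≈ −129), which is why removing a risk of losing tends to be more persuasive than offering an equal chance of winning.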
In spelling out the System 1 heuristics and biases that are the DNA of human intuition, ‘Thinking, Fast and Slow’ provides marketers with a blueprint for a new generation of intuitive marketing based on appealing to consumers’ intuitive selves as well as their rational selves. Persuasion is still the name of the game – but through appeals to intuition, rather than logic.