Don’t get fooled by randomness (or mis- or disinformation)

The mind tricks us into seeing patterns when there are none. This causes all kinds of problems.

According to Nassim Taleb, people routinely mistake luck for skill, randomness for determinism, and misinformation for fact. These factors cause us to process information incorrectly and distort our view of the world:

All-or-nothing view

We tend to see events as all-or-nothing, rather than shades of probability. If you see a successful business executive, they’ll tell you that the reason for their success is some combination of hard work and talent. But what if you have thousands of successful business executives? Probability tells us that some of them must be successful only through sheer luck. But none of them will admit to that.
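To see how far sheer luck can go, here’s a quick sketch (the numbers are invented for illustration, not taken from Taleb’s book): simulate 10,000 “executives” whose yearly success is a coin flip, and count how many post a flawless ten-year record.

```python
import random

random.seed(42)   # reproducible illustration

N_EXECS = 10_000  # hypothetical population of executives
N_YEARS = 10      # each year is a 50/50 coin flip

# Count executives whose every year "succeeded" by luck alone.
flawless = sum(
    all(random.random() < 0.5 for _ in range(N_YEARS))
    for _ in range(N_EXECS)
)

# The expected count is 10_000 * 0.5**10 ≈ 9.8, so chance alone
# hands roughly ten people a perfect decade-long track record.
print(flawless)
```

None of those ten did anything different from the other 9,990, yet each of them would have a compelling story about hard work and talent.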

Affirming the consequent

Our brains slip up in trying to find patterns to explain things. Studies show that most successful business leaders are risk-takers. Does this mean that most risk-takers become successful? Ask those whose risks failed.
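The slip is conflating P(risk-taker | successful) with P(successful | risk-taker). A toy calculation, with made-up numbers chosen purely for illustration, shows how different the two can be:

```python
# Made-up numbers, purely for illustration.
risk_takers = 1_000          # people who took big risks
successful_risk_takers = 90  # the ones whose risks paid off
successful_cautious = 10     # successes among the cautious

p_risk_given_success = successful_risk_takers / (
    successful_risk_takers + successful_cautious
)
p_success_given_risk = successful_risk_takers / risk_takers

print(f"P(risk-taker | successful) = {p_risk_given_success:.0%}")  # 90%
print(f"P(successful | risk-taker) = {p_success_given_risk:.0%}")  # 9%
```

In this sketch, 90% of successes are risk-takers, yet only 9% of risk-takers succeed; the 910 failed risk-takers simply aren’t in the room when the success stories get told.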

Hindsight bias

We also ignore probability when it comes to past events. When we look back at things that have happened, it seems like they were inevitable. Just listen to any political pundit explain why a shock election result was obviously going to happen. This means we ignore the effect of random chance, instead preferring to rationalise why something happened.

Survivorship bias

If we only see the survivors of an event, our brains assume everyone survived, or at least that the survival rate was good. Take the view that stock markets just go up in the long run: imagine how rich you’d be today if you’d invested in the stock market in 1900.

But this view is based on the stock markets and companies that have survived to this day. If you really did invest in stocks in 1900, you probably would’ve bought into the developed markets of Imperial Russia or Argentina rather than the emerging United States, and most of the companies you bought into would have failed.
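A small simulation (with invented failure rates, not historical data) makes the distortion concrete: compute the average return over every company, then over the survivors only.

```python
import random

random.seed(0)  # reproducible illustration

# Hypothetical market of 1,000 companies over some long horizon.
returns = []
for _ in range(1_000):
    if random.random() < 0.6:                     # 60% go bust
        returns.append(-1.0)                      # total loss
    else:
        returns.append(random.uniform(0.0, 3.0))  # survivors gain 0-300%

survivors = [r for r in returns if r > -1.0]

avg_all = sum(returns) / len(returns)
avg_survivors = sum(survivors) / len(survivors)

print(f"all companies:  {avg_all:+.2f}")
print(f"survivors only: {avg_survivors:+.2f}")
```

An index built only from the survivors looks like a sure thing; the full population, bankruptcies included, is far less rosy.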

Mistaking a one-off for the whole

Similar to taking an all-or-nothing view, Taleb warns us against conflating details with the ensemble. This is particularly acute when you’re facing nefarious agents spreading disinformation (information deliberately spread to deceive) or unwitting redistributors of misinformation (incorrect or misleading information presented as fact). For instance, take the claim that there are Neo-Nazis in Ukraine — how relevant is that detail to the whole? What is the proportion? There are, unfortunately, Neo-Nazis in most large countries, including Russia.

Okay the problem is you take something called a ‘detail’, say an anecdote, single random event, and by emphasizing it, making you think that that detail represents the ensemble. It does not, and we are much more vulnerable to details, the salient details. Something psychologists like Danny Kahneman called the representativeness heuristic.

Nassim Nicholas Taleb, Disinformation and Fooled by Randomness

In short: an anecdote falls far short of conclusive evidence.

See also Brandolini’s Law, or the bullshit asymmetry principle, which states that the effort needed to refute bullshit is an order of magnitude larger than the effort needed to produce it.

Watch out for

All of these combine to make us ignore the possibility of rare, disastrous events like a huge market crash. It’s hard for us to comprehend something that’s never happened before. So we discard the possibility, in the same way statisticians might discard outlying results to avoid skewing an average. This is a mistake and leads to paralysis when rare events do happen.

History teaches us that things that never happened before do happen.

Nassim Nicholas Taleb, Fooled by Randomness

With thanks to Ivan Edwards who wrote most of this post. Thanks Ivan!


Gladwell, Malcolm (April 2002), Blowing Up, The New Yorker

Taleb, Nassim Nicholas (2001), Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, Penguin

Dunning-Kruger effect: Are you pickaxing to the plateau or plummeting past the peak?

This is the model that explains why, two hours into an edX Python course, you’re stuffing a backpack, searching for motels in central Menlo Park and muttering gibberish to your cat about S3 buckets and a world-beating app that’s the Uber of dentistry.

Still, on the flip side, the model offers solace when you feel like you know nothing after six months. Keep going: it gets better.

This is a phenomenon that, in the words of David Dunning, “visits us all.” It’s not a model about others, or them; it afflicts everyone. It forms part of a wider cognitive bias called naive realism. Our brains are the authors of our own reality. We rationalize. Our “self-talk” normally paints us – our ego, what we’ve done, who we are – in a glowing, radiant light.

Unfortunately, this witness cannot be trusted, m’lud.

The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club.

David Dunning

Watch out for

The popular version of the effect has drifted from the original research. The finding is “poor performers are overconfident,” not “beginners are overconfident.”

Try to think in terms of probabilities rather than certainties.

Don’t confuse facts (which can be shown to be true or false) with opinions (which can’t). Opinions can usually be prefaced with words like “I think”, “we should” and “they ought.” They are beliefs. The trouble comes when we bend the facts to fit our opinions. Facts shouldn’t bend. And, yes, that’s an opinion.

This is the is-ought problem or Hume’s law. This brilliant, animated illustration from BBC Radio 4 read by Harry Shearer (aka Principal Skinner) explains it sweetly in less than 90 seconds. The whole series is great, so go listen.

Also see the will vs skill coaching matrix. Managers will often see this behaviour.


Dunning, D. (2011). The Dunning–Kruger effect: On being ignorant of one’s own ignorance. Advances in Experimental Social Psychology, 44, 247–296
Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134
Resnick, B. (June 26, 2019). An expert on human blind spots gives advice on how to think. Vox