By Gurwinder (@G_S_Bhogal)
In 40 tweets I will describe 40 powerful concepts for understanding the world. Some are complex so forgive me for oversimplifying, but the main purpose is to incite curiosity. Okay, here we go:
Things rarely happen for just 1 reason. Usually, outcomes result from many causes conspiring together. But our minds cannot process such a complex arrangement, so we tend to ascribe outcomes to single causes, reducing the web of causality to a mere thread.
A die rolled 100 times has the same probability distribution as 100 dice each rolled once; rolling a die is “ergodic”. But if the die gets chipped after 10 throws so it’s likelier to roll 4, then 1 die rolled 100 times ≠ 100 dice rolled once (non-ergodic). Many mistakenly treat non-ergodic systems as ergodic.
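The chipped-die example can be sketched with a little arithmetic. The bias probabilities below are invented purely for illustration:

```python
# Hypothetical die that chips after 10 throws and then favors 4.
# All probabilities here are assumed, not taken from any real die.

def expected_value(probs):
    """Expected value of a die given P(face) for faces 1..6."""
    return sum(face * p for face, p in probs.items())

fair = {f: 1 / 6 for f in range(1, 7)}
chipped = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.5, 5: 0.1, 6: 0.1}  # assumed bias

# Ensemble average: 100 fresh dice, each rolled once -- all fair.
ensemble_avg = expected_value(fair)  # 3.5

# Time average: one die rolled 100 times, chipping after roll 10.
time_avg = (10 * expected_value(fair) + 90 * expected_value(chipped)) / 100

print(ensemble_avg)  # 3.5
print(time_avg)      # higher: the chipped die drags the average toward 4
```

The two averages agree only while the die stays the same; once its history changes it, the one-die-over-time view and the many-dice-at-once view diverge — that divergence is non-ergodicity.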
Awareness of the limitations of cognition (thinking) requires a proficiency in metacognition (thinking about thinking). In other words, being stupid makes you too stupid to realize how stupid you are.
When many simple objects interact with each other, they can form a system that has qualities that the objects themselves don’t. Examples: neurons creating consciousness, traders creating the stock-market, simple mathematical rules creating “living” patterns.
An ideology parasitizes the mind, changing the host’s behavior so they spread it to other people. Therefore, a successful ideology (the only kind we hear about) is not configured to be true; it is configured only to be easily transmitted and easily believed.
Mistakes grow. Beliefs are built on beliefs, so one wrong thought can snowball into a delusional worldview. Likewise, as an inaccuracy is reposted on the web, more is added to it, creating fake news. In our networked age, cumulative errors are the norm.
We overemphasize the examples that pass a visibility threshold, e.g. our understanding of serial killers is based only on the ones who got caught. Equally, news is only news if it’s the exception rather than the rule, but since it’s all we see, we treat it as the rule.
A trend can appear in groups of data but disappear when these groups are combined. This effect can easily be exploited by limiting a dataset so that it shows exactly what one wants it to show. Thus: beware of even the strongest correlations.
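The reversal is easiest to see with numbers. The figures below are from the classic kidney-stone dataset often used to illustrate the paradox; treatment A wins in every subgroup yet loses overall:

```python
# (successes, patients) per treatment, split by stone size.
data = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

# Treatment A has the higher success rate in each subgroup...
for group, t in data.items():
    print(group, rate(*t["A"]) > rate(*t["B"]))  # True, True

# ...yet the lower success rate once the subgroups are combined,
# because A was disproportionately given the harder (large-stone) cases.
total_a = (81 + 192, 87 + 263)   # (273, 350)
total_b = (234 + 55, 270 + 80)   # (289, 350)
print(rate(*total_a) < rate(*total_b))  # True
```

Which number is “the truth” depends on how the data was grouped — which is exactly why a chosen grouping can be made to show whatever its chooser wants.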
A special instance of Simpson’s paradox applied to elections, in which a populace prefers candidate A to candidate B, candidate B to candidate C, and yet candidate C to candidate A. This occurs because the majority that favors C is misleadingly divided among different groups.
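A hypothetical three-bloc electorate makes the cycle concrete — every pairwise matchup has a clear majority winner, yet the winners form a loop:

```python
# Invented electorate: each entry is (voter count, ranking from most
# to least preferred candidate).
ballots = [
    (34, ["A", "B", "C"]),
    (33, ["B", "C", "A"]),
    (33, ["C", "A", "B"]),
]

def prefers(x, y):
    """Number of voters who rank x above y."""
    return sum(n for n, ranking in ballots if ranking.index(x) < ranking.index(y))

print(prefers("A", "B") > prefers("B", "A"))  # True: A beats B, 67-33
print(prefers("B", "C") > prefers("C", "B"))  # True: B beats C, 67-33
print(prefers("C", "A") > prefers("A", "C"))  # True: C beats A, 66-34
```

No candidate beats all the others head-to-head, so “the will of the majority” depends entirely on which pairing is put to a vote.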
A common tactic by journos & politicians of revealing intriguing but relatively innocent info to satisfy curiosity and prevent discovery of more incriminating info. E.g. a politician accused of snorting cocaine may confess to having smoked marijuana at college.
Nothing is ever as important as what you’re thinking about while you’re thinking about it. E.g. worrying about a thing makes the thing being worried about seem worse than it is. As Marcus Aurelius observed, “We suffer more often in imagination than in reality.”
As a social issue such as racism or sexual harassment becomes rarer, people react by expanding their definition of it, creating the illusion that the issue is actually getting worse. I explain the process in detail here
People tend to get their information from where it’s easiest to look. E.g. the majority of research uses only the sources that appear on the first page of Google search results, regardless of how factual they are. Cumulatively, this can skew an entire field.
Arguments we’d normally reject for being idiotic suddenly seem perfectly logical if they lead to conclusions we approve of. In other words, we judge an argument’s strength not by how strongly it supports the conclusion but by how strongly *we* support the conclusion.
Phenomenon where a group goes along with a norm, even though all of the group members secretly hate it, because each mistakenly believes that the others approve of it. (See also: Abilene Paradox)
The Petrie Multiplier
In fields in which men outnumber women, such as STEM, women receive a surprisingly disproportionate amount of harassment, simply because there are more potential perpetrators than potential targets. (See also: Lotka–Volterra equations)
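The disproportion is quadratic, which a toy model makes clear. Assume (purely for illustration) that an equal small fraction of each sex harasses, and each harasser directs the same number of remarks at the opposite sex:

```python
def harassment_per_capita(n_senders, n_receivers, sexist_rate=0.1, remarks=5):
    # Total remarks sent by the sexist minority of one sex,
    # spread evenly over the members of the other sex.
    # sexist_rate and remarks are arbitrary assumed values.
    return sexist_rate * remarks * n_senders / n_receivers

men, women = 80, 20  # assumed 4:1 ratio

per_woman = harassment_per_capita(men, women)
per_man = harassment_per_capita(women, men)

print(per_woman / per_man)  # 16.0 == (80/20) ** 2
```

With a 4:1 ratio, the average woman experiences 16 times the harassment the average man does, even though both sexes are equally (un)sexist — the multiplier is the square of the ratio.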
An article makes a claim without evidence, is then cited by another, which is cited by another, and so on, until the range of citations creates the impression that the claim has evidence, when really all articles are citing the same uncorroborated source.
As the living standards in a society rise, the people’s expectations of the society rise with it. The rise in expectations eventually surpasses the rise in living standards, inevitably resulting in disaffection (and sometimes populist uprisings).
Ultimate Attribution Error
We tend to attribute good acts by allies to their character, and bad acts by allies to situational factors. For opponents, it’s reversed: good acts are attributed to situational factors, and bad acts to character.
When someone, usually an intellectual who has gained a cultish following for popularizing a concept, becomes so drunk with power he thinks he can apply that concept to everything. Every mention of this concept should be accompanied by a picture of @nntaleb.
Pattern of nature in which ~80% of effects result from ~20% of causes. E.g. 80% of wealth is held by 20% of people, 80% of computer errors result from 20% of bugs, 80% of crimes are committed by 20% of criminals, 80% of box office revenue comes from 20% of films
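The 80/20 split corresponds to a particular power-law (Pareto) distribution, which can be checked directly. Under a Pareto distribution with shape alpha > 1, the top fraction p of the population holds a share p^(1 − 1/alpha) of the total:

```python
import math

def top_share(p, alpha):
    """Share of the total held by the top fraction p under a
    Pareto distribution with shape parameter alpha (alpha > 1)."""
    return p ** (1 - 1 / alpha)

# The shape that yields exactly the 80/20 split is log base 4 of 5.
alpha = math.log(5) / math.log(4)  # ~1.16

print(round(top_share(0.20, alpha), 2))  # 0.8 -> top 20% hold 80%
```

Note the pattern is self-similar: applying the same formula within the top 20% gives the top 4% holding 64%, and so on — one reason these distributions feel so lopsided.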
When people reject a thing because it compares unfavorably to an ideal that in reality is unattainable. E.g. condemning capitalism due to the superiority of imagined socialism, condemning ruthlessness in war due to imagining humane (but unrealistic) ways to win.
Synonyms can yield positive or negative impressions without changing the basic meaning of a word. Example: someone who is obstinate (neutral term) can be “headstrong” (positive) or “pig-headed” (negative). This is the basis for much bias in journalism.
An excess of something can give rise to its opposite. E.g. A society that is too liberal will be tolerant of tyrants, who will eventually make it illiberal.
When a person sees an agreeable characteristic in something or someone, they assume other agreeable characteristics. Example: if a Trump supporter sees someone wearing a MAGA cap, he’s likely to think that person is also decent, honest, hard-working, etc.
Outgroup Homogeneity Effect
We tend to view outgroup members as all the same, e.g. an opponent of Trump who sees someone wearing a MAGA cap is likely to assume that person holds every view they associate with Trump supporters.
Advantage begets advantage, leading to social, economic, and cultural oligopolies. The richer you are the easier it is to get even richer, the more recognition a scientist receives for a discovery the more recognition he’ll receive for future discoveries, etc.
People in a hierarchy such as a business or government will be promoted until they suck at their jobs, at which point they will remain where they are. As a result, the world is filled with people who suck at their jobs.
Fallacy where someone tries to defend a concept from criticism, or dismiss it as a myth, by unduly claiming it cannot be defined. E.g. “God works in mysterious ways” (god of the gaps), “race is biologically meaningless” (Lewontin’s fallacy).
We use different mental processes in different situations, so each of us is not a single character but a collection of different characters, who take turns to commandeer the body depending on the situation. There is an office “you”, a lover “you”, an online “you”, etc.
When a measure becomes a goal, it ceases to be a good measure. E.g. British colonialists tried to control snakes in India. They measured progress by the number of snakes killed, offering money for snake corpses. People responded by breeding snakes & killing them.
Radical Phase Transition (my term)
Extremist movements can behave like solids (tyrannies), liquids (insurgencies), and gases (conspiracy theories). Pressuring them causes them to go from solid => liquid => gas. Leaving them alone causes them to go from gas => liquid => solid.
We see a complex natural system, assume that because it looks messy that it must be disordered, then impose our own order on it to make it “legible”. But in removing the messiness we remove essential components of the system that we couldn’t grasp, and it fails.
Shifting Baseline Syndrome
Frog says to Fish, “how’s the water?”
Fish replies, “what’s water?”
We become blind to what we’re familiar with. And since the world is always changing, and we’re always getting used to it, we can even become blind to the slow march of catastrophe.
When a new concept enters the arena of ideas, people react to it, thereby amplifying it. The idea thus becomes more popular, causing even more people to amplify it by reacting to it, until everyone feels the need to talk about it.
It is often necessary to eat chocolate cake.
When someone is restricted from expressing a POV, or pressured to adopt a different POV, they usually react by believing their original POV even more. For a detailed example read my piece on my attempt to deradicalize a neo-Nazi:
There is no actual movement on a TV screen; your brain invents it. There are no actual spaces between spoken words; your brain inserts them. Human perception is like predictive text, replacing the unknown with the expected.
Predictive Coding leads to…
We impose our imaginations on arrangements of data, seeing patterns where no such patterns exist.
A common form of Apophenia is…
When we see a sequence of facts we interpret them as a story by threading them together into an imagined chain of cause & effect. If a drug addict commits suicide we assume the drug habit led to the suicide, even if it didn’t.
Another form of Apophenia is…
For aeons predators stalked us in undergrowth & shadow. In such times survival favored the paranoid—those who could discern a wolf from the vaguest of outlines. This paranoia preserved our species, but cursed us with pareidolia, so we now see wolves even in the skies.