Mind Expander

Learn about concepts and ideas you've probably never heard of.

Random concept

Anna Karenina principle

"Happy families are all alike; every unhappy family is unhappy in its own way", reads the infamous opening line in Tolstoy's Anna Karenina. It reveals a general pattern in life, where an undertaking can fail in a multitude of ways, yet succeed only when everything goes right. This is called the Anna Karenina principle.

Given this asymmetry in outcomes, you would expect most endeavors that follow the Anna Karenina principle to fail - it is simply statistically more likely. But which things follow the principle?

Well, we know that things that follow the Pareto principle don't follow the Anna Karenina principle. If 20% of the inputs can contribute 80% of the results, then you can fail in almost everything but still succeed overall. If you cannot identify any factors with outsized impact on the end result - if the result truly is a sum of everything - then the Anna Karenina principle might apply.

  • Getting rich is probably not an Anna Karenina-type endeavor; randomness plays too big a part. But building a product worthy of reverence - that can only happen when everything goes right. So if you aim to just get money, apply the Pareto principle, take many shots, see what sticks. But if you aim at the latter, the Anna Karenina principle is the way to go.
  • To feel healthy in the short term, Pareto applies: any improvement in sleep, exercise and nutrition will most likely do it. But to feel healthy in your 80s or 90s, that happens only if everything goes right (including having luck on your side).

I think it's really important to distinguish which parts of your life you can 80/20 your way through, and where you really need to obsess over every detail.

Learn more:

  • Wikipedia
  • Similar observation: Idios kosmos ("The waking have one common world, but the sleeping turn aside each into a world of his own.").

Tangential goal trap

"The tangential goal trap: You can tell our relationship with learning is perverse when the main pursuit isn’t the action itself, but information about the action. This is very apparent in entrepreneurship: one pursues books, courses and content, rather than being entrepreneurial. There are many studying humanitarian aid for years, without actually aiding humanity – the main goal seems to be more knowledge about the field." (From my post: Information as a replacement for intelligence)

For most goals, there exists a related but distinctly different - and easier! - goal, trying to lure you away from the real thing. If we are not careful, we may fall into this tangential goal trap, thinking we're making great progress on our goals. Indeed, it's usually easy to identify when you've been caught by the way you describe progress:

"I haven't written much yet but I set up my knowledge management system and signed up for a writing course."

"We don't have any paying customers yet but we're constantly adding features. And we just launched the company newsletter!"

Pursue the real goal, not some easy derivative of it!

Learn more:

  • Paul Graham says: "When startups are doing well, their investor updates are short and full of numbers. When they're not, their updates are long and mostly words."
  • Related concept: Coastline paradox

Load-bearing beliefs

In construction, a load-bearing structure is one you cannot remove, or the whole building falls down. In cognition, a load-bearing belief is one you cannot shake, or a person's whole worldview falls down.

Load-bearing beliefs are deeply ingrained and seldom reassessed. But if something does manage to topple such a belief, that's a moment when a person's life trajectory changes:

  • A hard-working, career-oriented person might hold a belief that the biggest impact they can have on the world is through work. Perhaps they sacrifice their personal health and relationships for it. But then something radical might happen - like getting fired or realizing their work is net negative to society - that makes them re-evaluate their entire life (and take a year-long "spiritual journey" to Asia).
  • The mid-life crisis is likely a toppling-down of the "I'm immortal" belief, a realization that you're going to die one day. When you're young, you tend to believe you have time for everything you want to do, some time in the future. But when this belief is finally disproven, a crisis ensues.
  • There are load-bearing beliefs in matters of religion, namely "I believe" and "There is no God". When one is at their darkest hour, they might give up their belief and become someone new - the stereotypical person who 'finds the light', or who comes to believe God has abandoned them.

When a person's load-bearing belief is struck down, they either rebuild themselves into something new, or they cognitive-dissonance their way through, as letting go of the belief would ruin them mentally.

Learn more:

Prime number maze

Rats in a maze can be taught to turn left every other time, or every third time, or on even numbers, and so on. But you cannot train them to turn based on prime numbers; the concept is too abstract, and it cannot be taught to a rat.
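
The rules themselves are equally easy to state; the prime rule is no harder to write down, only impossible for the rat to grasp. A small illustrative sketch (the rule names are my own):

```python
def every_other(step):
    """Turn left on steps 2, 4, 6, ... - learnable by a rat."""
    return step % 2 == 0

def on_primes(step):
    """Turn left on steps 2, 3, 5, 7, 11, ... - not learnable by a rat."""
    if step < 2:
        return False
    return all(step % d != 0 for d in range(2, int(step ** 0.5) + 1))

# 1 = turn left, 0 = go straight, for the first 12 steps
print([int(every_other(s)) for s in range(1, 13)])  # [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
print([int(on_primes(s)) for s in range(1, 13)])    # [0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0]
```

Both sequences are fully deterministic; the difference is only in the conceptual machinery needed to see the pattern.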

There can be a very clear, logical, simple rule to get out of the maze, and with just a bit more cognitive ability, the rat could follow it - or at least be taught to follow it. But it is doomed to stay stuck in the maze, for this particular idea is outside its conceptual range.

Likewise, what are the very clear patterns that keep us humans trapped in a maze? How could we acquire just a bit more awareness, a bit more perspective to see the simple escape from the mazes we are in?

Big revelations seem simple in hindsight, once you have developed the cognitive ability, once you have cracked the pattern. Much of life advice passed on by the elderly seems very simple to them, but often this advice is not followed by the young, for they do not see the pattern the elderly do - they only receive the words. Only via experience - via first-hand observation of the pattern - can much be learnt. But can the knowledge, the awareness nevertheless be sped up? Can you twist and turn, explore the maze, look through it with new eyes and metaphors, until you crack the pattern? And once you do, can you explain it with such simplicity that others become aware of the pattern, too?

“Once you see the boundaries of your environment, they are no longer the boundaries of your environment.”

Learn more:

Doorman fallacy

Rory Sutherland, one of the great thinkers on behavioral economics and consumer psychology, explains the "doorman fallacy" as a seemingly reasonable cost-saving strategy that ultimately fails due to a disregard of the unmeasurable.

If you understand a doorman simply as a way to open the door, of course you can yield cost-savings by replacing him with an automated system. But if you understand a doorman as a way to elevate the status of the hotel & create a better guest experience (hail taxis, greet guests, carry bags...), then you realize the full role of the doorman cannot be replaced with an automated system.

Another example: consider a big tech company that, amid financial pressure, decides to cut spending on free lunches, bus transportation to work, on-site massages or other benefits. On paper, the company quickly generates a hefty cost saving, but it might fall victim to the doorman fallacy. The short-term savings come at the expense of its brand as an employer that provides unrivaled perks and places employee satisfaction over investor satisfaction, and the long-term damage to attracting and retaining talent may cost the company far more in the years to come.

You can think of the doorman fallacy as an extension of Chesterton's fence: getting rid of the fence (the doorman) without fully understanding why it was there in the first place.

Learn more:

  • Signaling theory
  • Book: Alchemy by Rory Sutherland

Levinthal's paradox

Levinthal's paradox is that if we tried to predict how a protein folds by searching through all the decillions of possible conformations, it would take us an astronomically long time. Yet proteins in nature fold in seconds or less. What seems impossible to humans, paradoxically, happens all the time in nature, without us understanding how it is possible.
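
The numbers behind the paradox are easy to sketch. Assuming, for illustration, a 100-residue protein with about 3 conformations per backbone angle and a wildly generous sampling rate of 10^13 conformations per second, an exhaustive search would still take far longer than the age of the universe:

```python
residues = 100
conformations = 3 ** (2 * (residues - 1))  # ~3^198 possible backbone states
rate = 1e13                                # conformations sampled per second (generous)

seconds = conformations / rate
years = seconds / (60 * 60 * 24 * 365)
print(f"{float(conformations):.1e} conformations, ~{years:.1e} years to search them all")
```

Nature, of course, does not search exhaustively - that is the whole point of the paradox.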

The paradox is a reminder of nature's immense "wisdom" and our lack thereof in comparison. Nature finds a way, even if we cannot see how. Often, then, the default action for us should be inaction - to let nature do its thing with minimal human intervention. (Related: iatrogenesis)

One can try to chase longevity with a diet of pills, an army of scientists tracking every possible thing about their body, an insane daily routine… Only to be outlived by a random rural person just trying to build stuff with their own hands and eating what they grow or hunt - as nature intended.

Even with constantly improving technology and theory, we ought to be humble and appreciate that nature might be smarter still.

(Note: Since the paradox was proposed, AI has advanced our understanding of protein folding dramatically.)

Learn more:

Coastline paradox

You'd think it would be easy to define the length of a coastline of any given country. Just look at the map and measure the length of the coast, right?

Turns out, it is notoriously difficult to measure this. The closer you look, the longer the coastline.
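
You can see the effect with an idealized fractal coastline, the Koch curve: every time you shrink the measuring ruler by a factor of 3, each segment resolves into 4 smaller segments, so the measured length grows by 4/3 - without bound. A minimal sketch:

```python
ruler, length = 1.0, 1.0
for _ in range(6):
    print(f"ruler size {ruler:.4f} -> measured length {length:.3f}")
    ruler /= 3          # zoom in: measure with a 3x shorter ruler
    length *= 4 / 3     # each segment resolves into 4 sub-segments of 1/3 length
```

Real coastlines aren't perfect fractals, but the same pattern holds over a wide range of scales.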

Image: The coastline paradox (Sketchplanations)

At what point do you stop zooming in? The problem is fractal!

Many problems in personal life or business possess a similar quality: seemingly straightforward at first, but increasing in complexity the closer you look. For example, say you want to be healthier. How should you exercise? How often and when? Should you take supplements or vitamins? What is best for your unique body composition or genetic tendencies? Or if you want to write content online, you can research all the keywords, best website builders, headline formulas, writing advice from gurus, note-taking approaches and tools... until you spend 5% of your time writing and 95% zooming in on the problem!

Suddenly you're overwhelmed and won't even start.

The solution? Just go for a decent ballpark - the 80% solution - and stop overanalyzing. In the health example, just sleep more, move more, eat better, and forget about the small stuff. Don't get sucked into the fractal.

Learn more:

Quantum Zeno effect

The Quantum Zeno effect takes its name from one of Zeno's paradoxes, proposed by the ancient Greek philosopher: if you observe an arrow in motion at any given instant, it doesn't appear to be moving, and since time is made of these infinitely short snapshots of idleness, there can be no motion.

Of course, motion exists and this is philosophical nonsense, but thanks to it, we have a cool name for the Quantum Zeno effect, which states that you can slow down a quantum system's evolution - such as a particle's transition or decay - by measuring it more frequently.

The same idea can be applied more widely:

  • The more often you look at your investment portfolio or any analytics, the less seems to change. Come back a month or a year later and things have changed - so what's the purpose of looking daily?
  • You cannot see the effects of your exercise programme in the mirror, but you can see them by comparing against a photo taken 5 months ago.
  • The more often you check on progress with your team, the more you interrupt their work with reports and check-in meetings, thus slowing down progress.

Learn more:

Enantiodromia

Enantiodromia is an idea from Carl Jung that things tend to change to their opposites, that there is a cyclical nature between things.

  • "Hard times create strong men, strong men create good times, good times create weak men, and weak men create hard times"
  • The more authoritarian a government, the more it gives rise to a rebellious movement
  • We use antibiotics to kill bacteria, which leads to the extreme, antibiotic-resistant bacteria
  • When you're dying of cold, you get a feeling of warmth; when you're really hot (like in a sauna), you might suddenly feel cold as your body adapts to the heat
  • If there was an extreme of something in your childhood, that might give rise to the opposite in adulthood
  • Cultural trends: Girlbossing vs trad-wifing, the "feminine" man vs "masculine" man ideals, an excess of rationality and modernism leads to wistful thoughts about the old ways...
  • “The empire, long divided, must unite; long united, must divide”

When there's a dominant trend, there is always a counter-trend. A mainstream opinion and the contrarian. Shift and countershift.

Learn more:

  • Wikipedia
  • Horseshoe theory: the political idea that the far-right and far-left resemble each other, rather than being polar opposites

Rohe's theorem

Rohe's theorem: "Designers of systems tend to design ways for themselves to bypass the system."

Or more generally: those who create or advance a system seek to prevent themselves from being exposed to it, perhaps as they have intimate knowledge of the faults or the predatory nature of the system.

Examples:

  • Government officials can exempt themselves from certain rules that other citizens must abide by
  • Big tech employees who block or limit the social media usage of their kids / want to move “off the grid”
  • (Fast food) restaurant workers who rarely eat out
  • Doctors who prescribe medicine but don't let their own kids take meds
  • Teachers who homeschool their children
  • Insurance company workers who don't have insurance themselves

In some cases, the following sayings relate: "never get high on your own product" and "Those who say the system works work for the system."

Learn more:

  • Book: The Systems Bible by John Gall

Ostranenie

Ostranenie (also known as defamiliarization) is about presenting familiar things in unfamiliar ways, in order to generate new perspectives or insights. One could argue that most art aims at this.

Stark example: a movie where the roles of men and women are swapped, so the viewer becomes more aware of the gender roles of the present.

Subtle example: any argument or idea that is presented with the help of a story or metaphor. A speech that just says "work hard, overcome obstacles" is boring and uninspiring, as it is nothing new, but a movie about a successful person who worked hard and overcame difficulties is inspiring, as it brings a new frame to the idea.

I suspect modern books are full of 10-page examples and stories for this reason: they don't have any new ideas, so they settle for stories. (That said, a cliché reframed can be a revelation - but it requires more from the artist.)

Anything can be a line for a great comedian because they can describe it in a funny, new way.

Learn more:

Gardener vs carpenter

Alison Gopnik observes two opposing parenting approaches: the gardener and the carpenter. The carpenter tries to lead the child down a "good path", crafting them into a well-functioning human being using all the parenting tricks in the book. The gardener is more concerned with creating an environment where the child can develop safely and healthily, letting the growth happen organically.

Carpenter parenting is one-way: the adult molds the child into what they believe is best. Gardener parenting is interactive: the adult creates the conditions in which the child can become themselves. Every parent knows that if you try to "force teach" something to a child, they will probably not learn it - but if you let them learn it themselves...

We underestimate how wise nature is. A child is the most effective learner in the world, by design - so let them learn on their own, instead of giving instructions they don't need, and trapping their minds in the process. By prescribing age-appropriate activities and scheduling the child's weeks as if they were at a training camp, you rob them of their own autonomy and creativity. You can do more harm when you try harder!

A bird doesn't need a lecture on flying, and a child doesn't need you to explain how to play with cubes or a cardboard box. Where else are we trying so hard, when we probably should just relax a bit and let nature do its magic?

Learn more:

  • Book: The Gardener and the Carpenter by Alison Gopnik

Thermocline of truth

In large organizations, the key to succeeding is to do great work. Or, at least, to appear so. That's why the culture might seem fake: everyone is doing super-amazing, everything is roses and sunshine!

This is due to the thermocline of truth effect, whereby people tend to report only the positives and hush up the negatives. Just as warm water rises while the cold stays at the depths, upper management believes everything is going great while the people who do the work know better.

It is hard to resist this effect, as everyone has an incentive to appear successful to their peers and managers. Therefore, things inside an organization resemble a watermelon: green on the outside, red on the inside.

Learn more:

Diseases of affluence

As you get wealthier, your life tends to get more comfortable. And as you remove some discomforts, certain diseases take their place. While wealth or affluence isn't the real cause - behavior is - you may have noticed some of these patterns:

  • More income > eat out / take-out more > restaurant food is often unhealthier than home-cooked > diseases of affluence
  • Don't feel like cleaning / mowing the lawn > more income means now you have a choice to hire someone > lose opportunities for exercise > diseases of affluence
  • Buy car / electric bike instead of cycling, or walking to the public transport > diseases...

Even if you could afford to live more comfortably, should you?

"Advancements to make our lives less physically taxing have taxed us physically." - Katy Bowman

Learn more:

Precautionary principle

New things tend to be, by default, risky because limited evidence exists as to their full consequences. The precautionary principle is to remain skeptical and cautious of things that may have significant consequences, especially as some might be hidden and currently unforeseen.

The general idea is that when you change or add something, you introduce complexity, and we usually cannot predict all the results of this complexity. So there should be strong evidence or reason for change, otherwise the potential negative results may outweigh the positives.

Suppose your friend recommends you stop eating meat and get your protein from insects instead, for ecological reasons. The precautionary principle here would have you question this recommendation, given that the health impacts of meat consumption are known but the impact of eating insects in large quantities isn't.

Yes, you may receive the same amount of protein, but you would introduce significant change to the enormously complex system that is the body, so there are likely to be hidden consequences. Some of them might be positive, some disastrous, or maybe everything would be fine - we don't know!

The precautionary approach doesn't necessarily mean you're "stuck in your ways", never try new things and never innovate. Rather, it's a more risk-averse approach at trying new things and innovating; perhaps a more humble way to do things, as you appreciate the limits of human knowledge amid vast complexity.

Learn more:

Hysterical strength

Hysterical strength is the (anecdotal) observation that people have superhuman strength when under intense pressure. For example, an otherwise normal person being able to lift a car to save their family member who got stuck under. Hysterical strength is the "break in case of emergency" of human anatomy.

The implication here is that we have a huge reservoir of strength that we cannot tap into under ordinary circumstances. Why? Possible explanations:

  • Self-protection: we are strong enough to accidentally break our bodies if our strength isn't regulated, so there might be an unconscious "Central Governor" who protects us... from ourselves.
  • Energy-conservation: if we could use 100% of our strength at will, would we have the strength when we really needed it? It's evolutionarily smart to always leave a bit of gas in the tank, just in case.

You can apply the idea to mental strength as well (sometimes called "surge capacity" in this context). You can work scarily hard when it's required of you. Today you think you're working your hardest; next week multiple life stressors hit simultaneously, you must push through, and you realize you had perhaps double the capacity you thought you had.

But like with your body, there's probably a good reason why we cannot use this strength at will, at least easily. Intense stress causes mental damage like burnout and dissociation, similar to how intense physical stress leads to torn muscles and tendons.

How can knowledge of hysterical strength's existence benefit us?

  • Use constraints smartly to make progress faster than you thought possible. If you want a clean home, invite everyone over in 2 days. If you want to finish your presentation, schedule a meeting with your director for next week.
  • You don't need to give up when your body tells you to. If I'm shoveling snow or chopping wood and I get tired, I know I have at least 2 hours left in me. (Just don't get injured in the process.)
  • Take on bigger challenges, your mind and body can probably handle it. Just ensure you get enough rest after.

Many people who have been "tested" are in part happy about it, as they now know what they are capable of. It brings confidence to know that you are much stronger than you think.

Learn more:

Copyright trap

A copyright trap is usually a fake entry in one's original work, designed to trap copycats. If their work also has the fake entry, that means they copied your work - how else would they have the same fake information?

This technique can take many forms. A mapmaker can include a non-existent street or town. A dictionary can have fake words, while a telephone book could have fake phone numbers. A book could reference academic articles, facts or experts that don't exist.

(There are other cool, related practices: rug makers sometimes purposely add a tiny flaw to prove the rug was created by hand, not by a machine. Will something similar happen to writing - adding elements that the reader could fairly reliably assume an AI would not add?)

Elon Musk famously used a related technique, a canary trap, to find a Tesla leaker.

Learn more:

Kolmogorov complexity

Kolmogorov complexity measures the randomness or unpredictability of a particular object: formally, it is the length of the shortest possible description (or program) that produces the object.

Consider the following strings of 20 characters:

  1. jkjkjkjkjkjkjkjkjkjk
  2. asifnwöoxnfewäohawxr

The first string has lower complexity, as you could describe it as "write jk 10 times", while the second string has no discernible pattern, so you could only describe it as "write asifnwöoxnfewäohawxr".
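
True Kolmogorov complexity is uncomputable, but the size of a compressed file is a common practical stand-in: a patterned string shrinks dramatically, a random-looking one barely at all. A rough sketch using Python's zlib:

```python
import random
import zlib

patterned = b"jk" * 1000                       # "write jk 1000 times"
rng = random.Random(0)
random_ish = bytes(rng.randrange(256) for _ in range(2000))

print(len(zlib.compress(patterned)))   # a few dozen bytes: the pattern compresses away
print(len(zlib.compress(random_ish)))  # close to 2000: no pattern to exploit
```

Compression is only an upper bound on the true complexity, but it captures the intuition well.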

You could assess the originality or quality of information via this type of complexity. For example, the shorter the book summary, the worse the book. If you can cut down 99% of the length while still containing the same key information, that tells something about the quality of the book.

By contrast, the longer the book summary, the better the book, as it implies there are more unique ideas that are taken to a greater depth, warranting more time to explain them properly.

Learn more:

Nutpicking

Nutpicking is a tactic to make the opposing side seem wrong/crazy.

Step 1: Pick the craziest, most extreme example or person from the opposing side

Step 2: Claim that this example/person represents the entire opposing side. Therefore, the entire enterprise must be crazy!

Of course, an extreme example doesn't represent the entire group - that's why it's an extreme example. But some have a hard time realizing this. If you see the nutpicking tactic being used, probably best to exit the conversation.


Learn more:

Semantic stopsign

A semantic stopsign essentially says: "Okay, that's enough thinking, you can stop here". It's a non-answer disguised as an answer, whose purpose is to stop you from asking further questions and revealing the truth.

Imagine your friend claims you should rub mud on your face before heading outside, as it will protect your skin from the sun and keep it hydrated.

"Mud? Are you crazy? Why would I do that?"

"It's recommended by scientists."

"Oh... well in that case."

That's a semantic stopsign. It's not actually giving you an answer for why you should rub mud on your face, rather the purpose is to stop you from asking more questions.

Learn to recognize these stopsigns and disobey them. Often the reason someone presents you with a semantic stopsign is because there's an uncomfortable truth close by. When a politician is asked a tough question and they answer "to defend democracy" or "to fight terrorism", they aren't really giving you an answer - they are saying "okay, that's enough questions, you can stop here".

When someone hits you with a stopsign, your natural reaction is to obey and move on. Next time, do the opposite and see where that leads.

Learn more:

Mean world syndrome

You may think the world is more dangerous than it is, as you've been consistently exposed to news about crime, violence and terror.

One may rationally know that the news is mostly about the extraordinary (the ordinary is rarely newsworthy), yet still unconsciously have a biased view of the world as a dangerous place.

Remember that for every instance of danger, another of kindness exists - though it's probably not televised. I believe whether you are pessimistic or optimistic is mostly a question of how well you understand this idea of unlimited evidence and limited portrayal.

Learn more:

Politician's syllogism

The politician's syllogism is of the form:

  1. We must do something.
  2. This is something.
  3. Therefore, we must do this.

Similar form:

  1. To improve things, things must change.
  2. We are changing things.
  3. Therefore, we are improving things.

Watch out for this fallacy in politics but also in corporate speeches and work meetings. The lure of acting is strong, even if sometimes not intervening might be for the best. Suggesting inaction rarely ends well in politics/business, hence the existence of fallacies like this.

Learn more:

Fruit of the poisonous tree

Fruit of the poisonous tree is a legal metaphor used to describe evidence that is obtained illegally. The logic of the terminology is that if the source of the evidence (the "tree") is tainted, then anything gained from it (the "fruit") is tainted as well. Therefore, one may not need to refute the evidence, just prove that it was gained illegally.

Similarly, at least in my mind, if you get money or fame in the wrong way, it doesn't count. The success becomes tainted and ceases to be success.

Learn more:

Autopoiesis

The term autopoiesis refers to a system capable of producing and maintaining itself by creating its own parts. A cell is autopoietic because it can maintain itself and create more cells. Imagine a 3D printer that could create its own parts and other 3D printers.

An autopoietic system is self-sufficient, thus very robust. Some people describe societies and communities as autopoietic; some even offer bitcoin as an example of autopoiesis. But forget about following definitions too strictly for a second. If you apply the concept of autopoiesis liberally to your life, does it change how you shape your environment and lifestyle, your "personal system"? Does it help you become resistant to external circumstances?

I want to have as tight a hermetic seal in my life as possible:

  • Build my house and furniture from wood that grows in my backyard. If anything needs repairing, I already have the material.
  • Eat what I grow from my land, and use any food waste as compost for the land to create more food. (Or feed excess apples and crops to deer that I hunt for meat.)

Learn more:

Affordances

A crucial part of human learning is understanding how to interact with the environment. Designers of objects can make this easier for us. An affordance is a property or feature of an object that hints what we can do with the object.

  • If a cord has a button on it, you push it. If a rope hangs from a lamp, you pull it.
  • If the door has a knob, you turn it. If it has a handle, you press it down. If it has a metal plate and no handle, you just push the door.
  • If a word in an article is blue, you can click on it and be taken to a different page.

When you take advantage of affordances, people know what to do. No instructions needed. The less you need to explain, the better the design.

Learn more:

Fredkin's Paradox

Fredkin's Paradox states that "The more equally attractive two alternatives seem, the harder it can be to choose between them—no matter that, to the same degree, the choice can only matter less."

If you had two really different options, it'd be easy to choose between them. But when they are similar, choosing becomes difficult. However, if the two options really are similar, it shouldn't matter too much which you choose, just go with one. The impact of choosing one over the other will be so small that any more time spent analyzing is time wasted.

Because of Fredkin's Paradox, we spend the most time on the least important decisions.

Learn more:

Fosbury Flop

Before 1968, high jumpers used many techniques to get over the bar, though with most, you'd jump forward and land on your feet. Then Dick Fosbury won Olympic gold with his "Fosbury flop" technique, which involves jumping backward and landing on your back. Ever since, the Fosbury flop has been the most widespread and effective technique in high jumping - it's the one you have probably seen on TV if you've watched high jumping.

A Fosbury Flop moment happens to an industry or area when a new innovation, approach or technique is introduced that then becomes the dominant option. These are breakthrough improvements that change the course of history.

Learn more:

First penguin

When penguins want food of the fishy kind, they may need to jump off an iceberg into the water. The problem is that they can't see what lies beneath - a predator might be waiting for a group of penguins to fall straight into its belly. The group needs to decide whether to find food elsewhere or jump into the unknown.

This is where the "first penguin" (sometimes called "courageous penguin") stands up. They take one for the team and jump off the cliff, while the rest wait to see whether the water is clear. The first penguin takes a huge personal risk to ensure the survival of the group.

Entrepreneurs and risk-takers take a similar leap into the unknown, perhaps with even worse odds. While each risk-taker may not survive (for example, few startups succeed), the world is a better place because these risk-takers exist.

Image: Sebastião Salgado, Chinstrap penguins on icebergs between Zavodovski and Visokoi islands, South Sandwich Islands, 2009 (Yancey Richardson)

Learn more:

Expected historical value

Paul Graham wrote on Twitter:

"Let's invent a concept.

Expected historical value: how important something would be if it happened times the probability it will happen.

The expected historical value of current fusion projects is so enormous. I don't understand why people aren't talking more about them."

This concept, though similar to the original concept of expected value, puts a spin on it. Expected historical value is a concept to assess the future importance of things today.

What thing today could become "history" in the future? Sure, it may not get much attention now, but could it be recognized by the future as ground-breaking? What innovation, technology or idea, little discussed today, will be discussed a lot in the future?

There's a personal life spin on this as well: What experience or moment today could become a fond memory 50 years from now? It may not be the moments you'd expect (like a fancy trip) but an ordinary moment (like going for a midnight swim).

Learn more:

Confusopoly

A confusopoly is an industry or category of products that is so complex that an accurate comparison between products is near-impossible.

If you try buying a mobile phone plan from a telecom operator, you face endless combinations of phone tiers, texting plans and data limits. Or if there's a new video game launching, you could get an early-bird version, or the early-bird collector edition, or just the collector edition, or maybe a deluxe version which has some features of the other options but not all of them. If you try to buy insurance or banking/finance products, the options are so confusing that you're inclined to just choose one at random and stick with it for life.

My favorite example is toilet paper. Different brands have a different number of rolls in their packages, and they may have a different amount of paper in each roll, and maybe they'll have a value pack, but the normal pack may have a "get 1 roll free" promotion, but the other brand has a 20% discount, and then there's the matter of softness and thickness... In the end, you just give up and go with the most visually pleasing packaging.
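One escape from a confusopoly is to normalize every offer to a single unit price. A minimal sketch (all the brands, pack sizes and prices below are invented):

```python
def price_per_sheet(pack_price: float, rolls: int, sheets_per_roll: int,
                    free_rolls: int = 0, discount: float = 0.0) -> float:
    """Collapse rolls, freebies and discounts into one comparable number."""
    total_sheets = (rolls + free_rolls) * sheets_per_roll
    return pack_price * (1 - discount) / total_sheets

# Invented offers:
offers = {
    "Brand A, 8 rolls + 1 free": price_per_sheet(4.50, 8, 150, free_rolls=1),
    "Brand B, 12 rolls, 20% off": price_per_sheet(6.00, 12, 130, discount=0.20),
    "Brand C, 16-roll value pack": price_per_sheet(7.20, 16, 140),
}

# Cheapest first - the gimmicks cancel out once everything is cents per sheet.
for name, unit_price in sorted(offers.items(), key=lambda kv: kv[1]):
    print(f"{name}: {unit_price * 100:.3f} cents per sheet")
```

Of course, the whole trick of a confusopoly is to make even this normalization tedious enough that most people give up.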

Learn more:

Applause light statements

An applause light statement is designed to gain the support or agreement of an audience, the same way an actual applause light or sign indicates to the studio audience that this is the part where you should start clapping.

Usually there isn't much substance behind an applause light statement, and you can spot this by trying to reverse the statement. Consider someone saying "we need to ensure technology benefits everyone, not just a few people". If you reverse it, you get "we need to ensure technology benefits a few people, not everyone". Since the reversed statement sounds odd and abnormal, the original statement is probably conventional knowledge or the conventional perspective and doesn't contain new information. The statement isn't designed to deliver new information or substance, just to gain support.

An abundance of applause light statements is characteristic of empty-suit business talk, the "inspirational speaker" and the charismatic politician: very little substance, merely an attempt at influencing perception. Sometimes, though, applause light statements can be useful, like when you want to establish common ground or set the stage for an argument. But most of the time, these statements are merely the "easy way" to write a speech and influence an audience.

And yes, applause light statements are more common in speech than in writing. In writing, the reader has more time to process what you said and call you out if you have zero substance behind your words. In speeches, you don't have this luxury, and it's pretty hard to hold a negative opinion of a speech if everyone else around you was nodding and applauding the entire time.

Eliezer Yudkowsky, who coined the term, wrote a tongue-in-cheek speech using only applause light statements. It looks eerily similar to just about every speech I've ever heard:

"I am here to propose to you today that we need to balance the risks and opportunities of advanced artificial intelligence. We should avoid the risks and, insofar as it is possible, realize the opportunities. We should not needlessly confront entirely unnecessary dangers. To achieve these goals, we must plan wisely and rationally. We should not act in fear and panic, or give in to technophobia; but neither should we act in blind enthusiasm. We should respect the interests of all parties with a stake in the Singularity. We must try to ensure that the benefits of advanced technologies accrue to as many individuals as possible, rather than being restricted to a few. We must try to avoid, as much as possible, violent conflicts using these technologies; and we must prevent massive destructive capability from falling into the hands of individuals. We should think through these issues before, not after, it is too late to do anything about them..."

Learn more:

Conjunction fallacy

When something is more specific, more detailed, we tend to think it is more probable, even though specific things are actually less probable. This is the conjunction fallacy - not understanding that there is an inverse correlation between specificity and probability.

For example, consider your standard marketing prediction article - they'll say something like "next year, more will be spent on digital advertising as big companies finally shift their TV budgets to online channels". This may sound logical, but it is of course more likely that companies will spend more on digital advertising next year for any reason than for that one specific reason.

Details pit your mind and gut against each other: your gut wants to believe the narrative because it sounds more believable with all those details - you can follow the steps to the conclusion. But your mind should realize that the more specific the prediction or narrative, the less you should believe it.
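The underlying arithmetic is the conjunction rule: P(A and B) = P(A) × P(B given A), which can never exceed P(A). A toy illustration of the marketing example above (the probabilities are invented):

```python
# Invented probabilities for the marketing-prediction example.
p_growth = 0.6                 # P(A): digital ad spend grows next year, for any reason
p_tv_shift_given_growth = 0.3  # P(B|A): that growth comes from TV budgets moving online

# Conjunction rule: the detailed story is A *and* B happening together.
p_detailed_story = p_growth * p_tv_shift_given_growth

# Adding a condition can only shrink the probability.
assert p_detailed_story <= p_growth
print(p_growth, p_detailed_story)
```

Every extra detail multiplies in another factor of at most 1, so the vivid, step-by-step narrative is always at most as probable as the bare claim.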

Imagine someone - perhaps a billionaire or someone with a big Twitter following - explaining their success. The more specific the "blueprint" they claim is the reason for their success, the less probable it actually is that their success can be attributed to it. The secret formula for their success sounds more convincing as they add detail, but it also becomes less probable as the actual cause for their success. Kind of defeats the purpose of buying their "7-step framework" master course, no?

Learn more:

Wittgenstein's Ruler

If you're measuring your height with a ruler and it shows you're 4 meters tall (13 feet), you gained no useful information about your height. The only information you gained is that the ruler is inaccurate.

Wittgenstein's Ruler is the idea that when you measure something, you are not only measuring the measured (your height), but also the measurer itself (the ruler). You might be getting more information about the measurer than the measured!

Another example: the fact that an employee you know to be brilliant receives a mediocre performance rating doesn’t necessarily mean they are mediocre, it could simply mean the performance rating process is broken.

Learn more:

Potemkin village

A Potemkin village is a country's attempt to signal that it's doing well, even though it's doing poorly.

Wikipedia: "The term comes from stories of a fake portable village built by Grigory Potemkin, former lover of Empress Catherine II, solely to impress the Empress during her journey to Crimea in 1787. While modern historians agree that accounts of this portable village are exaggerated, the original story was that Potemkin erected phony portable settlements along the banks of the Dnieper River in order to impress the Russian Empress; the structures would be disassembled after she passed, and re-assembled farther along her route to be viewed again as if another example."

Consider a more modern example: Turkmenistan's capital city, Ashgabat, with enormously expensive buildings, hotels, stadiums, indoor Ferris wheels and 18-lane roads... that almost no one uses. Or consider how, for the 1980 Olympics in the Soviet Union, only the road-facing walls of houses on the way to Moscow were painted. Why bother with the entire house when one wall creates the perception? Similar stories exist from newer Olympics; global events are a juicy opportunity for government signaling.

The term Potemkin village has been used more metaphorically to refer to any construct aimed at signaling that you're doing well when you're not. For example, consider a creator trying to signal that their newsletter is doing well - they'll talk of "readers" and visitors to their website and other Potemkin metrics, but they won't talk of subscribers or money.

Learn more:

Charitable interpretation

Imagine someone presents an argument to you. You have roughly two ways to interpret it:

  • Charitable interpretation: Consider the strongest, most solid version or implication of the argument. You start by assuming that the argument might be true, and you try your best to see how it could be.
  • Uncharitable interpretation: Consider the weakest possible version of the argument. You start by assuming that the argument is wrong, and you try your best to confirm this assumption.

Open-minded people generally interpret what they hear charitably, while closed-minded people stick to what they think is right and don't bother entertaining other arguments. An open-minded person asks "how could this be right?"; a closed-minded person asks "how is this wrong?"

Especially on Twitter, snarky repliers will interpret your tweet in the least favorable way possible, identify a weakness in your tweet (which is easy when you've already weakened the tweet with your interpretation), then feel smart when they point out that weakness to you. Of course, everyone else thinks them a fool.

One would seem much smarter if they interpreted the tweet in the most charitable way possible, then identified a weakness in it and expressed it clearly.

Learn more:

Rare enemy effect

The rare enemy effect is a niche but interesting phenomenon in predator-prey interactions.

Consider the case of earthworms. Their number one enemy is the mole (a mole eats almost its body weight in worms each day). So worms have adapted to recognize an approaching mole by sensing the vibrations it makes as it digs through the soil. When they sense it, the worms escape to the surface, where mole encounters are unlikely.

However, other predators of worms have figured this out. For example, seagulls perform a "dance" to vibrate the ground, bringing unwitting worms up to the surface, ready for snacking. Wood turtles do a similar thing. Humans, in a practice called "worm charming", drive a stick into the ground and vibrate it to summon the worms, usually to collect them for fishing (though some do it for fun and sport).

The rare enemy effect happens when Predator A exploits the prey's anti-predator response to Predator B (the main predator). It must be that Predator A is a rather rare encounter for the prey (hence the name of the effect); if they were a main predator, the prey would eventually develop an anti-predator response to them, too.

If you look closely, you can see a similar effect in the human realm, too:

  • One (rather known) trick burglars may perform on you is very similar to what seagulls do. Imagine a man in a suit knocks at your door, informing you that burglars have been spotted in this neighborhood. But you can protect yourself by buying his company's security system. Worry not - before you commit to anything, he can do a free security assessment right away, inspecting potential points-of-entry (and where you keep your valuables). Your anti-predator response is to protect yourself from the burglars, so you let the man in. Whether you sign any contracts or not, you may find your house broken into within a few days.
  • A similar scam happens digitally all the time. You receive an email saying that your passwords or information security are in danger, and scammers can steal your money or data if you don't act. Of course, acting in this case means paying another scammer, or giving him access to your or your company's accounts, where he will change all the passwords and demand a ransom to let you back into your own systems.
  • Naturally, there are legal forms of exploiting one's anti-predator response, namely fear-mongering advertisements or the exaggerating (real) door-to-door home security salesman.

And do not think that these processes are limited to capturing your money; they want your support, information, attention and time, too. If there's a predictable anti-predator response, there's someone who exploits that response, legally or illegally.

Learn more:

Shibboleth

A shibboleth is a word or a phrase that distinguishes one group of people from another.

Wikipedia has many examples (some rather funny): 

  • Some United States soldiers in the Pacific theater in World War II used the word lollapalooza as a shibboleth to challenge unidentified persons, on the premise that Japanese people often pronounce the letter L as R or confuse Rs with Ls. A shibboleth such as "lollapalooza" would be used by the sentry, who, if the first two syllables come back as rorra, would "open fire without waiting to hear the remainder".
  • During World War II, a homosexual US sailor might call himself a "friend of Dorothy", a tongue-in-cheek acknowledgment of a stereotypical affinity for Judy Garland in The Wizard of Oz. This code was so effective that the Naval Investigative Service, upon learning that the phrase was a way for gay sailors to identify each other, undertook a search for this "Dorothy", whom they believed to be an actual woman with connections to homosexual servicemen in the Chicago area.
  • In Cologne, a common shibboleth to tell someone who was born in Cologne from someone who had moved there is to ask the suspected individual, Saag ens "Blodwoosch" (say "blood sausage", in Kölsch). However, the demand is a trick; no matter how well one says Blodwoosch, they'll fail the test; the correct answer is to say a different word entirely; namely, Flönz, the other Kölsch word for blood sausage.

Many professions have shibboleths as well, to distinguish those who are experienced or knowledgeable from those who are not. These may not be used in a password-like manner, but rather like this: "if they use certain words or concepts correctly, it's safe to assume they aren't new to this field".

Other times, shibboleths are used as signals of one’s belonging to or belief in a cult – like “wagmi, gm, fud, diamond hands” in the NFT/crypto space.

Learn more:

Relevance theory

Relevance theory explains why a word or phrase can convey much more than its literal meaning. For example, imagine you own a really bad car - it breaks down often, looks ugly and can barely hold itself together. You swear you'd demolish it yourself if only you had the money to buy a new one. Your neighbor asks what you think of your car and you say "Well, it's no Ferrari". You say very little, but they understand the message.

The reason we can convey more than we say is self-assembly. When something is communicated to you, you add relevant information from the context to the message (who said it, how they said it, who else is there, what has been said before, previous experiences...) until you arrive at the final message. So only a small part of the intended message needs to be said explicitly; the rest self-assembles. Just think of an inside joke - one word can be enough to convey an entire story.

Once you understand this effect, you have a framework to understand different types of communication.

For example, a good joke gives away a strong enough connection that you understand what is meant, but the connection must be weak enough not to ruin the punchline. A joke where the connection is too strong isn't fun because you can guess the punchline; there is very little self-assembly required of the receiver.

A good aphorism (or good advice) gives away a practical enough idea that you can apply it to your own experiences, but it must be abstract enough that you come up with the application yourself. Consider very wise advice: it's never straightforward "do-exactly-like-this" communication, you usually need to decipher the message, to construct your own meaning of it. It's more impactful to make you realize what to do yourself than to tell you exactly what to do.

In advertising, there is an incentive to communicate the most with the least amount of information or time. The bigger the portion of the message that can be outsourced to context, the better for the advertiser, since the receiver then does more self-assembly in their head. This leads to a sort of mental IKEA effect: people enjoy your message more when it isn't pushed into their heads but co-created by themselves.

"How could I not believe their knives are sharp - I practically came up with that thought!"

Generally, the smaller the ratio of what is communicated to what is understood, the better the aphorism, advice, joke, tagline or ad.

Learn more:

  • Wikipedia
  • Inception (Planting "seeds" into someone's head. Plus, it's a good movie)

Category thinking

You're engaging in category thinking when, instead of trying to solve one problem, you try to solve a whole category of problems.

For example, in computer security, you can look at risks within individual systems, or you can look at risks within an entire category of systems, for example, every system that uses Log4j. You can consider what happens if one autonomous vehicle gets hacked, or what happens if all of them are hacked simultaneously because of a security issue affecting every autonomous vehicle.

In your personal life, you could tackle all of your health problems individually, maybe buying medicine for one problem and changing your diet to tackle another problem. Or you could address the entire category by minimizing evolutionary mismatch, to live more like humans have evolved to live.

Category thinking requires a mindset shift, to abstract one level upwards. You need a different set of solutions for an individual problem vs the whole category; it's a different beast altogether to help one homeless person than to help all of them.

Learn more:

Evolutionary mismatch

Evolutionary traits generally change slower than the environment. Therefore, the traits that were once helpful can now be harmful because the environment is different. When this is the case, we talk of evolutionary mismatch.

While evolutionary mismatch can and does happen in animals and plants, humans are a prime example because our environment changes so rapidly.

  • We evolved to value sweet and fatty foods. But now this trait is a disadvantage because sugary and fatty foods aren't scarce anymore - you can get them delivered to your door with one button.
  • Our bodies evolved to be alert under stress, such as when hunting. But now we have chronic stress, and the constant alertness leads to burnout and health issues.
  • We evolved to value newness: for example, a new female/male is a new chance at spreading your genes, and new information in the form of gossip is valuable in choosing whom to trust or mate with. But now we can get 10,000x the amount of newness and stimuli over the internet (social media, news, porn...), leading to addiction and mental health issues.

It is important to understand that we essentially have a hunter-gatherer brain and body in a modern society, so there is a huge (and rapidly increasing) mismatch. Obesity, most health issues, most mental health issues, addiction, lack of meaning, loneliness... many ills of body or mind can be due to evolutionary mismatch.

This is not to say that everything was perfect when we lived in huts and that everything is wrong now, but that we evolved in the former environment and therefore it is only logical that issues emerge when we live in environments we didn't evolve to live in. That's the idea behind evolutionary mismatch.

So if you have issues, that's not necessarily your fault. It's like blaming a penguin for not thriving in a desert.

Learn more:

Supernormal stimuli

Consider these findings from Niko Tinbergen, the researcher who coined the term supernormal stimulus:

  • Songbirds lay small, light blue eggs. An exaggerated, supernormal version of those eggs would be a huge, bright blue dummy egg. When shown such a dummy, songbirds abandon their real eggs and sit on top of the dummy instead; they choose the artificial over the real because the artificial is more stimulating.
  • There's more: the songbirds would feed dummy chicks over their real chicks, if the dummies had wider and redder mouths. And the chicks themselves would rather beg food from a dummy mother than their real mother, if the dummy had a more stimulating beak.

Supernormal stimuli are exaggerated versions of the things we evolved to desire. They hijack the weak spots in our brains by supplying so much of what we desire that we don't want the boring, real thing anymore. Thus, we act in evolutionarily harmful ways - much like the bird who abandons their eggs. When you can eat pizza, who wants broccoli? When you can have 500 porn stars, who wants a "normal" mate? Why bother with real life when the game world offers much more fun for much less effort?

While there are physical superstimuli, like drugs and junk food, we must be especially careful with digital superstimuli, because there is virtually no limit to how much stimulation can be pumped into you. A pizza can include only so much fat and salt and other stuff that makes your brain go brrr... But there's no limit to how stimulating a video, game or virtual world can be. Nature has set no ceilings; it has only determined what we are stimulated by, so we always want more, at all costs.

We're already approaching a point with VR porn and "teledildonics" where one may rather mate with the technology than with a real person. Some would much rather move a video game character than their real bodies. At some point, we'll maybe create artificial children (perhaps in the metaverse) who simply are much cuter than real babies, and much less trouble. As fewer and fewer people have babies - intentionally or unintentionally - we realize that we aren't much smarter than the birds sitting on huge blue balls instead of their own eggs. As I write in a longer post, superstimuli may be the thing that leads to human extinction.

Learn more:

Minority rule

Nassim Taleb popularized the idea of minority rule, where change is driven not by the majority but instead by a passionate minority. If something a minority insists on is fine with the majority, that generally becomes the norm.

For example, imagine you're a baker and want to sell buns. It doesn't make sense for you to make two sets of buns - one lactose-free and one "normal" - because everyone is fine eating lactose-free buns, but not everyone can eat your buns that include lactose. It's just easier to make everything lactose-free. So almost all pastries you find are lactose-free (at least in Finland).

It's key to notice the asymmetry at play: the majority must be quite indifferent to the change while the minority must be quite inflexible about it.

The idea of minority rule can be seen everywhere.

  • Why we have pronouns next to people's names on social platforms and conference badges is not because the majority insisted on it, but because a very passionate minority did.
  • Lunch in many cafés and restaurants is increasingly plant-based because vegetarians will take their money elsewhere otherwise, and non-vegetarians don't generally mind if their lunch has fewer meat options, or none at all on some days.
  • In corporate life, you can usually change something if you're passionate enough about it. Most people don't really care that much, so if it's a deal-breaker for you, they'll let you change it (especially if they don't need to work on it).

Learn more:

Convergent evolution

Convergent evolution occurs when different species independently evolve similar features. For example, bats, flies and many birds have each, independently of one another, evolved wings and the ability to fly.

In much the same way, human civilizations that had never been in contact with each other independently invented writing and the idea of stacking big blocks on top of each other in a pyramidic fashion. (No, there's no conspiracy - just convergent evolution.)

If you see the same pattern in independent groups, there's probably a reason for that pattern. If most flying animals developed wings, that's a good sign that wings are one of the best structures for flying. If most swimming animals like fish and dolphins have a similar, streamlined shape, that's probably the shape that works best in water.

Similarly, if you see a pattern repeating itself across disciplines, that's a sign that there's something to that pattern. Consider this tweet from Naval:

Convergent evolution is a validation mechanism.

Learn more:

Compulsion loop

A compulsion loop is a sequence of activities that keeps you repeating the sequence: the loop rewards each completion with a bit of dopamine, keeping you hooked.

Gaming example from Wikipedia

The point of most games (and gamified systems) is to keep you on this loop for as long as possible, before you realize what they are doing.

While this kind of loop is most easily noticed in games, a similar loop underlies nearly all compulsive behavior that isn't due to physical causes (like drugs or illness).

  • Social media loop: post things, get dopamine, repeat.
  • Consumerist loop: buy things, get dopamine, repeat.
  • Eating loop: eat junk, get dopamine, repeat.

Learn more:

Kayfabe

Kayfabe is a term most closely associated with professional show wrestling. It's a code word for fake; an agreement for the insiders not to admit to the outsiders that it's fake.

Eric Weinstein has popularized the term in intellectual spheres and compared politics to pro wrestling, in the sense that maintaining perceptions seems to be the main objective. For example, consider a President who wants to get re-elected or win back public acceptance. They want to highlight all the good they've done for the country. The President could claim responsibility for something positive (like economic growth) even when they've had no part in it, or they could outright lie that the country/economy is doing well even when it isn't. Reality doesn't matter; perception does.

One of the reasons kayfabe exists is our desire to believe the fakery. When we expect too much of reality, we create demand for lies; it simply becomes too difficult to meet our expectations in reality, so they are met merely in perception. We don't care about the difference because we are desperate to get our expectations met. The average citizen wants their country to succeed, and the average politician wants to stay on the citizens' good side - so even if the country is doing poorly, it's preferred for both parties to perpetuate a common fantasy.

Learn more:

Ostrich effect

The ostrich effect happens when you deliberately try to avoid negative information or thoughts. A form of "if I don't think about it, it can't hurt me" thinking.

Wikipedia says that "the name comes from the common (but false) legend that ostriches bury their heads in the sand to avoid danger."

Examples:

  • You aren't happy right now, but thinking about the meaning of your life is stressful and can make you regret your choices. So you decide not to think about it, believing it will solve the problem.
  • You can feel your relationship going downhill, but it's scary to talk about it. So you try your best to ignore the bad signs.
  • You've been spending a bit generously today, so you avoid looking at your bank account for the rest of the day.

Learn more:

Floodgate effect

A floodgate effect occurs when you permit a small thing, which then leads to the permission of many more things.

The name comes from the way a floodgate works: if you open the gate even a little bit, water will flow in until both sides of the gate are in equilibrium. Even the tiniest permission will lead to total permission.

Examples of the floodgate effect:

  • If you do a favor for a certain type of person, they'll keep asking for more favors. In Finnish, we say "I gave my pinky, they took the whole arm" (I believe the English equivalent is "give them an inch and they'll take a mile").
  • A drop of alcohol can lead a recovering addict to total relapse.
  • If you allow one citizen to burn the American flag, you'll have to allow all of them to do it.

As you can see from the examples, the small permission creates a precedent, which is then exploited/leveraged repeatedly, leading to a flood of such events.

Learn more:

Bothsidesism

Bothsidesism occurs when the media presents an issue as being more both-sided, contested or balanced than it really is. For example, there is overwhelming evidence that climate change is real, even though some (very few) scientists disagree. By giving equal voice to scientists on both sides, the media creates an illusion that the issue is contested or a matter of perspective, even though it isn't.

Similarly, some media outlets may blow the minority opinion out of proportion, making it seem like there are two equally strong sides, even though this isn't the case.

Funnily, bothsidesism is a bias that occurs when the media tries to minimize bias (by presenting both sides of the argument). The average media consumer may feel more informed (since the issue seems more nuanced and complicated), but they are actually misinformed.

We are often told that it is good practice to include both sides of the argument and consider all perspectives. However, in some issues, it's better to not give equal weight to both sides or to "come to a compromise". One can consider all arguments, without giving equal importance or merit to each.

Learn more:

Paperclip maximizer

The paperclip maximizer is a thought experiment that goes something like this:

"Imagine you task an AI with creating as many paperclips as possible. It could decide to turn all matter in the universe into paperclips, including humans. Since an AI may not value human life, and it only cares about maximizing paperclips, the risk to humans is great."

The paperclip maximizer is an example of instrumental convergence, the idea that an AI could pursue a harmless goal in ways that are harmful from the human point of view. Harnessing a true AI to simple, seemingly innocent tasks could pose existential risk if we don't know how to safely make a machine value human life.

Learn more:

Perverse incentive

An incentive is perverse if it leads to unintended and undesirable results. The solution (incentive) makes the problem worse.

The Cobra Effect is the most famous example. From Wikipedia:

"The British government, concerned about the number of venomous cobras in Delhi, offered a bounty for every dead cobra. Initially, this was a successful strategy; large numbers of snakes were killed for the reward. Eventually, however, enterprising people began to breed cobras for the income. When the government became aware of this, the reward program was scrapped. When cobra breeders set their now-worthless snakes free, the wild cobra population further increased."

People will seek to satisfy the measurable target to get the incentive, regardless of the method. Thus, a poorly designed incentive can lead to the opposite of the intended effect. There are various examples in sales representatives' bonuses, journalism, alcohol/drug prohibition, all kinds of buyback programs...

Learn more:

Redundancy in biology

Redundancy protects against uncertainty and shocks. Unsurprisingly, then, you'll see redundancy all over biology, since the organisms we observe are the ones that survived uncertainty and shocks. Here are just a few ways nature creates redundancy:

  • Degeneracy
  • Extra pairs
  • Functional equivalence

We talk of degeneracy when two parts perform different functions under normal conditions but are capable of doing the same function when needed. So degeneracy is nature's way of ensuring the failure of one part doesn't doom the entire entity.

  • If one part of your brain is damaged, another part of your brain could take over the function of the damaged part. (See neuroplasticity)

We have an extra pair of some organs, which is a form of redundancy: if there is a loss of one eye, we still survive. While degeneracy refers to parts that perform different functions in normal conditions, an extra pair performs the same function all the time.

  • There are two kidneys, two ears, two eyes, two lungs... but we can survive with one.

Functional equivalence is like an extra pair, but at the ecological scale. Multiple species share the same function, so the loss or weakening of one species won't jeopardize the entire ecosystem.

I find it telling that nature has not just one, but multiple levels of redundancy. If we are to survive long-term, we ought to add redundancy to our lives, to copy nature (biomimicry is the term).

  • In engineering, there's a similar concept of redundancy: if you have a critical function, it's good practice to have a back-up component to perform that function in case the main component breaks.
  • In teamwork, it may be a good idea to teach a critical function to multiple people, so if the main employee becomes absent, another person can take over their function.
  • With money / time management, it's a good idea to have slack in the system. If you have all of your money in assets that could go to 0, you are one shock away from being bankrupt.

Learn more:

Biological bet hedging

Biological bet hedging is nature's method of diversification.

  • A plant could try to go "all in" this year and get all of its seeds to germinate. But if this year's environment is not optimal, all of the plant's seeds could die. Instead of this risky endeavor, the plant could have half of its seeds germinate this year, and half germinate next year, thus increasing chances of survival.
  • A bird may produce eggs of different sizes, each size being optimal for one potential environment of the offspring. This means that some of the offspring will be at a disadvantage, but as a collective, it is unlikely none will survive.

The purpose of bet hedging isn't to maximize short-term success, but long-term survival. If an individual doesn't hedge their bets, one unexpected and disastrous event (black swan event) could wipe out all of its offspring, potentially ruining chances of survival for that individual's genes.

If you're into investing, you probably see parallels here.
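A toy simulation makes the asymmetry concrete. All numbers here - the seed count, the 20% chance of a bad year, the doubling rate - are made up purely for illustration:

```python
import random

def lineage_survives(hedge, rng, years=50, p_bad=0.2):
    """Toy model: a lineage starts with 100 seeds. In a 'bad year' every
    germinated seed dies; dormant seeds carry over. Survivors reproduce."""
    seeds = 100
    for _ in range(years):
        germinating = (seeds + 1) // 2 if hedge else seeds
        dormant = seeds - germinating
        survivors = 0 if rng.random() < p_bad else germinating
        seeds = min(survivors * 2 + dormant, 1000)  # reproduction, capped
        if seeds == 0:
            return False
    return True

rng = random.Random(0)
trials = 2000
for hedge in (False, True):
    rate = sum(lineage_survives(hedge, rng) for _ in range(trials)) / trials
    print(f"hedge={hedge}: {rate:.0%} of lineages survive 50 years")
```

With these assumed odds, the all-in lineages are almost always wiped out within 50 years, while the hedged ones almost always survive - even though the hedgers grow more slowly in any single good year.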

Learn more:

Crab mentality

Someone exhibits "crab mentality" if they try to drag down people who are doing better than they are. A form of "If I can't have it, neither should you" thinking.

Wikipedia says that "the metaphor is derived from a pattern of behavior noted in crabs when they are trapped in a bucket. While any one crab could easily escape, its efforts will be undermined by others, ensuring the group's collective demise."

Live crabs in a bucket

For example, if one person from the family succeeds, the other siblings may become envious and try to undermine their success. A similar effect may occur in friend groups, if one friend gets a new job and starts earning twice as much as the other friends. Or on a societal level - some people don't want humanity to flourish because they themselves aren't where they want to be!

Learn more:

Cargo cult

"Cargo cults are religious practices that have appeared in many traditional tribal societies in the wake of interaction with technologically advanced cultures. They focus on obtaining the material wealth (the "cargo") of the advanced culture by imitating the actions they believe cause the appearance of cargo: by building landing strips, mock aircraft, mock radios, and the like." (From Wikipedia)

While this may sound ridiculous, we do the same thing. We imitate the morning routines of a billionaire or the writing process of the nearest writing guru, thinking that will bring us the same results. We read the books Warren Buffett recommends, hoping we'd become like him in the process. Or we retitle the project manager as "scrum master" and the weekly meeting as a "sprint", thinking that will give us the results of agile project management.

There's Cargo cult programming, Cargo cult science, Cargo cult startups and, most likely, an equivalent to whatever field you can think of.

Whenever we copy the obvious, surface-level behaviors but not the deeper ones that actually produce the results, the Cargo cult metaphor applies.

Learn more:

Zugzwang

Zugzwang is a situation in board games where you must make a move, and any move you can make will worsen your position. For example, in checkers, you must capture the opponent's piece when you have an opportunity to do so, and a clever opponent can thus lay a trap that you must fall into.

Situations in real life (outside of board games) resemble a zugzwang when anything you do will make the situation worse, and you don't have a choice not to make a choice.

For example, suppose you've lied to your employer that you know Excel, to get the job. Now they want you to complete a long Excel task within the week. Either you tell them you've lied all along, or you learn Excel and work day-and-night to complete the task on time - probably still producing sub-standard work. You're in a zugzwang.

A more philosophical example: imagine you hate capitalism. You can either live contrary to your values or pay a great personal cost trying to exit the system - and you can never exit it completely. Any move you make isn't ideal, and you can't not play.

Learn more:

Victim playing

Victim playing occurs when you try to make yourself look like a victim, perhaps to gain sympathy, avoid hard work, manipulate others or protect your self-esteem. Usually you do this by exaggerating a mild obstacle or by faking an obstacle altogether.

There's huge support for sentiments like "life is hard" and "adulting is hard" because it's easier to play a victim than to do something about the problem. It's easier to say "I'd love to catch up but I'm sooo busy with work" than to arrange your matters in a way that you'd have time to meet.

It seems the pressure to succeed is so strong that some people secretly hope for a health issue or injustice to explain why they aren't meeting those unrealistic expectations. In absence of a "better" reason, they'll procrastinate so at least they'll be a victim of time - they couldn't have possibly succeeded, not with that schedule.

Certain people tend to believe things that take away personal responsibility for their lives. “Oh don’t bother learning to code, AI will take up all programming jobs soon anyways”, “don’t bother with your retirement plan, Collapse is coming”, “you can’t remove microplastics from your life, no use trying to limit them in your home”...

Victim playing is, in many ways, the default option because very few people call you out on it. You're the "victim" after all; not many dare question that, lest they be seen as a victim-blamer. Thus, this victim mentality (1) becomes more ingrained in a person, because it is a strategy that consistently works, and (2) becomes more accepted in society, infecting more people, because you can't call people out on it.

Learn more:

Theory of Mind

Did you know you can read people's minds? This mind-reading ability, otherwise known as theory of mind, is something we develop at around 1-5 years old (depending on your definition).

Imagine you, a friend and a 2-year-old are eating Pringles. The friend goes away to another room. You decide to empty all of the Pringles from the tube to the trash and fill the tube with pencils. Once the operation is done, your friend comes back to the room. You ask the 2-year-old: "What does my friend think is in the tube?"

A child without theory of mind would say "pencils". A child with theory of mind would say "Pringles", knowing that your friend believes so even if reality is different.

A complete theory of mind is important for empathy, anticipating other people's reactions, understanding beliefs, communication and so much more. If you think about it, you're constantly mind-reading people, even without being aware of it.

There are various mind-reading tasks we engage in, and we usually learn them in this order:

  • Understanding that people want things (for example, a baby wants a toy when they reach for it)
  • Understanding that people can think things differently from me (for example, my favorite color can be red and theirs green)
  • Understanding that people can behave differently based on their experience (for example, if someone wasn't shown that there's candy in a box, they wouldn't know where to look if someone asked them to fetch candy)
  • Understanding that people can have false beliefs (the Pringles example above)
  • Understanding that people can fake or hide their emotions (for example, mommy can be sad even if she isn't crying)

Learn more:

Vestigiality

Vestigial structures are those that used to serve a purpose, but through the process of evolution, no longer do. For example, an ostrich and an emu have wings, but they are no longer used for flying; the wings are vestigial. So are human wisdom teeth.

One might think of vestigiality in matters beyond anatomy: 

  • As a startup grows, some processes or structures become vestigial.
  • As you grow mentally, some of your learned habits and beliefs become vestigial.

Many trauma victims have built up coping and protection mechanisms that used to get them through tough times, but once they’re through, they haven’t learnt to let go of them. Many overachievers have a hard time slowing down and enjoying life because constant overachieving is what got them to where they are now! 

It pays to occasionally look at your life and think, does this serve a purpose anymore?

Learn more:

Hormesis

Hormesis, in biological processes, is a phenomenon where a low dose of x produces the opposite effect of a high dose of x.

For example, low doses of stress energize you and increase your performance. Very high doses of stress paralyze you and decrease performance. Low stress on your muscles (working out 3 times a week) strengthens your muscles, while very high stress (working out 3 times a day) leads to injuries.

One may apply the same idea outside of biological processes: a low dose of your hobby (5h a week) may make you happy, while turning that hobby into a job (40h a week) may make you miserable. Seeing your friends once a week produces a different result than living with them 24/7.

Hormesis dose-response graph (image from Wikipedia)

The dose-response relationship, a related term, describes how something harmless can become harmful at a high enough dose or exposure. To be more precise, there aren't necessarily "harmful" and "harmless" things, just harmful and harmless doses of things. It is the dose or exposure that matters.

Learn more:

Russell Conjugation

Consider this classic example:

"I have reconsidered the matter, you have changed your mind, he has gone back on his word."

The core idea behind each of these statements is the same but the emotional connotations of the specific word choice differ.

With Russell Conjugation, you can manipulate someone's opinion without falsifying facts; you can steer their perspective while still remaining "objective". Once you understand this concept, you see the news doing it all the time.

The reason Russell Conjugation (think of it as emotional loading) can exist is because different words or phrases can factually refer to the same thing, yet be emotionally different.

Learn more:

Gell-Mann amnesia

When you know marketing, you understand most marketing advice online is bullshit. When you’re an expert in statistics, you see news gets the numbers twisted. If you're a professional football coach, it pains you to hear people at the bar talk about football - such are the inaccuracies in their discussion.

Then you read the next thing in the newspaper or Reddit or Twitter - it’s a topic you’re not an expert in - and you blindly accept the information as fact.

Gell-Mann amnesia is a tendency to forget how unreliable the news - or any information - is. We're reminded of the inaccuracy whenever there's a topic we know a lot about, yet we're not critical enough when we face a less familiar topic.

Learn more:

Overton window

The Overton window describes the range of topics or opinions that are acceptable to talk about in public.

What we deem “controversial” or "extreme" isn't actually extreme, it's just one end of the Overton window. Truly extreme opinions aren't shared publicly, for fear of social condemnation.

Overton window diagram (image from Wikipedia)

If you understand the Overton window, you understand that there exist opinions and policies that aren't talked about in the mainstream. What you hear isn't all there is.

The term originally concerned public policy but it's applicable in all matters of free speech.

Learn more:

Convexity (antifragility)

If there is low or limited downside and high or unlimited upside, the situation is convex.

For example, suppose you're looking for a job. If you send a cold email to a company, asking for employment, there is asymmetry in outcomes: low downside (you risk maybe 1 hour and a bit of ego) but high upside (a job).

If you promote your drawing on Reddit, you have limited downside (maybe 5 minutes of creating the post) but virtually unlimited upside (millions of views, if it goes viral).

In a convex situation, you win by repeatedly exposing yourself to randomness because any randomness is more likely to benefit you than harm you. Take many shots because you have little to lose but everything to gain. For this reason, convexity is also called antifragility: not only are you not harmed by randomness, you are actively benefiting from it.
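A quick simulation of this asymmetry. The numbers - a cost of 1 unit per attempt, a 1% chance of a 500-unit payoff - are hypothetical, chosen only to illustrate the shape of a convex bet:

```python
import random

def convex_shot(rng):
    # hypothetical payoff: each attempt costs 1 unit of effort,
    # and with 1% probability returns 500 units
    return -1 + (500 if rng.random() < 0.01 else 0)

rng = random.Random(42)
trials = 20_000
for n_shots in (1, 10, 100):
    avg = sum(sum(convex_shot(rng) for _ in range(n_shots))
              for _ in range(trials)) / trials
    print(f"{n_shots:>3} shots: worst case {-n_shots:>4}, average {avg:>6.1f}")
```

Each extra shot caps your additional loss at one more unit while growing your expected gain - which is exactly why repeated exposure to randomness pays off here.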

Learn more:

Ergodicity

Ergodicity is a concept that helps us differentiate between what happens to the group (ensemble probability) and what happens to an individual (time probability). On average, the stock market can yield 8% annual returns - this is what happens to the group. But an individual could go bankrupt - so they get -100%.

In ergodic systems, all possible states are visited. So if an unlimited number of people participated in the lottery, each with $1, we could expect there to be some who visit the minimum (losing $1), some who visit the maximum (jackpot) and everything in between. Here it is fine to think of ensemble probability; if the group is big enough, we can expect the group to visit all possible states. If something is possible, it will happen, given enough exposure.

Unfortunately, the lottery and stock market are non-ergodic for the individual. Once an individual goes broke, they can’t play anymore. They won’t visit every possible state. Startups as a whole may be ergodic (some unicorns, many more busts), but for the individual, startups are non-ergodic. You can found 20 and none of them may succeed.

Don't confuse ergodic and non-ergodic systems with each other, and don't assume that every individual will get the group's average return. It is possible for everything to look great on average, yet be disastrous for the individual.
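A sketch of the difference, using a classic toy bet (assumed numbers: wealth multiplied by 1.5 on heads, 0.6 on tails - an attractive-looking +5% expected value per round):

```python
import random

def play(rounds, rng):
    """One player's wealth after repeatedly betting everything on a coin flip
    that multiplies wealth by 1.5 (heads) or 0.6 (tails)."""
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if rng.random() < 0.5 else 0.6
    return wealth

rng = random.Random(0)
players = sorted(play(100, rng) for _ in range(10_000))
mean = sum(players) / len(players)
median = players[len(players) // 2]
print(f"ensemble mean:  {mean:.3f}")    # propped up by a handful of huge winners
print(f"median player:  {median:.6f}")  # the typical individual is nearly broke
```

The ensemble average grows (×1.05 per round), but the time-average growth rate is sqrt(1.5 × 0.6) ≈ 0.95 per round, so almost every individual path decays toward zero. The system is non-ergodic: the group's average tells you almost nothing about a single player's fate.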

Learn more:


All 60+ free concepts

Anna Karenina principle

"Happy families are all alike; every unhappy family is unhappy in its own way", reads the infamous opening line in Tolstoy's Anna Karenina. It reveals a general pattern in life, where an undertaking can fail in a multitude of ways, yet succeed only when everything goes right. This is called the Anna Karenina principle.

Given this asymmetry in outcomes, you could expect most endeavours that follow the Anna Karenina principle to fail. It is simply statistically more likely. But which things follow the principle?

Well, we know that things that follow the Pareto principle don't follow the Anna Karenina principle. If the 20% can contribute 80% of the results, then you can fail in almost everything but still succeed overall. If you cannot identify any factors that have outsized impact on the end result - if the result truly is a sum of everything - then the Anna Karenina principle might apply.

  • Getting rich is probably not an Anna Karenina-type endeavor; randomness plays too big a part. But building a product worthy of reverence - that can only happen when everything goes right. So if you aim just to get money, apply the Pareto principle: shoot many shots, see what sticks. But if you aim at the latter, the Anna Karenina principle is the way to go.
  • To feel healthy in the short term, Pareto applies: any improvement in sleep, exercise and nutrition will most likely do it. But to feel healthy in your 80s or 90s, that happens only if everything goes right (including having luck on your side).

I think it's really important to distinguish which parts of your life you can 80/20 your way through, and where you really need to obsess over every detail.
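Back-of-the-envelope numbers (entirely hypothetical) show why the asymmetry matters:

```python
# Anna Karenina -type endeavour: success requires ALL 20 factors to go right,
# each independently with 90% odds (numbers are made up for illustration)
p_all = 0.9 ** 20
print(f"everything must go right: {p_all:.1%}")   # ≈ 12.2%

# Pareto-type endeavour: ANY one of 5 independent big bets (30% each) is enough
p_any = 1 - (1 - 0.3) ** 5
print(f"one of many shots lands:  {p_any:.1%}")   # ≈ 83.2%
```

Even with each individual factor at 90%, demanding that everything go right makes failure the default outcome.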

Learn more:

  • Wikipedia
  • Similar observation: Idios kosmos ("The waking have one common world, but the sleeping turn aside each into a world of his own.").

Tangential goal trap

"The tangential goal trap: You can tell our relationship with learning is perverse when the main pursuit isn’t the action itself, but information about the action. This is very apparent in entrepreneurship: one pursues books, courses and content, rather than being entrepreneurial. There are many studying humanitarian aid for years, without actually aiding humanity – the main goal seems to be more knowledge about the field." (From my post: Information as a replacement for intelligence)

For most goals, there exists a related but distinctly different - and easier! - goal, trying to lure you away from the real thing. If we are not careful, we may fall into this tangential goal trap, thinking we're making great progress on our goals. Indeed, you can usually tell you've been caught by the way you describe progress:

"I haven't written much yet but I set up my knowledge management system and signed up for a writing course."

"We don't have any paying customers yet but we're constantly adding features. And we just launched the company newsletter!"

Pursue the real goal, not some easy derivative of it!

Learn more:

  • Paul Graham says: "When startups are doing well, their investor updates are short and full of numbers. When they're not, their updates are long and mostly words."
  • Related concept: Coastline paradox

Load-bearing beliefs

In construction, a load-bearing structure is one you cannot remove, or the whole building falls down. In cognition, a load-bearing belief is one you cannot shake, or a person's whole worldview falls down.

Load-bearing beliefs are deeply ingrained and seldom reassessed. But if something does manage to topple such a belief, that's a moment when a person's life trajectory changes:

  • A hard-working, career-oriented person might hold a belief that the biggest impact they can have on the world is through work. Perhaps they sacrifice their personal health and relationships for it. But then something radical might happen - like getting fired or realizing their work is net negative to society - that makes them re-evaluate their entire life (and take a year-long "spiritual journey" to Asia).
  • The mid-life crisis is likely a toppling-down of the "I'm immortal" belief, a realization that you're going to die one day. When you're young, you tend to believe you have time for everything you want to do, some time in the future. But when this belief is finally disproven, a crisis ensues.
  • There are load-bearing beliefs in matters of religion, namely "I believe" and "There is no God". When one is at their darkest hour, they might give up their beliefs and become anew - the stereotypical person who 'finds the light' or believes God has abandoned them.

When a person's load-bearing belief is struck down, they either rebuild themselves into something new, or they cognitive-dissonance their way through, as letting go of the belief would ruin them mentally.

Learn more:


Prime number maze

Rats in a maze can be taught to turn left every other time, or every third time, or on even numbers, etc. But you cannot train them to turn based on prime numbers. It's too abstract for them; the concept of prime numbers cannot be taught to a rat.

There can be a very clear, logical, simple rule to get out of the maze, and with just a bit more cognitive ability, the rat could follow it - or at least be taught to follow it. But it is doomed to stay stuck in the maze, for this particular idea is outside its conceptual range.

Likewise, what are the very clear patterns that keep us humans trapped in a maze? How could we acquire just a bit more awareness, a bit more perspective to see the simple escape from the mazes we are in?

Big revelations seem simple in hindsight, once you have developed the cognitive ability, once you have cracked the pattern. Much of life advice passed on by the elderly seems very simple to them, but often this advice is not followed by the young, for they do not see the pattern the elderly do - they only receive the words. Only via experience - via first-hand observation of the pattern - can much be learnt. But can the knowledge, the awareness nevertheless be sped up? Can you twist and turn, explore the maze, look through it with new eyes and metaphors, until you crack the pattern? And once you do, can you explain it with such simplicity that others become aware of the pattern, too?

“Once you see the boundaries of your environment, they are no longer the boundaries of your environment.”

Learn more:


Doorman fallacy

Rory Sutherland, one of the great thinkers on behavioral economics and consumer psychology, explains the "doorman fallacy" as a seemingly reasonable cost-saving strategy that ultimately fails due to a disregard of the unmeasurable.

If you understand a doorman simply as a way to open the door, of course you can yield cost-savings by replacing him with an automated system. But if you understand a doorman as a way to elevate the status of the hotel & create a better guest experience (hail taxis, greet guests, carry bags...), then you realize the full role of the doorman cannot be replaced with an automated system.

Another example: consider a big tech company which, amid financial pressure, decides to cut spending on free lunches, bus transportation to work, on-site massages and other benefits. On paper, the company can quickly generate hefty cost savings, but it might fall victim to the doorman fallacy. The short-term savings come at the expense of a brand as an employer who provides unrivaled employee perks and places employee satisfaction over investor satisfaction, and the long-term damage to attracting and retaining talent may cost the company a lot more in the years to come.

You can think of the doorman fallacy as an extension or example of Chesterton's fence: getting rid of the fence (doorman) without fully understanding why it was there in the first place.

Learn more:

  • Signaling theory
  • Book: Alchemy by Rory Sutherland

Levinthal's paradox

Levinthal's paradox is that if we tried to predict how a protein folds by searching through the decillions of possible conformations, it would take an astronomically long time. Yet proteins in nature fold in seconds or less. What seems impossible to the human, paradoxically, happens all the time in nature, without us understanding how it is possible.
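The classic back-of-the-envelope version of the paradox, using the commonly assumed round numbers (~100 residues, ~3 conformations per residue, sampled at an optimistic 10^13 conformations per second):

```python
conformations = 3 ** 100          # ≈ 5e47 possible folds
seconds = conformations / 1e13    # exhaustive search at 10^13 tries/second
years = seconds / 3.15e7          # seconds per year
print(f"{years:.1e} years")       # ≈ 1.6e27 years; the universe is ~1.4e10 years old
```

A real protein, of course, doesn't search exhaustively - which is exactly the point of the paradox.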

The paradox is a reminder of nature's immense wisdom and our lack thereof in comparison. Nature finds a way, even if we cannot see how. Often, then, the default action for us should be inaction - to let nature do its thing with minimal human intervention. (Related: iatrogenesis)

One can try to chase longevity with a diet of pills, an army of scientists tracking every possible thing about their body, an insane daily routine… Only to be outlived by a random rural person just trying to build stuff with their own hands and eating what they grow or hunt - as nature intended.

Even with constantly improving technology and theory, we ought to be humble and appreciate that nature might be smarter still.

(Note: Since the paradox was proposed, AI has advanced our understanding of protein folding dramatically.)

Learn more:


Coastline paradox

You'd think it would be easy to define the length of a coastline of any given country. Just look at the map and measure the length of the coast, right?

Turns out, it is notoriously difficult to measure this. The closer you look, the longer the coastline.

Image: The coastline paradox (Sketchplanations)

At what point do you stop zooming in? The problem is fractal!
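The Koch curve, a mathematical stand-in for a coastline, shows the effect: every time you zoom in (shrink the ruler), the measured length grows by a factor of 4/3.

```python
# each refinement replaces every segment with 4 segments, each 1/3 as long
for depth in range(6):
    ruler = (1 / 3) ** depth            # measurement resolution
    length = 4 ** depth * ruler         # total length = (4/3) ** depth
    print(f"ruler {ruler:.4f} -> length {length:.3f}")
```

There is no limit this converges to; the "true" length depends entirely on how closely you look.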

Many problems in personal life or business possess a similar quality. Seemingly straightforward at first, but increasing in complexity the more you look at it. For example, say you want to be healthier. How should you exercise? How often and when? Should you take supplements or vitamins? What is best for your unique body composition or genetic tendencies? Or if you want to write content online, you can research all the keywords, best website builders, headline formulas, writing advice from gurus, note-taking approaches and tools... until the point where you spend 5% of your time writing and 95% zooming in on the problem!

Suddenly you're overwhelmed and won't even start.

The solution? Just go for a decent ballpark - the 80% solution - and stop overanalyzing. In the health example, just sleep more, move more, eat better, and forget about the small stuff. Don't get sucked into the fractal.

Learn more:


Quantum Zeno effect

The Quantum Zeno effect is a quantum physics application of one of the paradoxes proposed by the ancient Greek philosopher Zeno: if you observe an object in motion at any given instant, it doesn't appear to be moving. And since time is made of these infinitely short snapshots of idleness, there can be no motion.

Of course, motion exists and this is philosophical nonsense, but thanks to it, we have a cool name for the Quantum Zeno effect, which states that you can slow down a quantum system's evolution by measuring it more frequently.

The same idea can be applied more widely:

  • The more often you look at your investment portfolio or any analytics, the less seems to change. If you come back a month or a year later, things have changed - so what's the purpose of looking daily?
  • You cannot see the effects of your exercise programme in the mirror from day to day, but you can see them against a photo taken 5 months ago.
  • The more often you check on progress with your team, the more you interrupt their work with reports and check-in meetings, thus slowing down progress.

Learn more:


Enantiodromia

Enantiodromia is an idea from Carl Jung that things tend to change to their opposites, that there is a cyclical nature between things.

  • "Hard times create strong men, strong men create good times, good times create weak men, and weak men create hard times"
  • The more authoritarian a government, the more it gives rise to a rebellious movement
  • We use antibiotics to kill bacteria, which leads to the extreme, antibiotic-resistant bacteria
  • When you're dying of cold, you get a feeling of warmth; when you're really hot (like in a sauna), you might suddenly feel cold as your body adapts to the heat
  • If there was an extreme of something in your childhood, that might give rise to the opposite in adulthood
  • Cultural trends: Girlbossing vs trad-wifing, the "feminine" man vs "masculine" man ideals, an excess of rationality and modernism leads to wistful thoughts about the old ways...
  • “The empire, long divided, must unite; long united, must divide”

When there's a dominant trend, there is always a counter-trend. A mainstream opinion and the contrarian. Shift and countershift.

Learn more:

  • Wikipedia
  • Horseshoe theory: the political idea that the far-right and far-left resemble each other, rather than being polar opposites

Rohe's theorem

Rohe's theorem: "Designers of systems tend to design ways for themselves to bypass the system."

Or more generally: those who create or advance a system seek to prevent themselves from being exposed to it, perhaps as they have intimate knowledge of the faults or the predatory nature of the system.

Examples:

  • Government officials can exempt themselves from certain rules that other citizens must abide by
  • Big tech employees who block or limit the social media usage of their kids / want to move “off the grid”
  • (Fast food) restaurant workers who rarely eat out
  • Doctors who prescribe medicine but won't let their own kids take meds
  • Teachers who homeschool their children
  • Insurance company workers who don't have insurance themselves

In some cases, the following sayings relate: "never get high on your own product" and "Those who say the system works work for the system."

Learn more:

  • Book: The Systems Bible by John Gall

Ostranenie

Ostranenie (also known as defamiliarization) is about presenting familiar things in unfamiliar ways, in order to generate new perspectives or insights. One could argue that most art aims at this.

Stark example: a movie where the roles of men and women are swapped, so the viewer becomes more aware of the gender roles of the present.

Subtle example: any argument or idea that is presented with the help of a story or metaphor. A speech that just says "work hard, overcome obstacles" is boring and uninspiring, as it is nothing new, but a movie about a successful person who worked hard and overcame difficulties is inspiring, as it brings a new frame to the idea.

I suspect modern books are full of 10-page examples and stories for this reason. They don't have any new ideas, so they settle for stories. (A cliché reframed can be a revelation, though, but it requires more from the artist.)

Anything can be a line for a great comedian because they can describe it in a funny, new way.

Learn more:


Gardener vs carpenter

Alison Gopnik observes two opposing parenting approaches: the gardener and the carpenter. The carpenter tries to lead the child to a "good path", crafting them into a well-functioning human being, using all the parenting tricks in the book. A gardener is more concerned with creating an environment where the child can develop safely and healthily, letting the growth happen organically.

Carpenter parenting is one-way, where the adult molds the child into what they believe is best. Gardener parenting is interactive, where the adult creates the conditions in which the child can become themselves. Every parent knows that if you try to "force teach" something to a child, they will probably not learn it, but if you let them learn it themselves....

We underestimate how wise nature is. A child is the most effective learner in the world, by design - so let them learn on their own, instead of giving instructions they don't need, and trapping their minds in the process. By ascribing age-appropriate activities and scheduling the child's weeks like they were on a training camp, you rob them of their own autonomy and creativity. You can do more harm when you try harder!

A bird doesn't need a lecture on flying, and a child doesn't need you to explain how to play with cubes or a cardboard box. Where else are we trying so hard, when we probably should just relax a bit and let nature do its magic?

Learn more:

  • Book: The Gardener and the Carpenter by Alison Gopnik

Thermocline of truth

In large organizations, the key to succeeding is to do great work. Or, at least, to appear so. That's why the culture might seem fake: everyone is doing super-amazing, everything is roses and sunshine!

This is due to the thermocline of truth effect, whereby people tend to report only the positives and hush up the negatives. Just as warm water rises to the surface while cold water stays at depth, upper management believes everything is going great while the people doing the work know better.

It is hard to resist this effect, as everyone has an incentive to appear successful to their peers and managers. Therefore, things inside an organization resemble a watermelon: green on the outside, red on the inside.

Learn more:

Back to the list

Diseases of affluence

As you get wealthier, your life tends to get more comfortable. And as you remove some discomforts, certain diseases take their place. While wealth or affluence isn't the real cause - behavior is - you may have noticed some of these patterns:

  • More income > eat out / take-out more > restaurant food is often unhealthier than home-cooked > diseases of affluence
  • Don't feel like cleaning / mowing the lawn > more income means now you have a choice to hire someone > lose opportunities for exercise > diseases of affluence
  • Buy a car / electric bike instead of cycling or walking to public transport > diseases...

Even if you could afford to live more comfortably, should you?

"Advancements to make our lives less physically taxing have taxed us physically." - Katy Bowman

Learn more:

Back to the list

Precautionary principle

New things tend to be, by default, risky because limited evidence exists as to their full consequences. The precautionary principle is to remain skeptical and cautious of things that may have significant consequences, especially as some might be hidden and currently unforeseen.

The general idea is that when you change or add something, you introduce complexity, and we usually cannot predict all the results of this complexity. So there should be strong evidence or reason for change; otherwise, the potential negative results may outweigh the positives.

Suppose your friend recommends you stop eating meat and get your protein from insects instead, for ecological reasons. The precautionary principle here would have you question this recommendation, given that the health impacts of meat consumption are known but the impact of consuming insects in large quantities isn't.

Yes, you may receive the same amount of protein, but you would introduce significant change to the enormously complex system that is the body, so there are likely to be hidden consequences. Some of them might be positive, some disastrous, or maybe everything would be fine - we don't know!

The precautionary approach doesn't necessarily mean you're "stuck in your ways", never try new things and never innovate. Rather, it's a more risk-averse approach at trying new things and innovating; perhaps a more humble way to do things, as you appreciate the limits of human knowledge amid vast complexity.

Learn more:

Back to the list

Hysterical strength

Hysterical strength is the (anecdotal) observation that people exhibit superhuman strength under intense pressure. For example, an otherwise ordinary person lifting a car to save a family member stuck underneath. Hysterical strength is the "break in case of emergency" of human anatomy.

The implication here is that we have a huge reservoir of strength that we cannot tap into under ordinary circumstances. Why? Possible explanations:

  • Self-protection: we are strong enough to accidentally break our own bodies if our strength isn't regulated, so there might be an unconscious "Central Governor" that protects us... from ourselves.
  • Energy-conservation: if we could use 100% of our strength at will, would we have the strength when we really needed it? It's evolutionarily smart to always leave a bit of gas in the tank, just in case.

You can apply the idea to mental strength as well (sometimes called "surge capacity" in this context). You can work scarily hard when it's required of you. Today you think you're working your hardest; then next week multiple life stressors hit simultaneously, you must push through, and you realize you had probably double the capacity you thought you had.

But like with your body, there's probably a good reason why we cannot use this strength at will, at least easily. Intense stress causes mental damage like burnout and dissociation, similar to how intense physical stress leads to torn muscles and tendons.

How can knowledge of hysterical strength's existence benefit us?

  • Use constraints smartly to make progress faster than you thought possible. If you want a clean home, invite everyone over in 2 days. If you want to finish your presentation, schedule a meeting with your director for next week.
  • You don't need to give up when your body tells you to. If I'm shoveling snow or chopping wood and I get tired, I know I have at least 2 hours left in me. (Just don't get injured in the process.)
  • Take on bigger challenges, your mind and body can probably handle it. Just ensure you get enough rest after.

Many people who have been "tested" are in part happy about it, as they now know what they are capable of. It brings confidence to know that you are much stronger than you think.

Learn more:

Back to the list

Copyright trap

A copyright trap is usually a fake entry in one's original work, designed to trap copycats. If their work also has the fake entry, that means they copied your work - how else would they have the same fake information?

This technique can take many forms. A mapmaker can include a non-existent street or town. A dictionary can have fake words, while a telephone book could have fake phone numbers. A book could reference academic articles, facts or experts that don't exist.

(There are other cool, related instances like people who make rugs manually sometimes purposefully add a tiny flaw, to prove the rug was created by hand, not by a machine. Will something similar happen to writing – adding elements that the reader could fairly reliably assume an AI would not add?)

Elon Musk famously used a related technique, a canary trap, to find a Tesla leaker.

Learn more:

Back to the list

Kolmogorov complexity

Kolmogorov complexity measures the randomness or unpredictability of a particular object: it is the length of the shortest possible description that produces the object.

Consider the following strings of 20 characters:

  1. jkjkjkjkjkjkjkjkjkjk
  2. asifnwöoxnfewäohawxr

The first string has lower complexity, as you could describe it as "write jk 10 times", while the second string has no discernible pattern, so you could only describe it as "write asifnwöoxnfewäohawxr".
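
True Kolmogorov complexity is uncomputable, but compressed length serves as a rough practical proxy (an upper bound on the shortest description). A minimal sketch in Python, reusing the two strings above; zlib is just a convenient stand-in, not the shortest possible describer:

```python
import zlib

def compressed_size(s: str) -> int:
    # Length in bytes of the zlib-compressed string: an upper-bound
    # proxy for Kolmogorov complexity, which itself is uncomputable.
    return len(zlib.compress(s.encode("utf-8"), level=9))

patterned = "jk" * 10            # "jkjkjkjkjkjkjkjkjkjk"
patternless = "asifnwöoxnfewäohawxr"

# The patterned string compresses well; the patternless one doesn't.
print(compressed_size(patterned) < compressed_size(patternless))  # True
```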

You could assess the originality or quality of information via this type of complexity. For example, the shorter the book summary, the worse the book. If you can cut down 99% of the length while still containing the same key information, that tells something about the quality of the book.

By contrast, the longer the book summary, the better the book, as it implies there are more unique ideas that are taken to a greater depth, warranting more time to explain them properly.

Learn more:

Back to the list

Nutpicking

Nutpicking is a tactic to make the opposing side seem wrong/crazy.

Step 1: Pick the craziest, most extreme example or person from the opposing side

Step 2: Claim that this example/person represents the entire opposing side. Therefore, the entire enterprise must be crazy!

Of course, an extreme example doesn't represent the entire group - that's why it's an extreme example. But some have a hard time realizing this. If you see the nutpicking tactic being used, probably best to exit the conversation.

Learn more:

Back to the list

Semantic stopsign

A semantic stopsign essentially says: "Okay, that's enough thinking, you can stop here". It's a non-answer disguised as an answer, whose purpose is to stop you from asking further questions and uncovering the truth.

Imagine your friend claims you should rub mud on your face before heading outside, as it will protect your skin from the sun and keep it hydrated.

"Mud? Are you crazy? Why would I do that?"

"It's recommended by scientists."

"Oh... well in that case."

That's a semantic stopsign. It doesn't actually give you an answer as to why you should rub mud on your face; rather, its purpose is to stop you from asking more questions.

Learn to recognize these stopsigns and disobey them. Often the reason someone presents you with a semantic stopsign is because there's an uncomfortable truth close by. When a politician is asked a tough question and they answer "to defend democracy" or "to fight terrorism", they aren't really giving you an answer - they are saying "okay, that's enough questions, you can stop here".

When someone hits you with a stopsign, your natural reaction is to obey and move on. Next time, do the opposite and see where that leads.

Learn more:

Back to the list

Mean world syndrome

You may think the world is more dangerous than it is, as you've been consistently exposed to news about crime, violence and terror.

One may rationally know that the news is mostly about the extraordinary (the ordinary is rarely newsworthy), yet still unconsciously have a biased view of the world as a dangerous place.

Remember that for every instance of danger, another of kindness exists - though it's probably not televised. I believe whether you are pessimistic or optimistic is mostly a question of how well you understand this idea of unlimited evidence and limited portrayal.

Learn more:

Back to the list

Politician's syllogism

The politician's syllogism is of the form:

  1. We must do something.
  2. This is something.
  3. Therefore, we must do this.

Similar form:

  1. To improve things, things must change.
  2. We are changing things.
  3. Therefore, we are improving things.

Watch out for this fallacy in politics but also in corporate speeches and work meetings. The lure of acting is strong, even if sometimes not intervening might be for the best. Suggesting inaction rarely ends well in politics/business, hence the existence of fallacies like this.

Learn more:

Back to the list

Fruit of the poisonous tree

Fruit of the poisonous tree is a legal metaphor for evidence that is obtained illegally. The logic of the term is that if the source (the "tree") of the evidence, or the evidence itself, is tainted, then anything gained from it (the "fruit") is tainted as well. Therefore, one may not need to refute the evidence, just prove that it was gained illegally.

Similarly, at least in my mind, if you get money or fame in the wrong way, it doesn't count. The success becomes tainted and ceases to be success.

Learn more:

Back to the list

Autopoiesis

The term autopoiesis refers to a system capable of producing and maintaining itself by creating its own parts. A cell is autopoietic because it can maintain itself and create more cells. Imagine a 3D printer that could create its own parts and other 3D printers.

An autopoietic system is self-sufficient, thus very robust. Some people describe society and communities as autopoietic, maybe even describing bitcoin as an example of autopoiesis. But forget about following definitions too strictly for a second. If you apply the concept of autopoiesis liberally to your life, does it change how you shape your environment and lifestyle, your "personal system"? Does it help you become resistant to external circumstances?

I want to have as tight a hermetic seal in my life as possible:

  • Build my house and furniture from wood that grows in my backyard. If anything needs repairing, I already have the material.
  • Eat what I grow from my land, and use any food waste as compost for the land to create more food. (Or feed excess apples and crops to deer that I hunt for meat.)

Learn more:

Back to the list

Affordances

A crucial part of human learning is understanding how to interact with the environment. Designers of objects can make this easier for us. An affordance is a property or feature of an object that hints what we can do with the object.

  • If a cord has a button on it, you push it. If a rope hangs from a lamp, you pull it.
  • If the door has a knob, you turn it. If it has a handle, you press it down. If it has a metal plate and no handle, you just push the door.
  • If a word in an article is blue, you can click on it and be taken to a different page.

When you take advantage of affordances, people know what to do. No instructions needed. The less you need to explain, the better the design.

Learn more:

Back to the list

Fredkin's Paradox

Fredkin's Paradox states that "The more equally attractive two alternatives seem, the harder it can be to choose between them—no matter that, to the same degree, the choice can only matter less."

If you had two really different options, it'd be easy to choose between them. But when they are similar, choosing becomes difficult. However, if the two options really are similar, it shouldn't matter too much which you choose, just go with one. The impact of choosing one over the other will be so small that any more time spent analyzing is time wasted.

Because of Fredkin's Paradox, we spend the most time on the least important decisions.

Learn more:

Back to the list

Fosbury Flop

Before 1968, there were many techniques that high jumpers used to get over the bar, though with most, you'd jump forward and land on your feet. Then Richard Fosbury won the Olympic gold with his "Fosbury flop" technique which involves jumping backward and landing on your back. Ever since, the Fosbury flop technique has been the most widespread and effective technique in high jumping, and in fact, it's the technique you have probably seen on TV if you've watched high jumping.

A Fosbury Flop moment happens to an industry or area when a new innovation, approach or technique is introduced that then becomes the dominant option. These are breakthrough improvements that change the course of history.

Learn more:

Back to the list

First penguin

When penguins want food of the fishy kind, they may need to jump off an iceberg into the water. The problem is that they can't see what lies beneath - a predator might be waiting for a group of penguins to fall straight into its belly. The penguins need to decide whether to find food elsewhere or jump into the unknown.

This is where the "first penguin" (sometimes called the "courageous penguin") steps up. They take one for the team and jump off the cliff, while the rest wait to see whether the water is clear. The first penguin takes a huge personal risk to ensure the survival of the group.

Entrepreneurs and risk-takers take a similar leap into the unknown, perhaps with even worse odds. While each risk-taker may not survive (for example, few startups succeed), the world is a better place because these risk-takers exist.

Sebastião Salgado, Chinstrap penguins on icebergs between Zavodovski and Visokoi islands, South Sandwich Islands, 2009

Learn more:

Back to the list

Expected historical value

Paul Graham wrote on Twitter:

"Let's invent a concept.

Expected historical value: how important something would be if it happened times the probability it will happen.

The expected historical value of current fusion projects is so enormous. I don't understand why people aren't talking more about them."

This concept, though similar to the original concept of expected value, puts a spin on it. Expected historical value is a concept to assess the future importance of things today.
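
Graham's definition is just an expected-value calculation applied to historical importance. A toy sketch, with entirely made-up importance scores and probabilities:

```python
def expected_historical_value(importance: float, probability: float) -> float:
    # How important it would be if it happened, times the probability it happens.
    return importance * probability

# Hypothetical inputs: importance on a 0-100 scale, probabilities invented.
fusion = expected_historical_value(importance=95, probability=0.3)
next_phone = expected_historical_value(importance=5, probability=0.99)

print(fusion)      # 28.5 - a long shot, but historically enormous if it lands
print(next_phone)  # 4.95 - near-certain, but forgettable
```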

What thing today could become "history" in the future? Sure, it may not get much attention now, but could it be recognized by the future as ground-breaking? What innovation, technology or idea, little discussed today, will be discussed a lot in the future?

There's a personal life spin on this as well: What experience or moment today could become a fond memory 50 years from now? It may not be the moments you'd expect (like a fancy trip) but an ordinary moment (like going for a midnight swim).

Learn more:

Back to the list

Confusopoly

A confusopoly is an industry or category of products that is so complex that an accurate comparison between products is near-impossible.

If you try buying a mobile phone from a telecom operator, you have endless options across phone tiers and texting and data limits. Or if there's a new video game launching, you could get an early-bird version, or the early-bird collector edition, or just the collector edition, or maybe a deluxe version which has some features of the other options but not all of them. If you try to buy insurance or banking/finance products, the options are so confusing that you're inclined to just choose one at random and stick with it for life.

My favorite example is toilet paper. Different brands have a different number of rolls in their packages, and they may have a different amount of paper in each roll, and maybe they'll have a value pack, but the normal pack may have a "get 1 roll free" promotion, but the other brand has a 20% discount, and then there's the matter of softness and thickness... In the end, you just give up and go with the most visually pleasing packaging.

Learn more:

Back to the list

Applause light statements

An applause light statement is designed to gain the support or agreement of an audience, the same way an actual applause light or sign indicates to the studio audience that this is the part where you should start clapping.

Usually there isn't much substance behind an applause light statement, and you can spot this by trying to reverse the statement. Consider someone saying "we need to ensure technology benefits everyone, not just a few people". If you reverse it, you get "we need to ensure technology benefits a few people, not everyone". Since the reversed statement sounds odd and abnormal, the original statement is probably conventional knowledge or the conventional perspective and doesn't contain new information. The statement isn't designed to deliver new information or substance, just to gain support.

An abundance of applause light statements is characteristic of empty-suit business talk, the "inspirational speaker" and the charismatic politician. Very little substance, merely an attempt at influencing perception. Sometimes, though, applause light statements can be useful, like when you want to establish common ground or set the stage for an argument. But most of the time, these statements are used merely as the "easy way" to write a speech and influence an audience.

And yes, applause light statements are more common in speech than in writing. In writing, the reader has more time to process what you said and call you out if you have zero substance behind your words. In speeches, you don't have this luxury, and it's pretty hard to hold a negative opinion of a speech if everyone else around you was nodding and applauding the entire time.

Eliezer Yudkowsky, who coined the term, wrote a tongue-in-cheek speech using only applause light statements. It looks eerily similar to just about every speech I've ever heard:

"I am here to propose to you today that we need to balance the risks and opportunities of advanced artificial intelligence. We should avoid the risks and, insofar as it is possible, realize the opportunities. We should not needlessly confront entirely unnecessary dangers. To achieve these goals, we must plan wisely and rationally. We should not act in fear and panic, or give in to technophobia; but neither should we act in blind enthusiasm. We should respect the interests of all parties with a stake in the Singularity. We must try to ensure that the benefits of advanced technologies accrue to as many individuals as possible, rather than being restricted to a few. We must try to avoid, as much as possible, violent conflicts using these technologies; and we must prevent massive destructive capability from falling into the hands of individuals. We should think through these issues before, not after, it is too late to do anything about them..."

Learn more:

Back to the list

Conjunction fallacy

When something is more specific, more detailed, we tend to think it is more probable, even though specific things are actually less probable. This is the conjunction fallacy - not understanding that specificity and probability are inversely related: every condition you add to a claim can only lower its probability.

For example, consider your standard marketing prediction article - they'll say something like "next year, more will be spent on digital advertising as big companies finally shift their TV budgets to online channels". This may sound logical, but, of course, it is more likely that companies will spend more on digital advertising next year for any reason than for that one specific reason.
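
A quick simulation makes the asymmetry concrete. The numbers below are invented purely for illustration; the point is structural: the narrow event ("spend more because of a TV-budget shift") can never occur more often than the broad event ("spend more, for any reason"):

```python
import random

random.seed(0)  # deterministic, for reproducibility

trials = 100_000
spend_more = 0           # broad event: spends more on digital ads
spend_more_tv_shift = 0  # narrow event: spends more AND the cause is a TV shift

for _ in range(trials):
    increases = random.random() < 0.6             # made-up base rate
    via_tv = increases and random.random() < 0.3  # made-up conditional rate
    spend_more += increases
    spend_more_tv_shift += via_tv

print(spend_more / trials)           # ≈ 0.6
print(spend_more_tv_shift / trials)  # ≈ 0.18, necessarily ≤ the broad rate
```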

Details place your mind and gut against each other: your gut wants to believe the narrative because it sounds more believable with all those details - you can follow the steps to the conclusion. But your mind should realize that the more specific the prediction or narrative, the less you should believe it.

Imagine someone - perhaps a billionaire or someone with a big Twitter following - explaining their success. The more specific the "blueprint" they claim is the reason for their success, the less probable it actually is that their success can be attributed to it. The secret formula for their success sounds more convincing as they add detail, but it also becomes less probable as the actual cause for their success. Kind of defeats the purpose of buying their "7-step framework" master course, no?

Learn more:

Back to the list

Wittgenstein's Ruler

If you're measuring your height with a ruler and it shows you're 4 meters tall (13 feet), you gained no useful information about your height. The only information you gained is that the ruler is inaccurate.

Wittgenstein's Ruler is the idea that when you measure something, you are not only measuring the measured (your height), but also the measurer itself (the ruler). You might be getting more information about the measurer than the measured!

Another example: the fact that an employee you know to be brilliant receives a mediocre performance rating doesn’t necessarily mean they are mediocre, it could simply mean the performance rating process is broken.

Learn more:

Back to the list

Potemkin village

A Potemkin village is a country's attempt to signal that it's doing well, even though it's doing poorly.

Wikipedia: "The term comes from stories of a fake portable village built by Grigory Potemkin, former lover of Empress Catherine II, solely to impress the Empress during her journey to Crimea in 1787. While modern historians agree that accounts of this portable village are exaggerated, the original story was that Potemkin erected phony portable settlements along the banks of the Dnieper River in order to impress the Russian Empress; the structures would be disassembled after she passed, and re-assembled farther along her route to be viewed again as if another example."

Consider a more modern example of Turkmenistan's capital city, Ashgabat, with enormously expensive buildings, hotels, stadiums, indoor ferris wheels and 18-lane roads... that no one uses. Or consider how during the 1980 Olympics in the Soviet Union, they had painted only the road-facing walls of houses on the way to Moscow. Why bother with the entire house when one wall creates the perception? Similar stories exist from newer Olympics; global events are a juicy opportunity for government signaling.

The term Potemkin village has been used more metaphorically to refer to any construct aimed at signaling you're doing well when you're not. For example, consider a creator trying to signal that their newsletter is doing well - they'll talk of "readers" and visitors to their website and other Potemkin metrics, but they won't talk of subscribers or money.

Learn more:

Back to the list

Charitable interpretation

Imagine someone presents an argument to you. You have roughly two ways to interpret it:

  • Charitable interpretation: Consider the strongest, most solid version or implication of the argument. You start by assuming that the argument might be true, and you try your best to see how it could be.
  • Uncharitable interpretation: Consider the weakest possible version of the argument. You start by assuming that the argument is wrong, and you try your best to confirm this assumption.

Open-minded people generally interpret what they hear charitably while close-minded people stick to what they think is right and don't bother entertaining other arguments. An open-minded person asks "how could this be right?", a close-minded person asks "how is this wrong?"

Especially on Twitter, snarky repliers will interpret your tweet in the least favorable way possible, identify a weakness in your tweet (which is easy when you've already weakened the tweet with your interpretation), then feel smart when they point out that weakness to you. Of course, everyone else thinks them a fool.

It would make one seem much smarter if they interpreted the tweet in the most charitable way possible, then identify a weakness in it and express it clearly.

Learn more:

Back to the list

Rare enemy effect

The rare enemy effect is a niche but interesting phenomenon in predator-prey interactions.

Consider the case of earthworms. Their number one enemy is the mole (a mole eats almost its body weight in worms each day). So worms have adapted to recognize an approaching mole by sensing the vibrations it makes as it digs through the soil. When they sense one, the worms escape to the surface, where mole encounters are unlikely.

However, other predators of worms have figured this out. For example, seagulls perform a "dance" to vibrate the ground, bringing unwitting worms up to the surface, ready for snacking. Wood turtles do a similar thing. Humans, in a process called "worm charming", drive a stick into the ground and vibrate it to summon the worms, often to collect them for fishing (though some do it purely for sport).

The rare enemy effect happens when Predator A exploits the prey's anti-predator response to Predator B (the main predator). Predator A must be a rather rare encounter for the prey (hence the name of the effect); if it were a main predator, the prey would eventually develop an anti-predator response to it, too.

If you look closely, you can see a similar effect in the human realm, too:

  • One (rather known) trick burglars may perform on you is very similar to what seagulls do. Imagine a man in a suit knocks at your door, informing you that burglars have been spotted in this neighborhood. But you can protect yourself by buying his company's security system. Worry not - before you commit to anything, he can do a free security assessment right away, inspecting potential points-of-entry (and where you keep your valuables). Your anti-predator response is to protect yourself from the burglars, so you let the man in. Whether you sign any contracts or not, you may find your house broken into within a few days.
  • A similar scam happens digitally all the time. You receive an email, saying that your passwords or information security is in danger, and scammers can steal your money or data if you don't act. Of course, acting in this case means paying another scammer, or giving him access to your or your company's accounts, where he will change all passwords and require ransom to let you back into your own systems.
  • Naturally, there are legal forms of exploiting one's anti-predator response, namely fear-mongering advertisements or the exaggerating (real) door-to-door home security salesman.

And do not think that these processes are limited to capturing your money; they want your support, information, attention and time, too. If there's a predictable anti-predator response, there's someone who exploits that response, legally or illegally.

Learn more:

Back to the list

Shibboleth

A shibboleth is a word or a phrase that distinguishes one group of people from another.

Wikipedia has many examples (some rather funny): 

  • Some United States soldiers in the Pacific theater in World War II used the word lollapalooza as a shibboleth to challenge unidentified persons, on the premise that Japanese people often pronounce the letter L as R or confuse Rs with Ls. A shibboleth such as "lollapalooza" would be used by the sentry, who, if the first two syllables come back as rorra, would "open fire without waiting to hear the remainder".
  • During World War II, a homosexual US sailor might call himself a "friend of Dorothy", a tongue-in-cheek acknowledgment of a stereotypical affinity for Judy Garland in The Wizard of Oz. This code was so effective that the Naval Investigative Service, upon learning that the phrase was a way for gay sailors to identify each other, undertook a search for this "Dorothy", whom they believed to be an actual woman with connections to homosexual servicemen in the Chicago area.
  • In Cologne, a common shibboleth to tell someone who was born in Cologne from someone who had moved there is to ask the suspected individual, Saag ens "Blodwoosch" (say "blood sausage", in Kölsch). However, the demand is a trick; no matter how well one says Blodwoosch, they'll fail the test; the correct answer is to say a different word entirely; namely, Flönz, the other Kölsch word for blood sausage.

In many professions, one may have shibboleths as well, to distinguish those who are more experienced or knowledgeable from those who are not. These may not be used in a password-like manner, rather in a manner like this: "If they use certain words or concepts correctly, it's safe to assume they aren't new to this field".

Other times, shibboleths are used as signals of one’s belonging to or belief in a cult – like “wagmi, gm, fud, diamond hands” in the NFT/crypto space.

Learn more:

Back to the list

Relevance theory

Relevance theory explains why a word or a phrase can convey much more information than its literal version. For example, imagine you own a really bad car - it breaks down often, looks ugly and it can barely hold itself together. You swear you'd demolish it yourself if only you had money to buy a new one. Your neighbor asks what you think of your car and you say "Well, it's no Ferrari". What you communicate is very little, but they understand the message.

The reason we can convey more than we say is self-assembly. When something is being communicated to you, you add relevant information from the context to that message (who said it to you, how they said it, who else is there, what has been said before, previous experiences...) until you arrive at the final message. So only a small part of the intended message needs to actually be said explicitly, everything else can happen due to self-assembly. Just think of an inside joke - one word can be enough to convey an entire story.

Once you understand this effect, you have a framework to understand different types of communication.

For example, a good joke gives away a strong enough connection that you understand what is meant, but the connection must be weak enough as not to ruin the punchline. A joke where the connection is too strong isn't fun because you can guess the punchline; there is very little self-assembly required by the receiver of the joke.

A good aphorism (or good advice) gives away a practical enough idea that you can apply it to your own experiences, but it must be abstract enough that you come up with the application yourself. Consider very wise advice: it's never straightforward "do-exactly-like-this" communication, you usually need to decipher the message, to construct your own meaning of it. It's more impactful to make you realize what to do yourself than to tell you exactly what to do.

In advertising, there is an incentive to communicate the most with the least amount of information or time. The bigger the portion of the message that can be outsourced to the context, the better for the advertiser, since the receiver then does more of the self-assembly in their head. This leads to a sort of mental IKEA effect: they'll enjoy your message more now that it isn't pushed into their head but co-created by them.

"How could I not believe their knives are sharp - I practically came up with that thought!"

Generally, the smaller the ratio of what is communicated to what is understood, the better the aphorism, advice, joke, tagline or ad.

Learn more:

  • Wikipedia
  • Inception (Planting "seeds" into someone's head. Plus, it's a good movie)
Back to the list

Category thinking

You're engaging in category thinking when, instead of trying to solve one problem, you try to solve a whole category of problems.

For example, in computer security, you can look at risks within individual systems, or you can look at risks within an entire category of systems, for example, every system that uses Log4j. You can consider what happens if one autonomous vehicle gets hacked, or you can consider what happens if all of them are hacked simultaneously because of a security issue affecting every autonomous vehicle.

In your personal life, you could tackle all of your health problems individually, maybe buying medicine for one problem and changing your diet to tackle another problem. Or you could address the entire category by minimizing evolutionary mismatch, to live more like humans have evolved to live.

Category thinking requires a mindset shift, to abstract one level upwards. You need a different set of solutions for an individual problem vs the whole category; it's a different beast altogether to help one homeless person than to help all of them.


Evolutionary mismatch

Evolutionary traits generally change more slowly than the environment. Therefore, traits that were once helpful can become harmful because the environment is different. When this is the case, we talk of evolutionary mismatch.

While evolutionary mismatch can and does happen in animals and plants, humans are a prime example because our environment changes so rapidly.

  • We evolved to value sweet and fatty foods. But now this trait is a disadvantage because sugary and fatty foods aren't scarce anymore - you can get them delivered to your door with one button.
  • Our bodies evolved to be alert under stress, such as when hunting. But now we have chronic stress, and the constant alertness leads to burnout and health issues.
  • We evolved to value newness: for example, a new female/male is a new chance at spreading your genes, and new information in the form of gossip is valuable in choosing whom to trust or mate with. But now we can get 10,000x the newness and stimuli over the internet (social media, news, porn...), leading to addiction and mental health issues.

It is important to understand that we essentially have a hunter-gatherer brain and body in a modern society, so there is a huge (and rapidly increasing) mismatch. Obesity, most health issues, most mental health issues, addiction, lack of meaning, loneliness... many ailments of the body or brain can be due to evolutionary mismatch.

This is not to say that everything was perfect when we lived in huts and that everything is wrong now, but that we evolved in the former environment and therefore it is only logical that issues emerge when we live in environments we didn't evolve to live in. That's the idea behind evolutionary mismatch.

So if you have issues, that's not necessarily your fault. It's like blaming a penguin for not thriving in a desert.


Supernormal stimuli

Consider these findings from Niko Tinbergen, the researcher who coined the term supernormal stimulus:

  • Songbirds lay small, light blue eggs. The exaggerated, supernormal version of those eggs would be a huge, bright blue dummy egg. When such a dummy was shown to the songbirds, they would abandon their real eggs and instead sit on top of the dummy; they chose the artificial over the real because the artificial was more stimulating.
  • There's more: the songbirds would feed dummy chicks over their real chicks, if the dummies had wider and redder mouths. The chicks themselves would rather beg food from a dummy mother than from their real mother, if the dummy had a more stimulating beak.

Supernormal stimuli are exaggerated versions of the things we evolved to desire. They hijack the weak spots in our brains by supplying so much of what we desire that we don't want the boring, real thing anymore. Thus, we act in evolutionarily harmful ways - much like the bird that abandons its eggs. When you can eat pizza, who wants broccoli? When you can have 500 porn stars, who wants a "normal" mate? Why bother with real life, when the game world offers much more fun for much less effort?

While there are physical superstimuli, like drugs and junk food, we must be especially careful with digital superstimuli because there is virtually no limit to how much stimulation can be pumped into you. A pizza can contain only so much fat and salt and other stuff that makes your brain go brrr... But there's no limit to how stimulating a video, game or virtual world can be. Nature has set no ceilings; it has only determined what we are stimulated by, so we always want more, at all costs.

We're already approaching a point with VR porn and "teledildonics" where one may rather mate with the technology than with a real person. Some would much rather move a video game character than their real bodies. At some point, we may create artificial children (perhaps in the metaverse) who are simply much cuter than real babies, and much less trouble. As fewer and fewer people have babies - intentionally or unintentionally - we realize that we aren't much smarter than the birds sitting on huge blue balls instead of their own eggs. As I write in a longer post, superstimuli may be the thing that leads to human extinction.


Minority rule

Nassim Taleb popularized the idea of minority rule, where change is driven not by the majority but instead by a passionate minority. If something a minority insists on is fine with the majority, that generally becomes the norm.

For example, imagine you're a baker and want to sell buns. It doesn't make sense for you to make two sets of buns - one lactose-free and one "normal" - because everyone is fine eating lactose-free buns, but not everyone can eat your buns that include lactose. It's just easier to make everything lactose-free. So almost all pastries you find are lactose-free (at least in Finland).

It's key to notice the asymmetry at play: the majority must be quite indifferent to the change while the minority must be quite inflexible about it.

The idea of minority rule can be seen everywhere.

  • We have pronouns next to people's names on social platforms and conference badges not because the majority insisted on it, but because a very passionate minority did.
  • Lunch in many cafés and restaurants is increasingly plant-based because vegetarians will take their money elsewhere otherwise, and non-vegetarians don't generally mind if their lunch has fewer meat options, or none at all on some days.
  • In corporate life, you can usually change something if you're passionate enough about it. Most people don't really care that much, so if it's a deal-breaker for you, they'll let you change it (especially if they don't need to work on it).


Convergent evolution

Convergent evolution occurs when different species independently evolve similar features. For example, bats, flies and many birds have each independently evolved wings and the ability to fly.

It's much the same with human civilizations that had never been in contact with each other: they independently invented writing and the idea of stacking big bricks on top of each other in a pyramidic fashion. (No, there's no conspiracy - just convergent evolution.)

If you see the same pattern in independent groups, there's probably a reason for that pattern. If most flying animals developed wings, that's a good sign that wings are one of the best structures for flying. If most swimming animals like fish and dolphins have a similar, streamlined shape, that's probably the shape that works best in water.

Similarly, if you see a pattern repeating itself across disciplines, that's a sign that there's something to that pattern. Consider this tweet from Naval:

"Convergent evolution is a validation mechanism."


Compulsion loop

A compulsion loop is a sequence of activities that keeps you repeating that same sequence. The loop rewards you for every completion with a bit of dopamine, keeping you hooked.

[Image: gaming compulsion loop, from Wikipedia]

The point of most games (and gamified systems) is to keep you on this loop for as long as possible, before you realize what they are doing.

While this kind of loop is most easily noticeable in games, you can see a similar loop underlying nearly all compulsive behavior that isn't due to physical causes (like drugs or illnesses).

  • Social media loop: post things, get dopamine, repeat.
  • Consumerist loop: buy things, get dopamine, repeat.
  • Eating loop: eat junk, get dopamine, repeat.


Kayfabe

Kayfabe is a term most closely associated with professional show wrestling. It's a code word for fake: an agreement among insiders not to admit to outsiders that it's fake.

Eric Weinstein has popularized the term in intellectual circles and compared politics to pro wrestling, in the sense that maintaining perceptions seems to be the main objective. For example, consider a President who wants to get re-elected or win back public acceptance. They want to highlight all the good they've done for the country. The President could claim responsibility for something positive (like economic growth) even when they've had no part in it, or they could outright lie that the country/economy is doing well, even when it isn't. Reality doesn't matter, perception does.

One of the reasons kayfabe exists is our desire to believe the fakery. When we expect too much of reality, we create demand for lies; it simply becomes too difficult to meet our expectations in reality, so they are met merely in perception. We don't care about the difference because we are desperate to get our expectations met. The average citizen wants their country to succeed, and the average politician wants to stay on the citizens' good side - so even if the country is doing poorly, it's preferred for both parties to perpetuate a common fantasy.


Ostrich effect

The ostrich effect happens when you deliberately try to avoid negative information or thoughts. A form of "if I don't think about it, it can't hurt me" thinking.

Wikipedia says that "the name comes from the common (but false) legend that ostriches bury their heads in the sand to avoid danger."

Examples:

  • You aren't happy right now, but thinking about the meaning of your life is stressful and can make you regret your choices. So you decide not to think about it, believing it will solve the problem.
  • You can feel your relationship going downhill, but it's scary to talk about it. So you try your best to ignore the bad signs.
  • You've been spending a bit generously today, so you avoid looking at your bank account for the rest of the day.


Floodgate effect

A floodgate effect occurs when you permit a small thing, which then leads to the permission of many more things.

The name comes from the way a floodgate works: if you open the gate even a little bit, water will flow in until both sides of the gate are in equilibrium. Even the tiniest permission will lead to total permission.

Examples of the floodgate effect:

  • If you do a favor for a certain type of person, they'll always be asking for more favors. In Finnish, we say "I gave my pinky, they took the whole arm" (I believe the English equivalent is "give them an inch and they'll take a mile")
  • A drop of alcohol to a recovering addict can lead to total relapse
  • If you allow one citizen to burn the American flag, you'll allow all of them to do that.

As you can see from the examples, the small permission creates a precedent, which is then exploited/leveraged repeatedly, leading to a flood of such events.


Bothsidesism

Bothsidesism occurs when the media presents an issue as being more both-sided, contested or balanced than it really is. For example, there is overwhelming evidence that climate change is real, even though some (very few) scientists disagree. By giving equal voice to scientists on both sides, the media creates an illusion that the issue is contested or a matter of perspective, even though it isn't.

Similarly, some media outlets may blow the minority opinion out of proportion, making it seem like there are two equally strong sides, even though this isn't the case.

Ironically, bothsidesism is a bias that occurs when the media tries to minimize bias (by presenting both sides of the argument). The average media consumer may feel more informed (since the issue seems more nuanced and complicated), but actually they are misinformed.

We are often told that it is good practice to include both sides of an argument and consider all perspectives. However, on some issues, it's better not to give equal weight to both sides or to "come to a compromise". One can consider all arguments without giving equal importance or merit to each.


Paperclip maximizer

The paperclip maximizer is a thought experiment that goes something like this:

"Imagine you task an AI with creating as many paperclips as possible. It could decide to turn all matters of the universe into paperclips, including humans. Since an AI may not value human life, and it only cares about maximizing paperclips, the risk to humans is great."

The paperclip maximizer is an example of instrumental convergence, the idea that an AI could seek to fulfil a harmless goal in ways that are harmful from the human point of view. Harnessing a true AI to simple, seemingly innocent tasks could pose an existential risk to humans if we don't know how to reliably make a machine value human life.


Perverse incentive

An incentive is perverse if it leads to unintended and undesirable results. The solution (incentive) makes the problem worse.

The Cobra Effect is the most famous example. From Wikipedia:

"The British government, concerned about the number of venomous cobras in Delhi, offered a bounty for every dead cobra. Initially, this was a successful strategy; large numbers of snakes were killed for the reward. Eventually, however, enterprising people began to breed cobras for the income. When the government became aware of this, the reward program was scrapped. When cobra breeders set their now-worthless snakes free, the wild cobra population further increased."

People will seek to satisfy the measurable target to get the incentive, regardless of the method. Thus, a poorly designed incentive can lead to the opposite of the intended effect. There are various examples in sales representatives' bonuses, journalism, alcohol/drug prohibition, all kinds of buyback programs...


Redundancy in biology

Redundancy protects against uncertainty and shocks. Unsurprisingly, then, you'll see redundancy all over biology: the organisms we observe are the ones that survived uncertainty and shocks. Here are just a few ways nature creates redundancy:

  • Degeneracy
  • Extra pairs
  • Functional equivalence

We talk of degeneracy when two parts perform different functions under normal conditions but are capable of performing the same function when needed. Degeneracy is nature's way of ensuring the failure of one part doesn't doom the entire entity.

  • If one part of your brain is damaged, another part of your brain could take over the function of the damaged part. (See neuroplasticity)

We have an extra pair of some organs, which is a form of redundancy: if we lose one eye, we still survive. While degeneracy refers to parts that perform different functions in normal conditions, an extra pair performs the same function all the time.

  • There are two kidneys, two ears, two eyes, two lungs... but we can survive with one.

Functional equivalence is like an extra pair, but at an ecological scale. Multiple species share the same function, so the loss or weakening of one species won't jeopardize the entire ecosystem.

I find it telling that nature has not just one, but multiple levels of redundancy. If we are to survive long-term, we ought to copy nature (biomimicry is the term) and add redundancy to our own lives.

  • In engineering, there's a similar concept of redundancy: if a function is critical, it's good practice to have a backup component that can perform it in case the main component breaks.
  • In teamwork, it may be a good idea to teach a critical function to multiple people, so that if the main employee is absent, another person can take over.
  • With money and time management, it's a good idea to have slack in the system. If all of your money is in assets that could go to 0, you are one shock away from bankruptcy.


Biological bet hedging

Biological bet hedging is nature's method of diversification.

  • A plant could go "all in" this year and have all of its seeds germinate. But if this year's environment is not optimal, all of the plant's seedlings could die. Instead of this risky endeavor, the plant could have half of its seeds germinate this year and half next year, increasing its chances of survival.
  • A bird may produce eggs of different sizes, each size being optimal for one potential environment of the offspring. This means that some of the offspring will be at a disadvantage, but as a collective, it is unlikely none will survive.

The purpose of bet hedging isn't to maximize short-term success, but long-term survival. If an individual doesn't hedge their bets, one unexpected and disastrous event (black swan event) could wipe out all of its offspring, potentially ruining chances of survival for that individual's genes.

If you're into investing, you probably see parallels here.
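
The arithmetic behind bet hedging can be sketched with a toy simulation (the 50% bad-year probability and the one-vs-two-batch plans are my own illustrative assumptions, not from any biology source): germinating everything in one year risks total loss, while splitting germination across two years roughly halves the extinction risk.

```python
import random

random.seed(2)

BAD_YEAR_PROB = 0.5  # in a bad year, every seedling that germinated dies

def lineage_survives(batches):
    """The lineage survives if at least one batch of seedlings
    happens to germinate in a good year."""
    return any(random.random() > BAD_YEAR_PROB for _ in range(batches))

def survival_rate(batches, trials=100_000):
    return sum(lineage_survives(batches) for _ in range(trials)) / trials

all_in = survival_rate(batches=1)  # all seeds germinate the same year
hedged = survival_rate(batches=2)  # half this year, half next year

print(all_in, hedged)  # roughly 0.5 vs 0.75
```

Hedging sacrifices the best case (some seeds sit idle in a good year) to avoid the worst case, which is exactly the long-term-survival trade-off described above.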


Crab mentality

Someone exhibits "crab mentality" if they try to drag down people who are doing better than they are. A form of "if I can't have it, neither should you" thinking.

Wikipedia says that "the metaphor is derived from a pattern of behavior noted in crabs when they are trapped in a bucket. While any one crab could easily escape, its efforts will be undermined by others, ensuring the group's collective demise."

[Image: live crabs in a bucket]

For example, if one member of a family succeeds, the siblings may become envious and try to undermine their success. A similar effect may occur in friend groups if one friend gets a new job and starts earning twice as much as the others. Or on a societal level: some people don't want humanity to flourish because they themselves aren't where they want to be!


Cargo cult

"Cargo cults are religious practices that have appeared in many traditional tribal societies in the wake of interaction with technologically advanced cultures. They focus on obtaining the material wealth (the "cargo") of the advanced culture by imitating the actions they believe cause the appearance of cargo: by building landing strips, mock aircraft, mock radios, and the like." (From Wikipedia)

While this may sound ridiculous, we do the same thing. We imitate the morning routines of a billionaire or the writing process of the nearest writing guru, thinking that will bring us the same results. We read the books Warren Buffett recommends, hoping we'll become like him in the process. Or we rename the project manager a "scrum master" and the weekly meeting a "sprint", thinking that will give us the results of agile project management.

There's Cargo cult programming, Cargo cult science, Cargo cult startups and, most likely, an equivalent to whatever field you can think of.

Whenever you copy the obvious, surface-level behaviors but not the deeper behaviors (the ones that actually produce the results), you can describe the situation with the Cargo cult metaphor.


Zugzwang

Zugzwang is a situation in board games where you must make a move, yet any move you can make will worsen your position. For example, in checkers you must capture the opponent's piece when you have the opportunity to do so, and a clever opponent can thus lay a trap that you must fall into.

Situations in real life (outside of board games) resemble a zugzwang when anything you do will make the situation worse, and you don't have a choice not to make a choice.

For example, imagine you've lied to your employer that you know Excel, to get the job. Now they want you to complete a long Excel task within the week. Either you tell them you've lied all along, or you learn Excel and work day and night to complete the task on time - probably still producing sub-standard work. You're in zugzwang.

A more philosophical example: imagine you hate capitalism. You can either live contrary to your values or take a great personal cost by trying to exit the system - though you never completely can. Any move you make isn't ideal, and you can't not play.


Victim playing

Victim playing occurs when you try to make yourself look like a victim, perhaps to gain sympathy, avoid hard work, manipulate others or protect your self-esteem. Usually you do this by exaggerating a mild obstacle or by faking an obstacle altogether.

There's huge support for sentiments like "life is hard" and "adulting is hard" because it's easier to play the victim than to do something about the problem. It's easier to say "I'd love to catch up but I'm sooo busy with work" than to arrange your affairs so that you'd have time to meet.

It seems the pressure to succeed is so strong that some people secretly hope for a health issue or injustice to explain why they aren't meeting those unrealistic expectations. In absence of a "better" reason, they'll procrastinate so at least they'll be a victim of time - they couldn't have possibly succeeded, not with that schedule.

Certain people tend to believe things that take away personal responsibility for their lives. “Oh don’t bother learning to code, AI will take up all programming jobs soon anyways”, “don’t bother with your retirement plan, Collapse is coming”, “you can’t remove microplastics from your life, no use trying to limit them in your home”...

Victim playing is, in many ways, the default option because very few people call you out on it. You're the "victim", after all - not many dare question that, lest they be seen as victim-blamers. Thus, victim mentality (1) becomes more ingrained in a person, because it is a strategy that consistently works, and (2) becomes more accepted in society, infecting more people, because you can't call people out on it.


Theory of Mind

Did you know you can read people's minds? This mind-reading ability, otherwise known as theory of mind, is something we develop between roughly ages 1 and 5 (depending on your definition).

Imagine you, a friend and a 2-year-old are eating Pringles. The friend leaves for another room. You empty all of the Pringles from the tube into the trash and fill the tube with pencils. Once the operation is done, your friend comes back to the room. You ask the 2-year-old: "What does my friend think is in the tube?"

A child without theory of mind would say "pencils". A child with theory of mind would say "Pringles", knowing that your friend believes so even if reality is different.

A complete theory of mind is important for empathy, anticipating other people's reactions, understanding beliefs, communication and so much more. If you think about it, you're constantly mind-reading people, even without being aware of it.

There are various mind-reading tasks we engage in, and we usually learn them in this order:

  • Understanding that people want things (for example, a baby wants a toy when they reach for it)
  • Understanding that people can think things differently from me (for example, my favorite color can be red and theirs green)
  • Understanding that people can behave differently based on their experience (for example, if someone wasn't shown that there's candy in a box, they wouldn't know where to look if someone asked them to fetch candy)
  • Understanding that people can have false beliefs (the Pringles example above)
  • Understanding that people can fake or hide their emotions (for example, mommy can be sad even if she isn't crying)


Vestigiality

Vestigial structures are those that used to serve a purpose but, through the process of evolution, no longer do. For example, ostriches and emus have wings, but they are no longer used for flying; the wings are vestigial. So are human wisdom teeth.

One might think of vestigiality in matters beyond anatomy: 

  • As a startup grows, some processes or structures become vestigial.
  • As you grow mentally, some of your learned habits and beliefs become vestigial.

Many trauma victims have built up coping and protection mechanisms that got them through tough times, but once they're through, they haven't learnt to let go of them. Many overachievers have a hard time slowing down and enjoying life because constant overachieving is what got them where they are now!

It pays to occasionally look at your life and think, does this serve a purpose anymore?


Hormesis

Hormesis, in biological processes, is a phenomenon where a low dose of x produces the opposite effect of a high dose of x.

For example, low doses of stress energize you and increase your performance. Very high doses of stress paralyze you and decrease performance. Low stress on your muscles (working out 3 times a week) strengthens your muscles, while very high stress (working out 3 times a day) leads to injuries.

One may apply the same idea outside of biological processes: a low dose of your hobby (5h a week) may make you happy, while turning that hobby into a job (40h a week) may make you miserable. Seeing your friends once a week produces a different result than living with them 24/7.
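
As a sketch, the inverted-U shape of hormesis can be captured with a toy function of my own (an illustrative model, not a physiological formula): a benefit that grows linearly with the dose minus a harm that grows with its square, so low doses come out net positive and high doses net negative.

```python
def response(dose, benefit=1.0, harm=0.1):
    """Toy hormetic dose-response curve (illustrative only):
    linear benefit minus quadratic harm."""
    return benefit * dose - harm * dose ** 2

print(response(3))   # low dose: net positive (stimulation)
print(response(15))  # high dose: net negative (inhibition)
```

With these made-up parameters the response peaks at a moderate dose and then turns negative, which is the whole point: the same x helps or harms depending only on the dose.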

[Image: hormesis dose-response graph, from Wikipedia]

The dose-response relationship, a related term, describes how something harmless can become harmful, given a high enough dose or enough exposure. To be more precise, there aren't necessarily "harmful" and "harmless" things, just harmful and harmless doses of things. It is the dose or exposure that matters.


Russell Conjugation

Consider this classic example:

"I have reconsidered the matter, you have changed your mind, he has gone back on his word."

The core idea behind each of these statements is the same, but the emotional connotations of the specific word choices differ.

With Russell Conjugation, you can manipulate someone's opinion without falsifying facts; you can steer their perspective while still remaining "objective". Once you understand this concept, you see the news doing it all the time.

Russell Conjugation (think of it as emotional loading) can exist because different words or phrases can factually refer to the same thing yet differ emotionally.


Gell-Mann amnesia

When you know marketing, you understand most marketing advice online is bullshit. When you're an expert in statistics, you see how the news gets the numbers twisted. If you're a professional football coach, it pains you to hear people at the bar talk about football - such are the inaccuracies in their discussion.

Then you read the next thing in the newspaper or on Reddit or Twitter - a topic you're not an expert in - and you blindly accept the information as fact.

Gell-Mann amnesia is the tendency to forget how unreliable the news - or any information - is. We're reminded of the inaccuracy whenever a topic we know a lot about comes up, yet we're not critical enough when we face a less familiar topic.


Overton window

The Overton window describes the range of topics or opinions that are acceptable to talk about in public.

What we deem "controversial" or "extreme" isn't actually extreme; it's just one end of the Overton window. Truly extreme opinions aren't shared publicly, for fear of social condemnation.

[Image: Overton window diagram, from Wikipedia]

If you understand the Overton window, you understand that there exist opinions and policies that aren't talked about in the mainstream. What you hear isn't all there is.

The term originally concerned public policy, but it's applicable to all matters of free speech.


Convexity (antifragility)

If there is low or limited downside and high or unlimited upside, the situation is convex.

For example, imagine you're looking for a job. If you send a prospecting email to a company, asking for employment, there is asymmetry in the outcomes: low downside (you risk maybe an hour and a bit of ego) but high upside (a job).

If you promote your drawing on Reddit, you have limited downside (maybe 5 minutes of creating the post) but virtually unlimited upside (millions of views, if it goes viral).

In a convex situation, you win by repeatedly exposing yourself to randomness, because randomness is more likely to benefit you than harm you. Take many shots: you have little to lose and everything to gain. For this reason, convexity is also called antifragility: not only are you not harmed by randomness, you actively benefit from it.
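
A minimal sketch of this "take many shots" logic (the cost and payoff numbers are made up purely for illustration): each shot has a small, capped cost and a rare, outsized payoff, so over many shots the randomness works in your favor.

```python
import random

random.seed(1)

def convex_bet():
    """One 'shot': a fixed small cost, plus a small chance of a large payoff."""
    cost = 1.0                                          # capped downside
    payoff = 100.0 if random.random() < 0.05 else 0.0   # rare, outsized upside
    return payoff - cost

# Expected value per shot is 0.05 * 100 - 1 = +4, so many shots pay off,
# even though most individual shots are small losses.
results = [convex_bet() for _ in range(10_000)]
print(sum(results))
```

Note the shape of the distribution: you lose a little most of the time and win big occasionally, and the worst possible shot costs only 1 - that capped downside is what makes repetition safe.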

Learn more:

Back to the list

Ergodicity

Ergodicity is a concept that helps us differentiate between what happens to the group (ensemble probability) and what happens to an individual (time probability). On average, the stock market can yield 8% annual returns - that's what happens to the group. But an individual could go bankrupt, getting -100%.

In ergodic systems, all possible states are visited. So if an unlimited number of people participated in the lottery, each with $1, we could expect there to be some who visit the minimum (losing $1), some who visit the maximum (jackpot) and everything in between. Here it is fine to think of ensemble probability; if the group is big enough, we can expect the group to visit all possible states. If something is possible, it will happen, given enough exposure.

Unfortunately, the lottery and the stock market are non-ergodic for the individual. Once an individual goes broke, they can't play anymore; they won't visit every possible state. Startups as a whole may be ergodic (some unicorns, many more busts), but for the individual, startups are non-ergodic: you can found 20 and none of them may succeed.

Don't confuse ergodic and non-ergodic systems with each other, and don't assume that every individual will get the group's average return. Everything can look great on average yet be disastrous for the individual.
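
The group-vs-individual gap can be made concrete with a toy simulation (my own illustrative numbers, a variant of a well-known multiplicative coin-flip example): each round, wealth grows 50% on heads or shrinks 40% on tails. The ensemble average grows 1.05x per round, yet the typical individual is nearly wiped out.

```python
import random

random.seed(0)

def play(rounds):
    """One individual's wealth after repeated multiplicative bets:
    +50% on heads, -40% on tails."""
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

players = [play(100) for _ in range(10_000)]

# Group view: expected growth is 0.5*1.5 + 0.5*0.6 = 1.05x per round,
# so the ensemble average looks great...
print(sum(players) / len(players))

# ...but the typical (median) individual ends up near zero.
print(sorted(players)[len(players) // 2])
```

The average is propped up by a handful of extreme winners; the individual experience over time is decay. That is non-ergodicity in one picture.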

Jaakko Joutsenmeri

Hope you find Mind Expander useful - I'm constantly updating it. If you're interested in seeing how I connect and apply these concepts, you can read my blog posts.