Learn about concepts and ideas you've probably never heard of.
Fruit of the poisonous tree is a legal metaphor used to describe evidence that is obtained illegally. The logic of the terminology is that if the source (the "tree") of the evidence or evidence itself is tainted, then anything gained (the "fruit") from it is tainted as well. Therefore, one may not need to refute the evidence, just prove that the evidence was gained illegally.
The term autopoiesis refers to a system capable of producing and maintaining itself by creating its own parts. A cell is autopoietic because it can maintain itself and create more cells. Imagine a 3D printer that could create its own parts and other 3D printers.
An autopoietic system is self-sufficient and thus very robust. Some people describe societies and communities as autopoietic; some even cite Bitcoin as an example of autopoiesis. But forget about following definitions too strictly for a second. If you apply the concept of autopoiesis liberally to your life, does it change how you shape your environment and lifestyle, your "personal system"? Does it help you become resistant to external circumstances?
A crucial part of human learning is understanding how to interact with the environment. Designers of objects can make this easier for us. An affordance is a property or feature of an object that hints at what we can do with it.
When you take advantage of affordances, people know what to do. No instructions needed. The less you need to explain, the better the design.
Fredkin's Paradox states that "The more equally attractive two alternatives seem, the harder it can be to choose between them—no matter that, to the same degree, the choice can only matter less."
If you had two really different options, it'd be easy to choose between them. But when they are similar, choosing becomes difficult. However, if the two options really are similar, it shouldn't matter much which you choose; just go with one. The impact of choosing one over the other will be so small that any more time spent analyzing is time wasted.
Because of Fredkin's Paradox, we spend the most time on the least important decisions.
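Fredkin's point can be sketched with toy numbers (all values below are made up for illustration):

```python
# Fredkin's Paradox in miniature: the closer two options are in value,
# the less the choice matters, yet the longer we tend to deliberate.

def max_regret(value_a: float, value_b: float) -> float:
    """The most you can lose by picking the 'wrong' option."""
    return abs(value_a - value_b)

# Two very different options: the choice is easy, and getting it wrong is costly.
print(max_regret(90, 20))  # 70

# Two near-identical options: the choice feels hard, but the stakes are tiny,
# so any extra analysis costs more than the decision is worth.
print(max_regret(51, 50))  # 1
```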
Before 1968, there were many techniques that high jumpers used to get over the bar, though with most, you'd jump forward and land on your feet. Then Richard Fosbury won the Olympic gold with his "Fosbury flop" technique which involves jumping backward and landing on your back. Ever since, the Fosbury flop technique has been the most widespread and effective technique in high jumping, and in fact, it's the technique you have probably seen on TV if you've watched high jumping.
A Fosbury Flop moment happens to an industry or area when a new innovation, approach or technique is introduced that then becomes the dominant option. These are breakthrough improvements that change the course of history.
When penguins want food of the fishy kind, they may need to jump off an iceberg into the water. The problem is that they can't see what lies beneath - a predator might be waiting for a group of penguins to fall straight into its belly. The group needs to decide whether to find food elsewhere or jump into the unknown.
This is where the "first penguin" (sometimes called the "courageous penguin") steps up. It takes one for the team and jumps off the edge, while the rest wait to see whether the water is clear. The first penguin takes a huge personal risk to ensure the survival of the group.
Entrepreneurs and risk-takers take a similar leap into the unknown, perhaps with even worse odds. While each risk-taker may not survive (for example, few startups succeed), the world is a better place because these risk-takers exist. Praise the first penguins, for we wouldn't be where we are without their sacrifice.
Paul Graham wrote on Twitter:
"Let's invent a concept.
Expected historical value: how important something would be if it happened times the probability it will happen.
The expected historical value of current fusion projects is so enormous. I don't understand why people aren't talking more about them."
This concept, though similar to the original concept of expected value, puts a spin on it. Expected historical value is a concept to assess the future importance of things today.
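Paul Graham's definition is just a multiplication. A minimal sketch (the project names and numbers are invented for illustration):

```python
def expected_historical_value(importance: float, probability: float) -> float:
    """How important something would be if it happened,
    times the probability it will happen."""
    return importance * probability

# A long shot with an enormous payoff can outrank a near-certain but minor thing.
fusion_project = expected_historical_value(importance=1000.0, probability=0.05)  # ~50
minor_upgrade = expected_historical_value(importance=10.0, probability=0.9)      # ~9
print(fusion_project > minor_upgrade)  # True
```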
What thing today could become "history" in the future? Sure, it may not get much attention now, but could it be recognized by the future as ground-breaking? What innovation, technology or idea, little discussed today, will be discussed a lot in the future?
There's a personal life spin on this as well: What experience or moment today could become a fond memory 50 years from now? It may not be the moments you'd expect (like a fancy trip) but an ordinary moment (like going for a midnight swim).
A confusopoly is an industry or category of products that is so complex that an accurate comparison between products is near-impossible.
If you try buying a mobile phone plan from a carrier, you face endless combinations of phone tiers, texting allowances and data limits. Or if there's a new video game launching, you could get an early-bird version, or the early-bird collector edition, or just the collector edition, or maybe a deluxe version that has some features of the other options but not all of them. If you try to buy insurance or banking/finance products, the options are so confusing that you're inclined to just choose one at random and stick with it for life.
My favorite example is toilet paper. Different brands have a different number of rolls in their packages, and they may have a different amount of paper in each roll, and maybe they'll have a value pack, but the normal pack may have a "get 1 roll free" promotion, but the other brand has a 20% discount, and then there's the matter of softness and thickness... In the end, you just give up and go with the most visually pleasing packaging.
An applause light statement is designed to gain the support or agreement of an audience, the same way an actual applause light or sign indicates to the studio audience that this is the part where you should start clapping.
Usually there isn't much substance behind an applause light statement, and you can spot this by trying to reverse the statement. Consider someone saying "we need to ensure technology benefits everyone, not just a few people". If you reverse it, you get "we need to ensure technology benefits a few people, not everyone". Since the reversed statement sounds odd and abnormal, the original statement is probably common knowledge or the conventional perspective and doesn't contain new information. The statement isn't designed to deliver new information or substance, just to gain support.
Too many applause light statements is a characteristic of empty suit business talk, the "inspirational speaker" and the charismatic politician. Very little substance, merely an attempt at influencing perception. Sometimes, though, applause light statements can be useful, like when you want to establish common ground or set the stage for an argument. But most of the time, these statements are used merely as the "easy way" to write a speech and influence an audience.
And yes, applause light statements are more common in speech than in writing. In writing, the reader has more time to process what you said and call you out if you have zero substance behind your words. In speeches, you don't have this luxury, and it's pretty hard to hold a negative opinion of a speech if everyone else around you was nodding and applauding the entire time.
Eliezer Yudkowsky, who coined the term, wrote a tongue-in-cheek speech using only applause light statements. It looks eerily similar to just about every speech I've ever heard:
"I am here to propose to you today that we need to balance the risks and opportunities of advanced artificial intelligence. We should avoid the risks and, insofar as it is possible, realize the opportunities. We should not needlessly confront entirely unnecessary dangers. To achieve these goals, we must plan wisely and rationally. We should not act in fear and panic, or give in to technophobia; but neither should we act in blind enthusiasm. We should respect the interests of all parties with a stake in the Singularity. We must try to ensure that the benefits of advanced technologies accrue to as many individuals as possible, rather than being restricted to a few. We must try to avoid, as much as possible, violent conflicts using these technologies; and we must prevent massive destructive capability from falling into the hands of individuals. We should think through these issues before, not after, it is too late to do anything about them..."
When something is more specific and more detailed, we tend to think it is more probable, even though specific things are necessarily less probable: the probability of two things being true together can never exceed the probability of either one alone. This is the conjunction fallacy - not understanding that specificity and probability are inversely related.
For example, consider your standard marketing prediction article - it will say something like "next year, more will be spent on digital advertising as big companies finally shift their TV budgets to online channels". This may sound logical, but, of course, it is more likely that companies will spend more on digital advertising next year for any reason than for that one specific reason.
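The probability logic can be made concrete. The numbers below are hypothetical, chosen only to illustrate the conjunction rule:

```python
# Conjunction rule: P(A and B) can never exceed P(A) alone.

p_spend_grows = 0.6            # P(A): digital ad spend grows next year, for any reason
p_tv_shift_given_growth = 0.3  # P(B|A): that growth comes specifically from TV budgets

p_both = p_spend_grows * p_tv_shift_given_growth  # P(A and B), roughly 0.18

# The detailed prediction is strictly less probable than the general one,
# even though the added detail makes it *sound* more believable.
print(p_both <= p_spend_grows)  # True
```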
Details place your mind and gut against each other: your gut wants to believe the narrative because it sounds more believable with all those details - you can follow the steps to the conclusion. But your mind should realize that the more specific the prediction or narrative, the less you should believe it.
Imagine someone - perhaps a billionaire or someone with a big Twitter following - explaining their success. The more specific the "blueprint" they claim is the reason for their success, the less probable it actually is that their success can be attributed to it. The secret formula for their success sounds more convincing as they add detail, but it also becomes less probable as the actual cause for their success. Kind of defeats the purpose of buying their "7-step framework" master course, no?
A Potemkin village is a country's attempt to signal that it's doing well, even though it's doing poorly.
Wikipedia: "The term comes from stories of a fake portable village built by Grigory Potemkin, former lover of Empress Catherine II, solely to impress the Empress during her journey to Crimea in 1787. While modern historians agree that accounts of this portable village are exaggerated, the original story was that Potemkin erected phony portable settlements along the banks of the Dnieper River in order to impress the Russian Empress; the structures would be disassembled after she passed, and re-assembled farther along her route to be viewed again as if another example."
Consider a more modern example: Turkmenistan's capital, Ashgabat, with its enormously expensive buildings, hotels, stadiums, indoor ferris wheels and 18-lane roads... that almost no one uses. Or consider how, ahead of the 1980 Moscow Olympics, the Soviet Union painted only the road-facing walls of houses along the route into the city. Why bother with the entire house when one wall creates the perception? Similar stories exist from newer Olympics; global events are a juicy opportunity for government signaling.
The term Potemkin village has been used more metaphorically to refer to any construct aimed at signaling you're doing well when you're not. For example, consider a creator trying to signal that their newsletter is doing well - they'll talk of "readers" and visitors to their website and other Potemkin metrics, but they won't talk of subscribers or money.
Imagine someone presents an argument to you. You have roughly two ways to interpret it: charitably, in its strongest form, or uncharitably, in its weakest.
Open-minded people generally interpret what they hear charitably while close-minded people stick to what they think is right and don't bother entertaining other arguments. An open-minded person asks "how could this be right?", a close-minded person asks "how is this wrong?"
Especially on Twitter, snarky repliers will interpret your tweet in the least favorable way possible, identify a weakness in your tweet (which is easy when you've already weakened the tweet with your interpretation), then feel smart when they point out that weakness to you. Of course, everyone else thinks them a fool.
One would seem much smarter by interpreting the tweet in the most charitable way possible, then identifying a weakness in it and expressing it clearly.
The rare enemy effect is a niche but interesting phenomenon in predator-prey interactions.
Consider the case of earthworms. Their number one enemy is the mole (a mole eats almost its body weight in worms each day). So worms have adapted to recognize an approaching mole by sensing the vibrations it makes as it digs through the soil. When they sense it, the worms escape to the surface, where mole encounters are unlikely.
However, other predators of worms have figured this out. For example, seagulls perform a "dance" to vibrate the ground, bringing unwitting worms up to the surface, ready for snacking. Wood turtles do a similar thing. Humans, in a process called "worm charming", drive a stick into the ground and vibrate it to summon the worms, often to collect them for fishing (though some do it for fun and sport).
The rare enemy effect happens when Predator A exploits the prey's anti-predator response to Predator B (the main predator). It must be that Predator A is a rather rare encounter to the prey (hence the name of the effect); if they were a main predator, the prey would eventually develop an anti-predator response to them, too.
If you look closely, you can see a similar effect in the human realm, too: scammers, for example, pose as your bank's fraud department, exploiting the very vigilance you've developed against fraud.
And do not think that these processes are limited to capturing your money; they want your support, information, attention and time, too. If there's a predictable anti-predator response, there's someone who exploits that response, legally or illegally.
A shibboleth is a word or a phrase that distinguishes one group of people from another.
Wikipedia has many examples, some of them rather funny.
Many professions have shibboleths as well, distinguishing those who are more experienced or knowledgeable from those who are not. These may not be used in a password-like manner, but rather like this: "if they use certain words or concepts correctly, it's safe to assume they aren't new to this field".
Relevance theory explains why a word or a phrase can convey much more information than its literal version. For example, imagine you own a really bad car - it breaks down often, looks ugly and it can barely hold itself together. You swear you'd demolish it yourself if only you had money to buy a new one. Your neighbor asks what you think of your car and you say "Well, it's no Ferrari". What you communicate is very little, but they understand the message.
The reason we can convey more than we say is self-assembly. When something is being communicated to you, you add relevant information from the context to that message (who said it to you, how they said it, who else is there, what has been said before, previous experiences...) until you arrive at the final message. So only a small part of the intended message needs to actually be said explicitly, everything else can happen due to self-assembly. Just think of an inside joke - one word can be enough to convey an entire story.
Once you understand this effect, you have a framework to understand different types of communication.
For example, a good joke gives away a strong enough connection that you understand what is meant, but the connection must be weak enough so as not to ruin the punchline. A joke where the connection is too strong isn't funny because you can guess the punchline; there is very little self-assembly required of the receiver.
A good aphorism (or good advice) gives away a practical enough idea that you can apply it to your own experiences, but it must be abstract enough that you come up with the application yourself. Consider very wise advice: it's never straightforward "do-exactly-like-this" communication, you usually need to decipher the message, to construct your own meaning of it. It's more impactful to make you realize what to do yourself than to tell you exactly what to do.
In advertising, there is an incentive to communicate the most with the least information or time. The bigger the portion of the message that can be outsourced to the context, the better for the advertiser, since the receiver then does more self-assembly in their head. This leads to a sort of mental IKEA effect: people enjoy your message more when it isn't pushed into their heads but they themselves are co-creators of it.
Generally, the smaller the ratio of what is communicated to what is understood, the better the aphorism, advice, joke, tagline or ad.
You're engaging in category thinking when, instead of trying to solve one problem, you try to solve a whole category of problems.
For example, in computer security, you can look at risks within individual systems, or you can look at risks within an entire category of systems, for example, every system that uses Log4j. You can consider what happens if one autonomous vehicle got hacked, or you can consider what happens if all of them were hacked simultaneously because of a security issue affecting every autonomous vehicle.
In your personal life, you could tackle all of your health problems individually, maybe buying medicine for one problem and changing your diet to tackle another problem. Or you could address the entire category by minimizing evolutionary mismatch, to live more like humans have evolved to live.
Category thinking requires a mindset shift, to abstract one level upwards. You need a different set of solutions for an individual problem vs the whole category; it's a different beast altogether to help one homeless person than to help all of them.
Evolutionary traits generally change slower than the environment. Therefore, the traits that were once helpful can now be harmful because the environment is different. When this is the case, we talk of evolutionary mismatch.
While evolutionary mismatch can and does happen in animals and plants, humans are a prime example because our environment changes so rapidly.
It is important to understand that we essentially have a hunter-gatherer brain and body in a modern society, so there is huge (and rapidly increasing) mismatch. Obesity, most health issues, most mental health issues, addiction, lack of meaning, loneliness... many negative matters of the body or brain can be due to evolutionary mismatch.
This is not to say that everything was perfect when we lived in huts and that everything is wrong now, but that we evolved in the former environment and therefore it is only logical that issues emerge when we live in environments we didn't evolve to live in. That's the idea behind evolutionary mismatch.
So if you have issues, that's not necessarily your fault. It's like blaming a penguin for not thriving in a desert.
Consider the findings of Tinbergen, the researcher who coined the term supernormal stimulus: he found that birds would prefer to sit on exaggerated artificial eggs - larger and more vividly patterned than their own - abandoning their real eggs in the process.
Supernormal stimuli are exaggerated versions of the things we evolved to desire. They hijack the weak spots in our brains by supplying so much of what we desire that we no longer want the boring, real thing. Thus, we act in evolutionarily harmful ways - much like the bird that abandons its eggs. When you can eat pizza, who wants broccoli? When you can have 500 porn stars, who wants a "normal" mate? Why bother with real life when the game world offers much more fun for much less effort?
While there are physical superstimuli, like drugs and junk food, we must be especially careful with digital superstimuli, because there is virtually no limit to how much stimulation can be pumped into you. A pizza can contain only so much fat and salt and other stuff that makes your brain go brrr... But there's no limit to how stimulating a video, game or virtual world can be. Nature has set no ceilings; it has only determined what we are stimulated by, so we always want more, at any cost.
We're already approaching a point with VR porn and "teledildonics" where one may rather mate with the technology than with a real person. Some would much rather move a video game character than their real bodies. At some point, we may create artificial children (perhaps in the metaverse) who are simply much cuter than real babies, and much less trouble. As fewer and fewer people have babies - intentionally or unintentionally - we realize that we aren't much smarter than the birds sitting on huge blue balls instead of their own eggs. As Yudkowsky writes, superstimuli may be the thing that leads to human extinction.
Convergent evolution occurs when different species independently evolve similar features. For example, bats, flies and many birds have each evolved wings and the ability to fly independently of one another.
Much the same way, human civilizations that had never been in contact with each other independently invented writing and the idea of stacking big blocks on top of each other in pyramid fashion. (No, there's no conspiracy - just convergent evolution.)
If you see the same pattern in independent groups, there's probably a reason for that pattern. If most flying animals developed wings, that's a good sign that wings are one of the best structures for flying. If most swimming animals like fish and dolphins have a similar, streamlined shape, that's probably the shape that works best in water.
Similarly, if you see a pattern repeating itself across disciplines, that's a sign that there's something to that pattern. Consider this tweet from Naval:
Convergent evolution is a validation mechanism.
A compulsion loop is a sequence of activities that makes you continue the sequence of activities. The loop rewards you for every completion with a bit of dopamine, keeping you hooked.
The point of most games (and gamified systems) is to keep you on this loop for as long as possible, before you realize what they are doing.
While this kind of loop is most easily noticeable in games, you can see a similar loop underlying nearly all compulsive behavior that isn't due to physical matters (like drugs or illnesses).
What loop are you on?
Kayfabe is a term most closely associated with professional wrestling. It's a code word for fakery: an agreement among insiders not to admit to outsiders that it's fake.
Eric Weinstein has popularized the term in intellectual spheres, comparing politics to pro wrestling in the sense that maintaining perceptions seems to be the main objective. For example, consider a President who wants to get re-elected or win back public approval. They want to highlight all the good they've done for the country. The President could claim responsibility for something positive (like economic growth) even when they've had no part in it, or they could outright lie that the country or economy is doing well, even when it isn't. Reality doesn't matter; perception does.
One of the reasons kayfabe exists is our desire to believe the fakery. When we expect too much of reality, we create demand for lies; it simply becomes too difficult to meet our expectations in reality, so they are met merely in perception. We don't care about the difference because we are desperate to get our expectations met. The average citizen wants their country to succeed, and the average politician wants to stay on the citizens' good side - so even if the country is doing poorly, it's preferred for both parties to perpetuate a common fantasy.
The ostrich effect happens when you deliberately try to avoid negative information or thoughts. A form of "if I don't think about it, it can't hurt me" thinking.
Wikipedia says that "the name comes from the common (but false) legend that ostriches bury their heads in the sand to avoid danger."
A floodgate effect occurs when you permit a small thing, which then leads to the permission of many more things.
The name comes from the way a floodgate works: if you open the gate even a little bit, water will flow in until both sides of the gate are in equilibrium. Even the tiniest permission will lead to total permission.
Examples of the floodgate effect: a company grants one employee an exception to a policy, and soon everyone demands the same; a court allows one claim of a new kind, and a wave of similar lawsuits follows.
As these examples show, a small permission creates a precedent, which is then exploited repeatedly, leading to a flood of such events.
Bothsidesism occurs when the media presents an issue as being more both-sided, contested or balanced than it really is. For example, there is overwhelming evidence that climate change is real, even though some (very few) scientists disagree. By giving equal voice to scientists on both sides, the media creates an illusion that the issue is contested or a matter of perspective, even though it isn't.
Similarly, some media outlets may blow the minority opinion out of proportion, making it seem like there are two equally strong sides, even though this isn't the case.
Funnily enough, bothsidesism is a bias that occurs when the media tries to minimize bias (by presenting both sides of the argument). The average media consumer may feel more informed (since the issue seems more nuanced and complicated), but actually they are misinformed.
We are often told that it is good practice to include both sides of the argument and consider all perspectives. However, in some issues, it's better to not give equal weight to both sides or to "come to a compromise". One can consider all arguments, without giving equal importance or merit to each.
The paperclip maximizer is a thought experiment that goes something like this:
"Imagine you task an AI with creating as many paperclips as possible. It could decide to turn all matter in the universe into paperclips, including humans. Since an AI may not value human life, and it only cares about maximizing paperclips, the risk to humans is great."
The paperclip maximizer is an example of instrumental convergence: the tendency of an AI pursuing almost any goal, however harmless, to converge on subgoals (such as acquiring resources or resisting shutdown) that can be harmful from the human point of view. Harnessing a true AI to simple, seemingly innocent tasks could pose existential risk to humans if we don't know how to safely make a machine value human life.
An incentive is perverse if it leads to unintended and undesirable results. The solution (incentive) makes the problem worse.
The Cobra Effect is the most famous example. From Wikipedia:
"The British government, concerned about the number of venomous cobras in Delhi, offered a bounty for every dead cobra. Initially, this was a successful strategy; large numbers of snakes were killed for the reward. Eventually, however, enterprising people began to breed cobras for the income. When the government became aware of this, the reward program was scrapped. When cobra breeders set their now-worthless snakes free, the wild cobra population further increased."
People will seek to satisfy the measurable target to get the incentive, regardless of the method. Thus, a poorly designed incentive can produce the opposite of the intended effect. There are various examples in sales representatives' bonuses, journalism, alcohol and drug prohibition, and all kinds of buyback programs...
Redundancy protects against uncertainty and shocks. Unsurprisingly, then, you'll see redundancy all over biology: the organisms we observe today are the ones that survived uncertainty and shocks. Here are just a few ways nature creates redundancy:
We speak of degeneracy when two parts perform different functions under normal conditions but can perform the same function when needed. Degeneracy is thus nature's way of ensuring that the failure of one part doesn't doom the entire entity.
We have an extra pair of some organs, which is another form of redundancy: if we lose one eye, we still survive. While degeneracy refers to parts that perform different functions under normal conditions, an extra pair performs the same function all the time.
Functional equivalence is like an extra pair, but at an ecological scale: multiple species share the same function, so the loss or weakening of one species won't jeopardize the entire ecosystem.
I find it telling that nature has not just one, but multiple levels of redundancy. If we are to survive long-term, we ought to add redundancy to our lives, to copy nature (biomimicry is the term).
Biological bet hedging is nature's method of diversification: organisms spread risk across their offspring, as when a plant produces seeds that germinate in different years rather than all at once.
The purpose of bet hedging isn't to maximize short-term success, but long-term survival. If an individual doesn't hedge their bets, one unexpected and disastrous event (black swan event) could wipe out all of its offspring, potentially ruining chances of survival for that individual's genes.
If you're into investing, you probably see parallels here.
Someone exhibits "crab mentality" if they try to drag people down who are doing better than themselves. A form of "If I can't have it, neither should you" thinking.
Wikipedia says that "the metaphor is derived from a pattern of behavior noted in crabs when they are trapped in a bucket. While any one crab could easily escape, its efforts will be undermined by others, ensuring the group's collective demise."
For example, if one person from the family succeeds, the other siblings may become envious and try to undermine their success. A similar effect may occur in friend groups, for example if one friend gets a new job and starts earning twice as much as the other friends.
More lethal examples come to mind as well. Consider this all-too-common story: a man and a woman break up. The woman commences a new relationship. The man, now bitter, kills his ex-girlfriend because "if I can't have her, no one can".
"Cargo cults are religious practices that have appeared in many traditional tribal societies in the wake of interaction with technologically advanced cultures. They focus on obtaining the material wealth (the "cargo") of the advanced culture by imitating the actions they believe cause the appearance of cargo: by building landing strips, mock aircraft, mock radios, and the like." (From Wikipedia)
While this may sound ridiculous, we do the same thing. We imitate the morning routines of a billionaire or the writing process of the nearest writing guru, thinking that will bring us the same results. We read the books Warren Buffett recommends, hoping we'll become like him in the process. Or we rename the project manager a "scrum master" and the weekly meeting a "sprint", thinking that will give us the results of agile project management.
Whenever someone copies the obvious, surface-level behaviors but not the deeper behaviors that actually produce the results, you can describe the situation with the cargo cult metaphor.
Zugzwang is a situation in board games where you must make a move, and any move you can make will worsen your position. For example, in checkers, you must capture the opponent's piece when you have an opportunity to do so, and a clever opponent can thus lay a trap that you must fall into.
Situations in real life (outside of board games) resemble a zugzwang when anything you do will make the situation worse, and you don't have a choice not to make a choice.
For example, suppose you've lied to your employer that you know Excel in order to get the job. Now they want you to complete a long Excel task within the week. Either you tell them you've lied all along, or you learn Excel and work day and night to complete the task on time - probably still producing sub-standard work. You're in zugzwang.
Victim playing occurs when you try to make yourself look like a victim, perhaps to gain sympathy, avoid hard work, manipulate others or protect your self-esteem. Usually you do this by exaggerating a mild obstacle or by faking an obstacle altogether.
There's huge support for sentiments like "life is hard" and "adulting is hard" because it's easier to play the victim than to do something about the problem. It's easier to say "I'd love to catch up but I'm sooo busy with work" than to arrange your affairs so that you'd have time to meet.
It seems the pressure to succeed is so strong that some people secretly hope for a health issue or an injustice to explain why they aren't meeting those unrealistic expectations. In the absence of a "better" reason, they'll procrastinate so that at least they'll be a victim of time - they couldn't possibly have succeeded, not with that schedule.
Victim playing is, in many ways, the default option because very few people call you out on it. You're the "victim", after all; not many dare question that, lest they be seen as victim-blamers. Thus, victim mentality (1) becomes more ingrained in a person, because it is a strategy that consistently works, and (2) becomes more accepted in society, infecting more people, because you can't call people out on it. Beware the lure of victim mentality.
Did you know you can read people's minds? This mind-reading ability, otherwise known as theory of mind, is something we develop at around ages 1-5 (depending on your definition).
Imagine you, a friend and a 2-year-old are eating Pringles. The friend leaves for another room. You empty all of the Pringles from the tube into the trash and fill the tube with pencils. Once the operation is done, your friend comes back to the room. You ask the 2-year-old: "What does my friend think is in the tube?"
A child without theory of mind would say "pencils". A child with theory of mind would say "Pringles", knowing that your friend believes so even if reality is different.
A complete theory of mind is important for empathy, anticipating other people's reactions, understanding beliefs, communication and so much more. If you think about it, you're constantly mind-reading people, even without being aware of it.
There are various mind-reading tasks we engage in, and we learn them in a fairly predictable order as we grow up.
Vestigial structures are those that used to serve a purpose but, through the process of evolution, no longer do. For example, ostriches and emus have wings, but the wings are no longer used for flying; they are vestigial. So are human wisdom teeth.
One might think of vestigiality in matters beyond anatomy: in habits, processes, or rules that have outlived their original purpose.
I know I'm stretching the boundaries of the concept, but I think the idea of vestigiality is useful when applied liberally: "In your life or business, what's something that used to serve a purpose but no longer does?"
Hormesis, in biological processes, is a phenomenon where a low dose of x produces the opposite effect of a high dose of x.
For example, low doses of stress energize you and increase your performance. Very high doses of stress paralyze you and decrease performance. Low stress on your muscles (working out 3 times a week) strengthens your muscles, while very high stress (working out 3 times a day) leads to injuries.
One may apply the same idea outside of biological processes: a low dose of your hobby (5h a week) may make you happy, while turning that hobby into a job (40h a week) may make you miserable. Seeing your friends once a week produces a different result than living with them 24/7.
A related term, the dose-response relationship, describes how something harmless can become harmful given a large enough dose or exposure. To be more precise, there aren't necessarily "harmful" and "harmless" things, just harmful and harmless doses of things. It is the dose or exposure that matters.
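The biphasic shape behind hormesis and dose-response can be sketched with a toy function: a stimulating term that fades at high doses, minus a toxic term that keeps growing. The functional form and constants below are purely illustrative, not an empirical model of any real stressor.

```python
import math

def net_effect(dose: float) -> float:
    """Toy hormetic (biphasic) dose-response curve.

    The stimulation term dominates at low doses but fades
    exponentially; the toxicity term keeps accumulating, so
    it dominates at high doses. The constants are made up.
    """
    stimulation = dose * math.exp(-dose)  # helps a little, then fades
    toxicity = 0.05 * dose                # grows linearly with dose
    return stimulation - toxicity

print(f"no dose (0.0):   {net_effect(0.0):+.3f}")  # zero: no effect
print(f"low dose (1.0):  {net_effect(1.0):+.3f}")  # positive: beneficial
print(f"high dose (5.0): {net_effect(5.0):+.3f}")  # negative: harmful
```

The exact crossover point depends entirely on the invented constants; the point is only the shape - positive at low doses, negative at high ones.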
Consider this classic example:
"I have reconsidered the matter, you have changed your mind, he has gone back on his word."
The core idea behind each of these statements is the same, but the emotional connotations of the specific word choice differ.
With Russell Conjugation, you can manipulate someone's opinion without falsifying facts; you can steer their perspective while still remaining "objective". Once you understand this concept, you see the news doing it all the time.
Russell conjugation (think of it as emotional loading) can exist because different words or phrases can factually refer to the same thing yet differ in emotional charge.
When you know marketing, you understand that most marketing advice online is bullshit. When you're an expert in statistics, you see how the news gets the numbers twisted. If you're a professional football coach, it pains you to hear people at the bar talk about football - such are the inaccuracies in their discussion.
Then you read the next thing in the newspaper, or on Reddit or Twitter - a topic you're not an expert in - and you blindly accept the information as fact.
So Gell-Mann amnesia is the tendency to forget how unreliable the news - or any information source - is. We're reminded of the inaccuracy whenever a topic we know a lot about comes up, yet we're not critical enough when facing a less familiar topic.
The Overton window describes the range of topics or opinions that are acceptable to talk about in public.
What we deem "controversial" or "extreme" isn't actually extreme; it's just one end of the Overton window. Truly extreme opinions aren't shared publicly, for fear of social condemnation.
If you understand the Overton window, you understand that there exist opinions and policies that aren't talked about in the mainstream. What you hear isn't all there is.
The term originally concerned public policy but it's applicable in all matters of free speech.
If there is low or limited downside and high or unlimited upside, the situation is convex.
For example, suppose you're looking for a job. If you send a cold email to a company asking for employment, there is an asymmetry in outcomes: low downside (you risk maybe an hour and a bit of ego) but high upside (a job).
If you promote your drawing on Reddit, you have limited downside (maybe 5 minutes of creating the post) but virtually unlimited upside (millions of views, if it goes viral).
In a convex situation, you win by repeatedly exposing yourself to randomness because any randomness is more likely to benefit you than harm you. Take many shots because you have little to lose but everything to gain. For this reason, convexity is also called antifragility: not only are you not harmed by randomness, you are actively benefiting from it.
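The "take many shots" logic can be made concrete with a toy simulation. The cost, payoff, and hit probability below are invented numbers chosen only to make the asymmetry visible, not estimates of any real opportunity.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

COST = 1.0      # capped downside per attempt (an hour, a bit of ego)
PAYOFF = 100.0  # large upside on the rare hit
P_HIT = 0.03    # most attempts quietly fail

# Each shot always costs COST; occasionally it lands and pays PAYOFF.
results = [(PAYOFF if random.random() < P_HIT else 0.0) - COST
           for _ in range(5_000)]

total = sum(results)
worst = min(results)  # no single attempt can lose more than COST

print(f"net result after 5,000 attempts: {total:+.0f}")
print(f"worst single attempt: {worst:+.1f}")
```

Because the downside is capped at COST while the upside is PAYOFF, the running total drifts upward as attempts accumulate, even though roughly 97% of individual shots lose money.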
Ergodicity is a concept that helps us differentiate between what happens to the group (ensemble probability) and what happens to an individual over time (time probability). On average, the stock market can yield 8% annual returns - that's what happens to the group. But an individual investor can go bankrupt, a -100% return - that's what happens to the individual.
In ergodic systems, all the possible states are visited. So if an unlimited number of people participated in the lottery, each with $1, we could expect there to be some who visit the minimum (losing $1), some who visit the maximum (jackpot) and everything in between. Here it is fine to think of ensemble probability; if the group is big enough, we can expect the group to visit all possible states. If something is possible, it will happen, given enough exposure.
Unfortunately, the lottery and stock market are non-ergodic for the individual. Once an individual goes broke, they can’t play anymore. They won’t visit every possible state.
Don't confuse ergodic and non-ergodic systems, and don't assume every individual will get the group's average return. It is possible for everything to look great on average yet be disastrous for the individual.
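The gap between the ensemble average and the typical individual shows up in a toy multiplicative bet (the multipliers and player count below are invented for illustration): on average the bet looks profitable, yet the median player ends up poorer.

```python
import random

random.seed(42)  # fixed seed for reproducibility

UP, DOWN = 1.5, 0.6  # wealth multipliers per round (made-up bet)
ROUNDS, PLAYERS = 20, 100_000

def play(rounds: int) -> float:
    """One player's final wealth after repeatedly betting everything."""
    wealth = 1.0
    for _ in range(rounds):
        wealth *= UP if random.random() < 0.5 else DOWN
    return wealth

final = sorted(play(ROUNDS) for _ in range(PLAYERS))
ensemble_avg = sum(final) / PLAYERS  # pulled up by a few huge winners
median = final[PLAYERS // 2]         # what the typical player ends with

print(f"ensemble average: {ensemble_avg:.2f}")  # above 1: group "profits"
print(f"median player:    {median:.3f}")        # below 1: individual loses
```

The average round multiplies wealth by (1.5 + 0.6) / 2 = 1.05, so the ensemble grows; but a typical player alternates wins and losses, and 1.5 × 0.6 = 0.9 < 1, so median wealth shrinks. The group statistic hides the individual's fate.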
I've collected all 50+ concepts into one list, so it's easier to read through them and learn.
Feel free to check the premium version out (or continue using the free tool above, either way!)
Every time you refresh the page (or press the button), a code snippet randomizes which concept you see. Unfortunately, it has no memory, so it can show you a concept you've already seen.
I'm adding more concepts constantly, so there will be a higher probability you'll see concepts you haven't before. I'm also trying to create a better randomizer (one with memory).
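One common way to build a "randomizer with memory" is to shuffle the whole list once and deal concepts out until the deck runs dry, then reshuffle. Here is a minimal sketch in Python (the concept names are placeholders, and a real page would port this to the site's JavaScript and persist the queue, e.g. in localStorage):

```python
import random

class ConceptPicker:
    """Deal concepts from a shuffled deck so nothing repeats
    until every concept has been shown once."""

    def __init__(self, concepts):
        self.concepts = list(concepts)
        self.queue = []  # concepts not yet shown in this pass

    def next(self):
        if not self.queue:  # deck exhausted: start a fresh pass
            self.queue = self.concepts[:]
            random.shuffle(self.queue)
        return self.queue.pop()

picker = ConceptPicker(["Zugzwang", "Hormesis", "Ergodicity", "Convexity"])
seen = [picker.next() for _ in range(4)]  # one full pass, no repeats
```

One caveat: right at a reshuffle boundary, the last concept of one pass can coincide with the first of the next, so a single back-to-back repeat is still possible.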