While we can learn a lot from what successful people do in the mornings, as Nassim Taleb points out, we can also learn a lot from what failed people do before breakfast.
Inversion is one of the most powerful mental models in our arsenal. Not only does it help us innovate, it also helps us deal with uncertainty.
“It is in the nature of things,” says Charlie Munger, “that many hard problems are best solved when they are addressed backward.”
Sometimes we can’t articulate what we want. Sometimes we don’t know. Sometimes there is so much uncertainty that the best approach is to attempt to avoid certain outcomes rather than attempt to guide towards the ones we desire. In short, we don’t always know what we want, but we know what we don’t want.
Avoiding stupidity is often easier than seeking brilliance.
“For the Arab scholar and religious leader Ali Bin Abi-Taleb (no relation), keeping one’s distance from an ignorant person is equivalent to keeping company with a wise man.”
The “apophatic,” writes Nassim Taleb in Antifragile, “focuses on what cannot be said directly in words, from the Greek apophasis (saying no, or mentioning without meaning).”
The method began as an avoidance of direct description, leading to a focus on negative description, what is called in Latin via negativa, the negative way, after theological traditions, particularly in the Eastern Orthodox Church. Via negativa does not try to express what God is—leave that to the primitive brand of contemporary thinkers and philosophasters with scientistic tendencies. It just lists what God is not and proceeds by the process of elimination.
Statues are carved by subtraction.
Michelangelo was asked by the pope about the secret of his genius, particularly how he carved the statue of David, largely considered the masterpiece of all masterpieces. His answer was: “It’s simple. I just remove everything that is not David.”
Where Is the Charlatan?
Recall that the interventionista focuses on positive action—doing. Just like positive definitions, we saw that acts of commission are respected and glorified by our primitive minds and lead to, say, naive government interventions that end in disaster, followed by generalized complaints about naive government interventions, as these, it is now accepted, end in disaster, followed by more naive government interventions. Acts of omission, not doing something, are not considered acts and do not appear to be part of one’s mission.
[…]
I have used all my life a wonderfully simple heuristic: charlatans are recognizable in that they will give you positive advice, and only positive advice, exploiting our gullibility and sucker-proneness for recipes that hit you in a flash as just obvious, then evaporate later as you forget them. Just look at the “how to” books with, in their title, “Ten Steps for—” (fill in: enrichment, weight loss, making friends, innovation, getting elected, building muscles, finding a husband, running an orphanage, etc.).
We learn the most from the negative.
[I]n practice it is the negative that’s used by the pros, those selected by evolution: chess grandmasters usually win by not losing; people become rich by not going bust (particularly when others do); religions are mostly about interdicts; the learning of life is about what to avoid. You reduce most of your personal risks of accident thanks to a small number of measures.
Skill doesn’t always win.
In anything requiring a combination of skill and luck, the most skillful player doesn’t always win. That’s one of the key messages of Michael Mauboussin’s book The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. This is hard for us to swallow because we intuitively feel that success implies skill, for the same reason that we assume a good outcome implies a good decision. We can’t predict whether a person with skill will succeed, but Taleb argues that we can “pretty much predict” that a person without skill will eventually have their luck run out.
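To make the point concrete, here is a minimal sketch (not from Mauboussin’s book; the player skill values and the luck_weight parameter are invented purely for illustration) of a game whose outcome mixes skill with random luck. Even a clearly superior player wins far less than all of the time:

```python
import random

def win_rate_of_best(skills, luck_weight=0.6, trials=10_000):
    """Estimate how often the most skillful player wins when each
    outcome is a weighted mix of fixed skill and a random luck draw."""
    best = max(range(len(skills)), key=lambda i: skills[i])
    wins = 0
    for _ in range(trials):
        # Each player's score = skill component + independent luck component.
        scores = [(1 - luck_weight) * s + luck_weight * random.random()
                  for s in skills]
        if scores.index(max(scores)) == best:
            wins += 1
    return wins / trials

# Five players, skill ranging from 0.2 to 1.0.
# With a large luck component, the best player wins well under 100% of the time.
print(win_rate_of_best([0.2, 0.4, 0.6, 0.8, 1.0]))
```

Raising luck_weight toward 1 pushes the best player’s win rate toward pure chance; lowering it toward 0 lets skill dominate, which is exactly the skill-luck continuum the book describes.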
Subtractive Knowledge
Taleb argues that the greatest “and most robust contribution to knowledge consists in removing what we think is wrong—subtractive epistemology.” He continues that “we know a lot more about what is wrong than what is right.” What does not work, that is, negative knowledge, is more robust than positive knowledge. This is because it is far easier for something we “know” to be proven wrong than for something we know to be false to turn out to be true after all.
There is an entire book, The Half-Life of Facts, on how quickly what we consider to be ‘knowledge or fact’ decays. Basically, because our understanding of the world is partial and constantly evolving, we believe things that are not true. That’s not the only reason we believe things that are not true, but it’s a big one.
The thing is, we’re not that smart. If I have only ever seen white swans, saying “all swans are white” may be accurate given my limited view of the world, but I can never be sure there are no black swans until I have seen every swan.
Or as Taleb puts it: “since one small observation can disprove a statement, while millions can hardly confirm it, disconfirmation is more rigorous than confirmation.”
Most people attribute this philosophical argument to Karl Popper, but Taleb dug up evidence that it goes back to the “skeptical-empirical” medical schools of the post-classical era in the Eastern Mediterranean.
Being antifragile isn’t about what you do, but rather what you avoid. Avoid fragility. Avoid stupidity. Don’t be the sucker. Be like Darwin.