Kindle Notes & Highlights
Read between February 19 – March 11, 2024
Past events will always look less random than they were (it is called the hindsight bias).
Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance.
that which came with the help of luck could be taken away by luck (and often rapidly and unexpectedly at that). The flipside, which deserves to be considered as well (in fact it is even more of our concern), is that things that come with little help from luck are more resistant to randomness.
it does not matter how frequently something succeeds if failure is too costly to bear.
Mild success can be explainable by skills and labor. Wild success is attributable to variance.
The idea of taking into account both the observed and unobserved possible outcomes sounds like lunacy. For most people, probability is about what may happen in the future, not events in the observed past; an event that has already taken place has 100% probability, i.e., certainty.
I start with the platitude that one cannot judge a performance in any given field (war, politics, medicine, investments) by the results, but by the costs of the alternative (i.e., if history played out in a different way). Such substitute courses of events are called alternative histories.
the quality of a decision cannot be solely judged based on its outcome, but such a point seems to be voiced only by people who fail (those who succeed attribute their success to the quality of their decision).
Reality is far more vicious than Russian roulette. First, it delivers the fatal bullet rather infrequently, like a revolver that would have hundreds, even thousands, of chambers instead of six. After a few dozen tries, one forgets about the existence of a bullet, under a numbing false sense of security.
Second, unlike a well-defined, precise game like Russian roulette, where the risks are visible to anyone capable of multiplying and dividing by six, one does not observe the barrel of reality. Very rarely is the generator visible to the naked eye. One is thus capable of unwittingly playing Russian roulette—and calling it by some alternative “low risk” name.
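The many-chambered revolver image can be made concrete with a little arithmetic. The sketch below is illustrative and not from the book: it assumes a hypothetical 1-in-1,000 fatal chamber and shows how the survival probability decays as plays accumulate, even though each single pull feels safe.

```python
# Illustrative sketch (the chamber count is a hypothetical, not from the
# text): a revolver with 1,000 chambers feels safe on any single pull,
# but the risk compounds over repeated plays.
def survival_probability(chambers: int, plays: int) -> float:
    """Chance of surviving `plays` independent pulls of the trigger."""
    return (1 - 1 / chambers) ** plays

for plays in (1, 100, 1000):
    p = survival_probability(1000, plays)
    print(f"{plays:5d} plays -> {p:.3f} chance of survival")
```

After a thousand pulls of a thousand-chambered revolver, survival is down to roughly one chance in three — the "numbing false sense of security" in numbers.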
Finally, there is an ingratitude factor in warning people about something abstract (by definition anything that did not happen is abstract).
Heroes are heroes because they are heroic in behavior, not because they won or lost.
As a derivatives trader I noticed that people do not like to insure against something abstract; the risk that merits their attention is always something vivid.
a fact that our brain tends to go for superficial clues when it comes to risk and probability, these clues being largely determined by what emotions they elicit or the ease with which they come to mind.
In addition to such problems with the perception of risk, it is also a scientific fact, and a shocking one, that both risk detection and risk avoidance are not mediated in the “thinking” part of the brain but largely in the emotional one (the “risk as feelings” theory).
It means that rational thinking has little, very little, to do with risk avoidance. Much of what rational thinking seems to do is rationalize one...
I remind myself of Einstein’s remark that common sense is nothing but a collection of misconceptions acquired by age eighteen. Furthermore, what sounds intelligent in a conversation or a meeting, or, particularly, in the media, is suspicious.
A mistake is not something to be determined after the fact, but in the light of the information available up to that point.
A more vicious effect of such hindsight bias is that those who are very good at predicting the past will think of themselves as good at predicting the future, and feel confident about their ability to do so.
This is why events like those of September 11, 2001, never teach us that we live in a world where important events are not predictable—even the Twin Towers’ col...
ergodicity. It means, roughly, that (under certain conditions) very long sample paths would end up resembling each other.
Those who were unlucky in life in spite of their skills would eventually rise. The lucky fool might have benefited from some luck in life; over the longer run he would slowly converge to the state of a less-lucky idiot. Each one would revert to his long-term properties.
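The reversion described here can be sketched as a toy simulation (all numbers are my own illustrative assumptions, not the book's): two players make repeated ±1 bets, one with a small positive edge — skill — and one with a small negative edge. Over a short run either can be ahead; over a long sample path each converges to his long-term property.

```python
import random

random.seed(42)  # reproducible illustration

def average_outcome(edge: float, rounds: int) -> float:
    """Mean payoff per round of a +/-1 bet won with probability 0.5 + edge."""
    wins = sum(random.random() < 0.5 + edge for _ in range(rounds))
    return (2 * wins - rounds) / rounds

# A skilled player (positive edge) vs. a lucky fool (negative edge):
# over 10 rounds luck dominates; over 100,000 rounds each average
# converges to the player's true edge (about +0.10 and -0.10 here).
for rounds in (10, 100_000):
    print(rounds,
          round(average_outcome(0.05, rounds), 3),
          round(average_outcome(-0.05, rounds), 3))
```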
Science is method and rigor; it can be identified in the simplest of prose writing. For instance, what struck me while reading Charles Darwin’s On the Origin of Species is that, although the text does not exhibit a single equation, it seems as if it were translated from the language of mathematics.
Modern life seems to invite us to do the exact opposite; become extremely realistic and intellectual when it comes to such matters as religion and personal behavior, yet as irrational as possible when it comes to matters ruled by randomness (say, portfolio or real estate investments).
firehouse effect. He had observed that firemen with much downtime who talk to each other for too long come to agree on many things that an outside, impartial observer would find ludicrous (they develop political ideas that are very similar).
cross-sectional problem: At a given time in the market, the most successful traders are likely to be those that are best fit to the latest cycle. This does not happen too often with dentists or pianists—because these professions are more immune to randomness.
Asymmetric odds means that probabilities are not 50% for each event, but that the probability on one side is higher than the probability on the other. Asymmetric outcomes mean that the payoffs are not equal.
Accordingly, it is not how likely an event is to happen that matters, it is how much is made when it happens that should be the consideration. How frequently the profit occurs is irrelevant; it is the magnitude of the outcome that counts.
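The asymmetry point reduces to the expected-value formula: expectation is probability times payoff, summed over scenarios. A minimal sketch, with hypothetical figures of my own:

```python
# Hypothetical figures illustrating asymmetric payoffs: a bet can be
# likely to win and still be a losing proposition, because expectation
# multiplies probability by magnitude.
def expected_value(scenarios) -> float:
    """Sum of probability * payoff over all (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in scenarios)

# 70% chance of a small gain, 30% chance of a ten-times-larger loss.
bullish_but_short = [(0.70, +1.0), (0.30, -10.0)]
print(expected_value(bullish_but_short))  # negative despite the 70% odds
```

The event is likely, yet the bet loses on average — the frequency of profit tells you nothing without the magnitudes.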
No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion.
One cannot infer much from a single experiment in a random environment—an experiment needs a repeatability showing some causal component.
surviving traders I know seem to have done the same. They trade on ideas based on some observation (that includes past history) but, like the Popperian scientists, they make sure that the costs of being wrong are limited (and their probability is not derived from past data).
The major problem with inference in general is that those whose profession is to derive conclusions from data often fall into the trap faster and more confidently than others. The more data we have, the more likely we are to drown in it.
we tend to mistake one realization among all possible random histories as the most representative one, forgetting that there may be others. In a nutshell, the survivorship bias implies that the highest performing realization will be the most visible. Why? Because the losers do not show up.
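Survivorship bias is easy to reproduce with a coin-flip simulation (the trader count and horizon below are hypothetical): give many zero-skill traders a fair coin, and by chance alone a handful post a perfect record — and those are the only ones who show up.

```python
import random

random.seed(0)  # reproducible illustration

def surviving_winners(n_traders: int, years: int) -> int:
    """Count zero-skill traders who, by pure luck, win every single year."""
    survivors = 0
    for _ in range(n_traders):
        if all(random.random() < 0.5 for _ in range(years)):
            survivors += 1
    return survivors

# Roughly n_traders / 2**years pure-luck "stars" are expected to emerge,
# i.e. about 10 out of 10,000 here.
print(surviving_winners(10_000, 10))
```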
Remember that nobody accepts randomness in his own success, only his failure.
Chaos theory concerns itself primarily with functions in which a small input can lead to a disproportionate response.
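A standard illustration of that sensitivity (my example, not the book's) is the logistic map at r = 4: two inputs differing by one part in a billion produce trajectories that bear no resemblance after a few dozen iterations.

```python
# The logistic map x -> r*x*(1-x) at r = 4, a textbook chaotic system:
# a tiny change in the input leads to a disproportionate response.
def iterate_logistic(x: float, steps: int, r: float = 4.0) -> float:
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a, b = 0.2, 0.2 + 1e-9  # inputs differing by one part in a billion
gap = abs(iterate_logistic(a, 50) - iterate_logistic(b, 50))
print(gap)  # typically vastly larger than the initial 1e-9 difference
```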
A normative science (clearly a self-contradictory concept) offers prescriptive teachings; it studies how things should be.
The opposite is a positive science, which is based on how people actually are observed to behave.
The fact that your mind cannot retain and use everything you know at once is the cause of such biases. One central aspect of a heuristic is that it is blind to reasoning.
Your attitude toward the risks and rewards of the gamble will vary according to whether you look at your net worth or changes in it. But in fact in real life you will be put in situations where you will only look at your changes.
The fact that the losses hurt more than the gains, and differently, makes your accumulated performance, that is, your total wealth, less relevant than the last change in it.
This anchoring to a number is the reason people do not react to their total accumulated wealth, but to differences of wealth from whatever number they are currently anchored to. This is the major conflict with economic theory.
It is not the estimate or the forecast that matters so much as the degree of confidence with the opinion.
My lesson from Soros is to start every meeting at my boutique by convincing everyone that we are a bunch of idiots who know nothing and are mistake-prone, but happen to be endowed with the rare privilege of knowing it.