The Black Swan In Review


It was back to the nonfiction section of my bookshelf, and The Black Swan by Nassim Nicholas Taleb was one book I had wanted to get to for a long time. I like reading books mentioned by other authors I have read, and I recall The Black Swan coming up in a few. Right off the bat, I was impressed by his erudition, the sort of thing one can only aspire to unless one started young or reads abnormally fast (with proper comprehension, of course). His style struck a chord with me: it is quite direct, and there is little pandering while he makes his points. This, once again, will only be about the first 100 pages of the book.

The issue at the core of his writing is that there needs to be a shift in the way we think about and view the systems we have created in our societies. This has a psychological dimension as well, because we tend to willfully ignore the potential significance of events that we cannot predict. Taleb goes into depth about why and how this blindness occurs, along with the improper instruments we use to make predictions, and finally offers some tips on how to adjust our thinking so as not to be so vulnerable to potential “Black Swans”. Personally, I don’t like when people draw attention to problems without suggesting viable solutions, but Taleb definitely gives some actionable steps one can take to be prepared for the unseen: a Black Swan.

For example, say there is a turkey which has been fed for 1,000 days. It is content and feels it can predict with relative certainty when its next meal is coming. But the whole time, unbeknownst to it, the hand feeding it was only fattening it up for a meal of its own, one that would take place on day 1,001. On that day, the turkey awakens from its slumber ready to be fed, but instead it is greeted with a much different reality: the sudden realization that the system it thought it knew turned out to be totally wrong.

That illustrates the situation individuals and organizations are in given the possibility of Black Swan events.

What are these “Black Swan” events anyway?

“Black Swans” are characterized as events that are unpredictable, rare and high impact. These events are so rare that they appear as outliers on statistical models, so there is a high probability their possibility is ignored. They are also high impact, meaning that if they do occur they can cause significant disruptions to our lives. After they occur, people have a tendency to believe the events were actually foreseeable, despite the fact that this is an illusion. That said, an event can be foreseeable for one person and a complete surprise for another; there is an aspect of individual point of view as well. Have these sorts of events happened before? Absolutely. The dot-com boom and bust, the 2008 financial crisis, 9/11, the Arab Spring and more would all be considered “Black Swan” events, as they fit the criteria mentioned above.

But what exactly makes us so exposed to these events?

Well, to get to that question, we first need an understanding of two different “domains”, if you will, which Taleb calls “Extremistan” and “Mediocristan”. Taleb describes things in Mediocristan this way: “when your sample is large, no single instance will significantly change the aggregate of the total. The largest observation will remain impressive, but eventually insignificant, to the sum.” Whereas in “Extremistan”, “inequalities are such that one single observation can disproportionately impact the aggregate or the total.” Mediocristan largely consists of things bound by physical constraints, such as height and weight. However, there is a plethora of things we measure and quantify in day-to-day life which aren’t bound by the same rules, such as a corporation’s market cap. These sorts of things are found in “Extremistan” and do not play by Mediocristan’s rules.
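To make the distinction concrete, here is a small sketch of my own (the distributions and every parameter in it are my illustrative choices, not Taleb’s): heights drawn from a normal distribution behave like Mediocristan, while draws from a heavy-tailed Pareto distribution behave like Extremistan, where the single largest observation can dominate the total.

```python
import random

random.seed(42)

N = 100_000

# Mediocristan: heights from a normal distribution
# (mean 170 cm, sd 10 cm -- assumed, illustrative numbers).
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan: "wealth" from a heavy-tailed Pareto distribution
# (shape alpha = 1.1, also an assumption for illustration).
wealth = [random.paretovariate(1.1) for _ in range(N)]

def max_share(xs):
    """Fraction of the total contributed by the single largest observation."""
    return max(xs) / sum(xs)

# The tallest person barely moves the aggregate height;
# the richest draw can hold a visible chunk of all the wealth.
print(f"tallest person's share of total height: {max_share(heights):.6%}")
print(f"richest draw's share of total wealth:   {max_share(wealth):.2%}")
```

Run it a few times without the fixed seed and the height share stays microscopic every time, while the wealth share swings wildly — that swing is exactly the “one observation can disproportionately impact the total” property.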

This brings us to the fallacy of the bell curve, an idea Taleb rallies behind throughout the book. He even refers to the bell curve as “The Great Intellectual Fraud” (or GIF), particularly when it is applied to things that fall under the domain of “Extremistan”. This part is technically outside the first one hundred pages, but it is a part of the book I wanted more clarity on, so I decided to write a section about it to get a better understanding of what Taleb means.

But, let’s start from the beginning.

A man named Carl Friedrich Gauss first described the “normal distribution” in 1809. The normal distribution is a continuous probability distribution; “distribution” here refers to how potential outcomes are spread around the mean. A normal distribution is symmetric about the mean, with higher probabilities near the mean that taper off as outcomes move further away and become less likely. This is the famous “bell curve” or “Gaussian curve”. I have included an image (not of my own doing) to illustrate what I have written.

 

Standard deviations, shown along the x-axis, segment the probabilities throughout the distribution, which is symmetric about the mean.
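Those standard-deviation bands can be computed directly. A minimal sketch using only Python’s standard library (the helper name `normal_cdf` is mine):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Cumulative probability P(X <= x) for a normal distribution."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Probability mass within 1, 2, and 3 standard deviations of the mean:
# the familiar 68% / 95% / 99.7% rule.
for k in (1, 2, 3):
    p = normal_cdf(k) - normal_cdf(-k)
    print(f"within {k} sd of the mean: {p:.2%}")
```

Nearly everything sits within three standard deviations, which is precisely why Gaussian thinking treats far-out observations as ignorable.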

Now recall the features of the two domains, “Extremistan” and “Mediocristan”, and think about the kinds of things found in each with respect to randomness. Data like height and weight in a population is random but ultimately has physical constraints which allow the bell curve to be applied to it. No single observation will significantly change the aggregate of the total. This applies neatly to human height. The chance of seeing someone twenty centimeters taller than average (6’ 2” or 187cm) is 1 in 44, and the chance of seeing someone forty centimeters over average is 1 in 32,000. The odds lengthen dramatically the further you get from the mean. Taleb notes that the chance of seeing someone 8 feet 9 inches tall (who doesn’t have a defect in their pituitary gland) is 1 in 30,000,000,000,000,000,000,000, and because the chance is so small, we ignore this outcome completely.
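You can reproduce the first two of those figures yourself. If we assume a standard deviation of roughly 10 cm for height (my assumption, chosen because it makes Taleb’s numbers come out), then 20 cm over the mean is 2 standard deviations and 40 cm over is 4:

```python
import math

def tail_odds(k):
    """One-sided tail probability P(Z > k) for a standard normal,
    expressed as '1 in n'."""
    p = 0.5 * math.erfc(k / math.sqrt(2))
    return 1.0 / p

# Assuming sd ~ 10 cm, so +20 cm is 2 sd and +40 cm is 4 sd.
print(f"20 cm over average: about 1 in {tail_odds(2):,.0f}")  # ~1 in 44
print(f"40 cm over average: about 1 in {tail_odds(4):,.0f}")  # ~1 in 31,600
```

Doubling the distance from the mean multiplied the rarity by roughly seven hundred — that acceleration is the Gaussian tail at work.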

However, things which lie in Extremistan, such as market capitalization, wealth distribution, and book sales, don’t follow the same pattern. If Gaussian laws are applied to the world of financial markets, then the market crash of 1987, which was more than twenty standard deviations away from the mean, should theoretically have taken place once in several billion lifetimes of the universe. In fact, this is all too prevalent in the financial world, as Taleb points out that “in the last fifty years, the ten most extreme days in the financial markets represent half the returns.” This is a clear indicator that Gaussian rules don’t apply in the domain of Extremistan. Taleb provides a good illustration of this when he recounts the story of the collapse of a hedge fund called Long-Term Capital Management (LTCM). LTCM had an all-star team of mathematicians, economists and more. They were all high achievers in their fields, and two of its principals even earned the Nobel Memorial Prize in Economics. The hedge fund failed nonetheless, lasting only a handful of years before imploding in 1998 due to its portfolio of highly leveraged derivatives. Its portfolio management was based entirely on fancy Gaussian-style metrics, which worked until they didn’t. The hedge fund couldn’t survive its exposure to Black Swans and took billions of dollars down with it.
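To get a feel for just how absurd a twenty-sigma event is under Gaussian assumptions, we can compute it directly (the 252 trading days per year is my own back-of-the-envelope assumption, not a figure from the book):

```python
import math

# Probability of a single daily move more than 20 standard deviations
# out, if daily returns really were Gaussian.
p = 0.5 * math.erfc(20 / math.sqrt(2))
print(f"P(move beyond 20 sd) = {p:.2e}")

# With roughly 252 trading days a year, the expected wait in years
# before seeing one such day:
years = 1 / (p * 252)
print(f"expected wait: about {years:.1e} years")
```

The wait dwarfs the age of the universe (about 1.4e10 years) by dozens of orders of magnitude, yet the 1987 crash happened within a single human career. Something is wrong with the model, not with reality.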

You may ask: How do we let this happen?

Why hasn’t anybody realized this and changed their approach? Taleb offers three primary defects in our reasoning which help keep the problem at hand alive and well: the confirmation bias, the ludic fallacy and the narrative fallacy.

The confirmation bias deals with our natural tendency to look only for corroboration. There is an interesting study by the psychologist Peter Cathcart Wason, who came up with the 2-4-6 task. It’s a fairly simple arrangement, so read the instructions and test it on your friends. The experimenter presents the subject with a three-number sequence such as 2, 4, 6, and the subject must guess the rule which generates the sequence. They can only probe the rule by suggesting other three-number sequences, to which the experimenter can only respond yes or no. The underlying rule is quite simple: the numbers must merely be in ascending order. Subjects tend to hypothesize more elaborate rules, such as “even numbers” or “add two each time”, and struggle to see why 1, 2, 3 would also work. The task is difficult because, to realize the rule is just “numbers in ascending order”, subjects would have to offer a disconfirming sequence, say, numbers in descending order, and hear the experimenter say “no”. What Wason showed was that people have a tendency to formulate a rule in their mind and then stick to it, offering only sequences which abide by that rule, even though the rule was entirely generated by themselves. It blinded them to the task at hand. This is how our confirmation bias can take over and influence the decisions we make.
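The task is simple enough to play against a computer. A little sketch of the experimenter’s side (the strictly-ascending rule is the one from the study; the sample guesses are my own):

```python
def rule(seq):
    """The experimenter's hidden rule: numbers in strictly ascending order."""
    return list(seq) == sorted(seq) and len(set(seq)) == len(seq)

# Confirmation-seeking guesses: sequences that fit the subject's own
# hypothesis ("even numbers going up by two") all get "yes"...
for guess in [(2, 4, 6), (8, 10, 12), (20, 22, 24)]:
    print(guess, "->", "yes" if rule(guess) else "no")

# ...and so never falsify it. Only a disconfirming probe is informative:
print((6, 4, 2), "->", "yes" if rule((6, 4, 2)) else "no")
print((1, 2, 3), "->", "yes" if rule((1, 2, 3)) else "no")
```

Every “yes” on a hypothesis-confirming guess feels like progress while teaching the subject nothing; the single “no” on a descending sequence carries all the information.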

The second is the ludic fallacy, which has to do with the complexity of the world around us and how we perceive it to be more manageable than it is. Taleb describes it as “basing studies of chance on the narrow world of games and dice”, which he compares to situations like applying the bell curve to true randomness, where, as we have already discussed, it is doomed to fail.

The third is the narrative fallacy, which relates to “our vulnerability to overinterpretation and our predilection for compact stories over raw truths.” We subconsciously love to draw conclusions, establish logical links, or invent reasoning when presented with facts, because it helps the facts make more sense and makes them easier to remember. Oftentimes stories just feel better to us, but Taleb points out that creating these links or narratives may lead us to believe we have a higher understanding, or to become more confident. This false confidence is where things are all too likely to go wrong.