4.21.2008

Taleb on Black Swans, EconTalk Permanent Podcast Link: Library of Economics and Liberty

Taleb on Black Swans

April 30, 2007, Featuring Nassim Taleb

Nassim Taleb talks about the challenges of coping with uncertainty, predicting events, and understanding history. This wide-ranging conversation looks at investment, health, history, and other areas where data play a key role. Taleb, the author of Fooled by Randomness and The Black Swan, imagines two countries, Mediocristan and Extremistan, where the ability to understand the past and predict the future is radically different. In Mediocristan, events are generated by an underlying random process that is normally distributed. These events are often physical and observable, and they tend to cluster around the middle. Most people are near the average height, and no adult is more than nine feet tall. But in Extremistan, the right-hand tail of events is thick and long, and the outlier, the seemingly wildly unlikely event, is more common than our experience with Mediocristan would indicate. Bill Gates is more than a little wealthier than the average. The civil war in Lebanon and the events of 9/11 were far worse than just a typical bad day in Beirut or New York City. Taleb's contention is that we often bring our intuition from Mediocristan to the events of Extremistan, leading us to error. The result is a tendency to be blind-sided by the unexpected.

Size: 19.2 MB
Right-click or Option-click, and select "Save Link/Target As" to download the MP3.

Readings and Links related to this podcast

Podcast Readings

Highlights

0:37 Intro. Large library, though not as large as Umberto Eco's. Thematic quote from Fooled by Randomness:
We favor the visible, the embedded, the personal, the narrated, the tangible. We scorn the abstract.
Tend to believe all swans are white because you've probably never seen a black swan. Confirmation bias is not taking seriously what you don't see. This works well in a primitive environment but not in less primitive ones. The real world has many quantities governed by fat-tailed distributions--totals dominated by a small number of observations. Imagine two countries, Mediocristan vs. Extremistan, each populated with a good sample of the world's population. Next, think of the heaviest person you could find. How much of the total weight will that person represent? A trivial percentage. The exceptional is inconsequential for weight. Law of large numbers--as your sample becomes very large, no single instance can influence the total; it converges to some stable average. The normal or Gaussian distribution characterizes Mediocristan. Now, for Extremistan, instead of weight, think about income (which is not normally distributed). Add in the richest person you can think of, say Bill Gates. The exceptional in this case is not inconsequential. The two random variables, weight and income, have different distributions. Currency markets. Reichsmark hyperinflation. Extreme events are often not Gaussian. 1992 Israeli interest rate incident.
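Taleb's weight-versus-income contrast can be checked with a short simulation. This is a toy sketch, not from the episode; the distribution parameters (normal weights with mean 70 kg and sd 15 kg, a Pareto income tail with alpha = 1.1) are illustrative assumptions:

```python
import random

random.seed(42)
N = 100_000

# Mediocristan: weights, roughly normal (mean 70 kg, sd 15 kg; floored at 1 kg)
weights = [max(1.0, random.gauss(70, 15)) for _ in range(N)]

# Extremistan: incomes, fat-tailed (Pareto with tail exponent alpha = 1.1)
incomes = [random.paretovariate(1.1) for _ in range(N)]

# How much of the total does the single largest observation represent?
share_weight = max(weights) / sum(weights)
share_income = max(incomes) / sum(incomes)

print(f"heaviest person's share of total weight: {share_weight:.6%}")
print(f"richest person's share of total income:  {share_income:.2%}")
```

In Mediocristan the largest observation is a rounding error in the total; in Extremistan it can be a visible share of the whole, which is why averages there are unstable.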
9:58 Now, look around you. Think about published serious novels: a handful will dominate the sales (even in a world without J. K. Rowling). Between 5 and 20 will dominate sales in any given year. Scorning the abstract works well in Mediocristan, where you don't need a large sample to become acquainted with the general properties. (A common rule of thumb is that you need around 30 observations to figure out the general properties of the distribution.) In Mediocristan, you don't need to sample everyone. It's cost-effective to use a small sample. The size of the sample may vary, but some small sample will do. The top 100 stocks won't be enough of a sample to protect you; you probably need a broader index, say 500; but you still don't need every single stock in existence. Bogle podcast. Not just the risk, but the unexpected, the risk that is unknown. The risk of the stock market is not just losing money, but also missing an opportunity from the unknown.
17:01 By contrast, think about Extremistan. In general, you tend to be fooled by randomness, thinking only about what happened to you in the last few days in the stock market instead of a longer, larger sample. You tend to say, "What I've seen is sufficient." In the summer of 1982, U.S. banks suddenly lost more than they'd made cumulatively in the past. They were fooled by small samples: they'd had many good years in a row and had forgotten about the small probability of a very bad thing happening. This kind of error is pervasive, way beyond financial markets. Skeptical empiricist: don't just tell me the narrative, the ex-post story; give me the data so I can figure out the general distribution (which will include estimating the probability of extreme events). This is in contrast to the misuse, the deceptive use, of data to give it an air of scientism that is not scientific. In The Black Swan, the Narrative Fallacy. Hindsight bias, the retrospective view after the fact. We are suckers for stories, and ex post we have stories. 80% of epidemiological studies fail to represent real life--that is, they cannot be replicated. What people with statistical data do is process it and find accidental associations. You can even find a study where smoking lowers the risk of breast cancer. Non-representative statistical knowledge. The more data, the worse our statistical knowledge can become: the more you read the newspaper, the dumber you get. Sometimes the profusion of information is misleading. "Once you have a theory in your head you are going to just look for confirmation." Bruner and Potter experiment: show a blurry picture of a dog, too blurry to discern. If you increase the resolution only gradually, the viewer forms intermediate theories about what the picture shows and fails to recognize the dog; a viewer who jumps straight to the sharper image sees the dog at once. The person who forms many intermediate theories then clings to them. Global warming: you make errors if you rely only on stories and experience.
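The "accidental associations" point can be demonstrated with pure noise: generate many random "risk factors" and one random "outcome," and some factors will look statistically significant by chance alone. This is a sketch, not from the episode; the sample sizes and the 0.28 cutoff (roughly the two-sided p < 0.05 critical value for n = 50) are illustrative assumptions:

```python
import math
import random

random.seed(0)
n_subjects, n_factors = 50, 200

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# One pure-noise "outcome" and 200 pure-noise "risk factors."
outcome = [random.random() for _ in range(n_subjects)]
factors = [[random.random() for _ in range(n_subjects)] for _ in range(n_factors)]

# |r| > 0.28 is roughly the two-sided p < 0.05 cutoff for n = 50.
hits = [i for i, f in enumerate(factors) if abs(pearson(f, outcome)) > 0.28]
print(f"{len(hits)} of {n_factors} noise factors look 'significant'")
```

Roughly 5% of the noise factors clear the bar, and those are exactly the ones a data-mining study would report.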
28:32 Riddle of induction. If you use a linear model, you have only one or two degrees of freedom, which limits your ability to make predictions. But nonlinear models can allow too many degrees of freedom, so you can predict anything at all if you select the wrong one. Forward and backward problem: see an ice cube on the floor and you can predict the puddle it will leave when it melts; see a puddle of water on the floor and it's much harder to tell where it came from. Empirical medicine. Inherent bias in epidemiological (and economics) studies, because if you find nothing it's hard to publish. Ed Leamer pointed this bias out. George Stigler anecdote: when he was a young economist, if you wanted to test a theory, you might run only three or four regression analyses, and you spent a lot of time thinking about what variables to test because you had to do the computation by hand. Today you can run hundreds of regressions, and you can easily convince yourself that the "good" ones are the ones that worked out. We are fooled by randomness, selecting the ones that confirm our bias. If you have 1000 traders, you are bound to have 3 or 4 who do well by pure randomness. They are the ones who attract capital. Others with the same skills may be unable to get their careers started. A good dentist who makes a lot of money is probably pretty good at dentistry; but a trader who makes a lot of money may be just lucky. "Is this person skilled?" With regard to financial markets, the answer is "I don't know." You can't infer skill from success; but we reward it the same way. "We are more likely to mistake the random for non-random than the non-random for random," because of behavioral bias and statistics masquerading as science.
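The 1000-traders argument is easy to verify by simulation: give every trader a fair coin and a few will still compile perfect multi-year records. A toy sketch; the 8-year horizon is an assumption, not from the episode:

```python
import random

random.seed(1)
n_traders, n_years = 1000, 8

# Every yearly result is a fair coin flip: there is no skill anywhere.
def perfect_record():
    return all(random.random() < 0.5 for _ in range(n_years))

perfect = sum(1 for _ in range(n_traders) if perfect_record())
print(f"{perfect} of {n_traders} zero-skill traders won {n_years} years in a row")
```

With a 1-in-256 chance of an 8-year winning streak, about four of the thousand coin-flippers will look like stars; those are the ones who attract capital.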
37:37 We tend to force the world into Mediocristan, but we really should think about Extremistan. What advice for someone who wants to build wealth? We do not know the downside risk, so how can we arrange our financial affairs to sleep well at night? There are businesses exposed to negative black swans (e.g., banking, reinsurance) and those exposed to positive ones (e.g., biotech, publishing). You want to mentally sort headlines into these categories, selecting exposures that are black-swan neutral or immensely positive. Chapter 13 is a how-to chapter. People tend to confuse the risky and the non-risky. Using the Markowitz model to achieve medium risk is prone to model error. The same average risk can be achieved by securing 80% of your cash (maximally safe, behind security guards) and taking all the risk you can with the other 20%. To find out what to do with that 20%, go to parties; try to learn about fat tails and the small chance of something fascinating happening; create exposure so as to create an envelope of serenity. Ramifications of confirmation bias. But there is another central source of error, more extreme than Hayek described. Looking at a dog, it's easy to believe it was the product of randomness. But looking at a man-made object like a car, it is hard to believe it was the result of randomness. Yet the car was the result of a random process, and maybe even a worse one than the dog. When reading about scientific life, it seems to evolve as if by design. "I have the feeling that we are fooled by our own skills." Most of what has been discovered technologically was found when not looking for it, with a minimum of theory about what to look for. E.g., the Internet, the computer, the laser--three recent, very influential discoveries that were all found by accident. Penicillin. The National Cancer Institute (NCI), Mayers (sp.?): the NCI went through 130,000 compounds before homing in on the ones to use for successful chemotherapy.
Gives illusion of intention, but you are fooled by accidental finds. Very little is found by controlled experiment. Trial and error is inherently hit and miss. Most successful books and movies are often not predicted.
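The 80/20 "barbell" idea from the segment above can be sketched in a few lines. This is a toy illustration of the asymmetry, not investment code; the function names and the 50% model-error loss are hypothetical:

```python
# Barbell: 80% in the safest holdings, 20% in maximally risky bets.
SAFE_FRACTION = 0.80   # parked in the safest instruments available
RISKY_FRACTION = 0.20  # exposed to maximal (positive-black-swan) risk

def barbell_worst_case(wealth):
    # Even if every risky bet goes to zero, the safe 80% remains:
    # the floor is known in advance, independent of any model.
    return wealth * SAFE_FRACTION

def medium_risk_worst_case(wealth, model_error_loss):
    # A fully invested "medium-risk" portfolio has no such floor; its
    # worst case depends on a model whose assumptions may be wrong.
    return wealth * (1 - model_error_loss)

w = 100_000
print(barbell_worst_case(w))            # known floor, whatever happens
print(medium_risk_worst_case(w, 0.50))  # depends on the model being right
```

The point is not the expected return but the shape of the downside: the barbell caps the damage by construction, while the "medium risk" portfolio caps it only if the distributional assumptions hold.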
50:08 But is everything luck? No. Problem of separability. If I need a tie to become a banker, does it mean that having a tie causes my success? No. It is necessary to do research to find something. Most people have trouble keeping that separate, distinguishing skill from luck. It doesn't mean all success is lucky. Chapter 4: if I tell you that all terrorists are Moslems, people are likely to mistake it for all Moslems being terrorists. "All tigers are killers" is not the same as "all killers are tigers." You need to make your own luck, as Pasteur said. Look at book publishers: they diversify to diminish the problem of predicting which exact titles will succeed. Ludic and non-ludic games. Ludic games work within the rules; logical induction; assume the experimenter is asking the right questions--the rules of the game are clear, no ambiguity. In typical ecological uncertainty, by contrast, you are unsure about the rules and don't know the probability structure. Mostly Extremistan. No-limit poker, as opposed to limit poker (where you know the distribution). In the real world, you don't usually know the probabilities. Statistics books are all about ludic games--the rules are very clear. Hayek's critique. An idiot savant would be very good at a ludic game. Economics is not to be confused with finance. Using Gaussian techniques in a non-Gaussian world, or equilibrium techniques in an (unknown) non-equilibrium world, can lead you to make errors. Knowledge as a vast, unknowable landscape. James Buchanan--we take things like prices as data and look for how they come about, but those prices and preferences emerge in part from the process itself. Sentient beings with imperfect information. Hayek's "Pretence of Knowledge." Pete Boettke. Shackle. People who discuss this become unknown. Bastiat, the Seen and the Unseen. The French edition of The Black Swan will have more on Bastiat and also Fouchier (?sp.). What is a good name for those who doubt in this way? Hayek, Popper as philosophers.
Economists who think about these things used to be called political economists. The Theory of Moral Sentiments, by the founding economist Adam Smith, was a work of philosophy. Saying you are a "moral philosopher" will cause most people to run away; will it make for better conversation at a party than saying you are an economist? How about "empirical philosopher"?
1:10:24 The Black Swan uses statistical arguments against statistics itself. Casino story: running a casino sounds like an easy way to make money, but what about the other risks of running the business? A tiger attack like the one on Siegfried and Roy, which can result in lawsuits and in having to find a new show? Risk can be outside your perception; the same casino had multiple problems unrelated to the tiger incident. Tunneling. The Gaussian distribution tells you how much data you need to identify it--a self-reference: you must use the data to discover the distribution, yet you need the distribution to know how much data is enough. Only with the Gaussian can you do this and easily find the parameters; with power laws, the parameters cannot be found easily. Chaos theory's result was that you cannot build models: the n-body, n-billiard-ball problem is much harder to capture in a top-down model with large numbers of people than with only two. But that doesn't mean you cannot predict anything. Second problem: when you predict a variable 25 years hence, your error won't be just 25 times your error from predicting 1 year ahead. Third problem: psychological. Prediction is a kind of therapy; you have to look at your error after the fact. Forecast error is central to decision-making. Traveling to France, you know the size of suitcase to pack for a given time of year; you need a much bigger suitcase if traveling to Mars. Socrates, the Apology. There are degrees in your ignorance about what you don't know. The Enlightenment. "I want to turn knowledge into action"--Marx critiquing Hegel--vs. Taleb: "I want to turn lack of knowledge into action." We should embrace the phrase "I don't know." We like experts, but not all domains have experts.
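The claim that a 25-year forecast error is not simply 25 times the 1-year error can be illustrated with the simplest possible process, a random walk, where the naive forecast's error standard deviation grows like the square root of the horizon rather than linearly. A sketch under that assumption only; real systems can compound errors much faster:

```python
import random
import statistics

random.seed(7)

def naive_forecast_error_sd(horizon, trials=20_000):
    """SD of the error from forecasting a random walk `horizon` steps
    ahead using its current value (the error is a sum of `horizon` shocks)."""
    errors = [sum(random.gauss(0, 1) for _ in range(horizon))
              for _ in range(trials)]
    return statistics.stdev(errors)

e1 = naive_forecast_error_sd(1)
e25 = naive_forecast_error_sd(25)
print(f"1-step error sd: {e1:.2f}  25-step error sd: {e25:.2f}  "
      f"ratio: {e25 / e1:.1f}")
```

Even in this best-behaved case the scaling is nonlinear (a ratio near 5, not 25); in nonlinear or fat-tailed systems the long-horizon error can be far worse than any simple multiple of the short-horizon error.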
Featured Speakers and Categories: Nassim Taleb; Books, Favorites, Finance


Posted by Russ Roberts