It can be difficult to determine how well our simple models correspond to the realities of complex systems, but it is easy to determine three things: (1) how well we understand our simple models; (2) how familiar we are with the surface elements, concepts, and vocabulary of the complex system; and (3) how much information we are aware of, and can easily access, about the complex system.
Scientists, architects, and hedge fund managers are respected, but weather forecasters are parodied. Yet weather forecasters have fewer illusions about their own knowledge than do members of these other professions.
Homer assumes that the bear patrol kept away bears, but it really did nothing at all—the first bear sighting was an anomaly that would not have recurred in any case. The scene is funny because the causal relationship is so outlandish. Rocks don’t keep tigers away, but Homer draws the inference anyway because the chronology of events induced an illusion of cause.
Like the illusion of knowledge, the illusion of cause can only be revealed by systematically testing our understanding, exploring the logical bases of our beliefs, and acknowledging that inferences of causality might derive from evidence that cannot really support them. That level of self-examination is one that we seldom reach.