Saturday, November 30, 2013

Complex systems that have artificially suppressed volatility tend to become extremely fragile

Nassim Nicholas Taleb is a brilliant thinker, philosopher, and writer. I have read and enjoyed Fooled by Randomness as well as The Black Swan and am looking forward to his most recent book, Antifragile (reviewed in The Economist here).

Taleb had an article in Foreign Affairs in 2011 that hit on some of his main arguments (as applied to international relations): "The Black Swan of Cairo." A Black Swan is Taleb's term for an unpredicted (and essentially unpredictable) event that disrupts the status quo.
Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks. In fact, they tend to be too calm and exhibit minimal variability as silent risks accumulate beneath the surface. Although the stated intention of political leaders and economic policymakers is to stabilize the system by inhibiting fluctuations, the result tends to be the opposite. These artificially constrained systems become prone to “Black Swans”—that is, they become extremely vulnerable to large-scale events that lie far from the statistical norm and were largely unpredictable to a given set of observers.

Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems.

[snip]

Humans simultaneously inhabit two systems: the linear and the complex. The linear domain is characterized by its predictability and the low degree of interaction among its components, which allows the use of mathematical methods that make forecasts reliable. In complex systems, there is an absence of visible causal links between the elements, masking a high degree of interdependence and extremely low predictability. Nonlinear elements are also present, such as those commonly known, and generally misunderstood, as “tipping points.” Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error.

[snip]

Engineering, architecture, astronomy, most of physics, and much of common science are linear domains. The complex domain is the realm of the social world, epidemics, and economics. Crucially, the linear domain delivers mild variations without large shocks, whereas the complex domain delivers massive jumps and gaps. Complex systems are misunderstood, mostly because humans’ sophistication, obtained over the history of human knowledge in the linear domain, does not transfer properly to the complex domain. Humans can predict a solar eclipse and the trajectory of a space vessel, but not the stock market or Egyptian political events. All man-made complex systems have commonalities and even universalities. Sadly, deceptive calm (followed by Black Swan surprises) seems to be one of those properties.
Taleb also mentions, but does not elaborate on, the important issue of "the illusion of local causal chains—that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect." There is a tendency to see the last event as the "cause" of something when in fact it is sometimes simply the catalyst for a systemic readjustment, i.e., the straw that broke the camel's back. It wasn't the straw per se, but the cumulative weight that preceded it.
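
Taleb's sand-pile image can be made concrete with a toy simulation. The sketch below is my own illustration, not anything from Taleb's article: a minimal Bak-Tang-Wiesenfeld-style sandpile in Python, where grains are dropped one at a time and any cell that exceeds a threshold topples onto its neighbours, possibly setting off a cascade. The grid size, threshold, and number of drops are arbitrary assumptions chosen only for illustration.

```python
import random

# Toy Bak-Tang-Wiesenfeld-style sandpile on an N x N grid.
# Each dropped grain is a "catalyst"; the size of any avalanche depends
# on the accumulated state of the pile, not on the grain itself.
N = 20
THRESHOLD = 4
grid = [[0] * N for _ in range(N)]

def drop_grain():
    """Drop one grain at a random site and topple until stable.
    Returns the avalanche size (number of topplings)."""
    i, j = random.randrange(N), random.randrange(N)
    grid[i][j] += 1
    avalanche = 0
    unstable = [(i, j)] if grid[i][j] >= THRESHOLD else []
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < THRESHOLD:
            continue
        grid[x][y] -= 4                # topple: shed 4 grains to neighbours
        avalanche += 1
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < N and 0 <= ny < N:   # grains falling off the edge are lost
                grid[nx][ny] += 1
                if grid[nx][ny] >= THRESHOLD:
                    unstable.append((nx, ny))
    return avalanche

sizes = [drop_grain() for _ in range(20000)]
print("grains causing no avalanche:", sizes.count(0))
print("largest avalanche:", max(sizes))
```

Most drops do nothing; a few trigger large avalanches, and the size of an avalanche is determined by the accumulated state of the pile, not by the grain that happened to land last.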

There are echoes here of Stephen Jay Gould and Niles Eldredge's theory of punctuated equilibrium, in which they argued that evolution is not a smooth, continual process but rather one that proceeds in fits and starts: long stretches of stability punctuated by bursts of rapid change.

Taleb is arguing that our efforts to ensure near-term tactical stability are often at odds with desirable system evolution over the long run. It is a classic trade-off decision. He has observed many times that good tactical intentions often end up unintentionally leading to catastrophic strategic outcomes. An example is forest fire management. Nobody wants forest fires, and for decades the strategy was simple fire suppression: keep fires from starting and put them out as fast as possible when they do.

Regrettably, by reducing near-term fires, this approach has allowed forests to accumulate much greater fuel loads than they would under natural conditions, in which periodic lightning-strike fires clear out dead brush. The result has been increasingly frequent, vast, and intense wildfires beyond our ability to control. A strategy for achieving near-term stability (fewer wildfires) has ended up worsening the situation over the long run.
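
To make the trade-off concrete, here is a deliberately crude sketch (my own toy model, not drawn from the fire-management literature): fuel accumulates by one unit per year, lightning ignitions arrive at random, and a suppression policy extinguishes most of them. All of the parameters (ignition probability, suppression rate, 500-year horizon) are arbitrary assumptions.

```python
import random

def simulate(years, p_ignition=0.2, p_suppress=0.0, seed=1):
    """Toy model: fuel accumulates by one unit each year; a lightning
    ignition burns off all accumulated fuel unless it is suppressed.
    Returns the list of fire sizes and the fuel left standing at the end."""
    random.seed(seed)
    fuel, fires = 0.0, []
    for _ in range(years):
        fuel += 1.0                            # another year of dead brush
        if random.random() < p_ignition:       # lightning strike
            if random.random() >= p_suppress:  # the fire escapes suppression
                fires.append(fuel)             # everything accumulated burns
                fuel = 0.0
    return fires, fuel

def describe(label, fires, leftover):
    biggest = max(fires) if fires else 0.0
    print("%-12s %3d fires, largest = %5.0f units, unburned fuel = %.0f"
          % (label, len(fires), biggest, leftover))

describe("natural:", *simulate(500, p_suppress=0.0))
describe("suppressed:", *simulate(500, p_suppress=0.9))
```

Under the natural regime fires are frequent but small; under suppression they are rare, but each one that escapes burns everything that has built up in the meantime, and the "unburned fuel" column shows how much is still waiting to burn.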

When making a strategic decision, it is important to consider the historical context. Has the existing system evolved over time, and therefore acquired some base level of stability, or has it been held in an unnatural state of artificial stability with all variance suppressed? If it is the latter, then any new action may have unanticipated consequences that have little to do with the intended plan and everything to do with the cumulative evolution that has been avoided.
