One way to while away the time during my short commute and errands is to listen to unabridged audiobooks. If the experience proves worthwhile as a moment of learning, I’m next compelled to work against my learning style (aural-kinesthetic) and read the verbal-visual edition, so to speak.
Now I’m driving through Nassim Nicholas Taleb’s The Black Swan. It provides a gripping journey for a Jamesian fallibilist such as myself. Taleb’s so-called skeptical empiricism also circles around my own current central concern, which is likewise strongly skeptical about what Taleb terms narrativity. The interesting difference is that I’m looking at ubiquitous hidden chance events (in ordinary human development), whereas Taleb deals with rare hidden chance events in large-scale domains. I too am fascinated by how linear narratives clothe non-linear events as a matter of post-hoc rationalization, but, again, the domain I’m interested in differs from Taleb’s.
There is a funny moment in the book where Taleb blows off a causal assertion about the very domain I’m interested in. I’ll return to it after I finish the book.
As a collector of dichotomies, I find the following of great interest. It is purloined from Taleb’s notes page at the web site for his book Fooled By Randomness. (Excellent review of The Black Swan by Dan Hill @cityofsound.)
116- Fooled by Rationalism; Lecturing Birds How to Fly [From Tinkering]
The greatest problem in knowledge is the “lecturing birds how to fly” effect.
Let us call it the error of rationalism. In Fat Tony’s language, it would be what makes us the suckers of all suckers. Consider two types of knowledge. The first type is not exactly “knowledge”; its ambiguous character prevents us from calling it exactly knowledge. It is a way of doing things that we cannot really express in clear language, but that we do nevertheless, and do well. The second type is more like what we call “knowledge”; it is what you acquire in school, can get grades for, can codify, what is explainable, academizable, rationalizable, formalizable, theoretizable, codifiable, Sovietizable, bureaucratizable, Harvardifiable, provable, etc.
To make things simple, just look at the second type of knowledge as something so stripped of ambiguity that an autistic person (a high functioning autistic person, that is) can easily understand it.
The error of rationalism is, simply, overestimating the role and necessity of the second type, the academic knowledge, in human affairs. It is a severe error because not only is much of our knowledge not explainable, academizable, rationalizable, formalizable, theoretizable, codifiable, Sovietizable, bureaucratizable, Harvardifiable, etc., but, further, such knowledge plays so minor a role in life that it is not even funny.
We are very likely to believe that skills and ideas that we actually acquired by doing, or that came naturally to us (as we already knew by our innate biological instinct) came from books, ideas, and reasoning. We get blinded by it; there may even be something in our brains that makes us suckers for the point. Let us see how.
Fat Tony wisdom, Aristotelian phronesis
Implicit, Tacit
Tinkering, stochastic tinkering
Epilogism (Menodotus of Nicomedia and the school of empirical medicine)
Historia a sensate cognitio
Bottom up libertarianism
Spirit of the Law
Letter of the Law
Cambridge, MA, and UK
Accident, trial and error
Ecological uncertainty, not tractable in textbook
Ludic probability, statistics textbooks
On-model, model based
Side effect of a drug
National Institutes of Health
My intentionally idiosyncratic interpretation of Taleb’s usage of the term ludic is this: it names the error people make when they believe that their management of known, simple, fixed probabilities is identical to the management of complex, dynamic uncertainty. The latter is, of course, impossible to actually manage.
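The contrast can be loosely sketched in code. This is my own illustrative toy, not anything from Taleb’s book: a fair die stands in for the ludic domain (known, fixed probabilities, where averages converge reliably), while a fat-tailed Pareto distribution stands in for uncertainty where a single rare draw can dominate everything observed so far.

```python
import random

random.seed(42)

def dice_mean(n):
    # "Ludic" domain: a fair die has known, fixed probabilities,
    # so the sample mean converges quickly to the true mean of 3.5.
    return sum(random.randint(1, 6) for _ in range(n)) / n

def pareto_share_of_max(n, alpha=1.1):
    # Toy stand-in for wilder uncertainty: a fat-tailed Pareto
    # distribution (alpha near 1), where the single largest draw
    # can account for a large fraction of the entire sample total.
    draws = [random.paretovariate(alpha) for _ in range(n)]
    return max(draws) / sum(draws)

print(dice_mean(10_000))            # close to 3.5
print(pareto_share_of_max(10_000))  # often a sizable fraction of the total
```

No amount of dice practice prepares you for the second regime: past averages there tell you little about what the next draw can do, which is roughly the trap the ludic fallacy names.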