Here’s one more good essay on Cognitive Biases Potentially Affecting Judgment of Global Risks. More black swans, basically.

Heuristics give us quick estimates of probability, especially for simple decisions. They are a form of inductive reasoning, so they are inexact and prone to error. Cognitive biases lead us to trust these errors and fallacies.

These errors cause us to underestimate some risks and overestimate others. Misleading Vividness makes us think that homicides and accidents kill as many people as disease, even though disease is far deadlier.

Absence of evidence is evidence of absence.

In probability theory, absence of evidence is always evidence of absence. If E is a binary event and P(H|E) > P(H) (“seeing E increases the probability of H”), then P(H|~E) < P(H) (“failure to observe E decreases the probability of H”). P(H) is a weighted mix of P(H|E) and P(H|~E), and necessarily lies between the two.

Think of probability as a spectrum from 0% to 100%. Each time you look for evidence of a thing and find it, your estimate moves toward 100%. Each time you look and fail to find it, your estimate moves toward 0%. In the absence of evidence, certainty drains away.
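The weighted-mix argument above can be checked numerically. This is a minimal sketch with made-up numbers: I assume some P(E) and two conditional probabilities chosen so that seeing E supports H, then verify that P(H) falls strictly between the two conditionals.

```python
# Numerical check of "absence of evidence is evidence of absence":
# if P(H|E) > P(H), then P(H|~E) < P(H), because P(H) is a
# weighted average of the two conditionals.

p_e = 0.3              # P(E): chance of observing the evidence (assumed)
p_h_given_e = 0.9      # P(H|E): chosen so that seeing E supports H
p_h_given_not_e = 0.4  # P(H|~E)

# Law of total probability: P(H) = P(H|E)P(E) + P(H|~E)P(~E)
p_h = p_h_given_e * p_e + p_h_given_not_e * (1 - p_e)

print(round(p_h, 2))  # 0.55, strictly between 0.4 and 0.9
assert p_h_given_not_e < p_h < p_h_given_e
```

Since P(H) is an average of the two conditionals, pulling one above P(H) necessarily pushes the other below it, whatever numbers you pick.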

Here are two more great essays from Eliezer Yudkowsky. First, the Twelve Virtues of Rationality. It’s like the Zen of anti-mysticism.

The second is The Simple Truth. This one is great. Remember, nuance is stupid, boring, and wrong.

Active learning is superior to passive learning. Primates, including humans, learn and remember best by actively doing tasks. Rhesus monkeys, for instance, learn far more through active participation than through passive instruction.

Even worse, our educational system prizes “verbal” passive instruction. This emphasizes feelings instead of abstract evidence-based reasoning. It sets children up to be failures in life when they realize that verbal skills are utterly worthless in society.

Superstitious behavior occurs in animals too. They are fooled by randomness and believe a correlated action caused an event.

One classic experiment was B. F. Skinner’s Superstition in the Pigeon.

Skinner put half-starved pigeons in a cage and fed them at regular intervals. The pigeons came to believe that certain actions caused the food to appear and began repeating those actions over and over.
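The dynamic behind Skinner’s result can be sketched as a toy simulation. This is a hypothetical model, not Skinner’s actual protocol: food arrives on a fixed schedule no matter what the bird does, but whatever action happens to precede the food gets reinforced, so one arbitrary action comes to dominate.

```python
import random

random.seed(0)

# Hypothetical repertoire of pigeon behaviors.
actions = ["turn", "head-bob", "peck-corner", "wing-flap"]
weights = {a: 1.0 for a in actions}  # propensity to perform each action

# Food arrives every 5th tick, independent of behavior. Each time food
# arrives, the bird reinforces whatever it happened to be doing.
for tick in range(1, 2001):
    action = random.choices(actions, weights=[weights[a] for a in actions])[0]
    if tick % 5 == 0:
        weights[action] *= 1.1  # reinforcing a causally irrelevant action

# One action ends up far more likely than the rest, purely by chance.
print(sorted(weights.items(), key=lambda kv: -kv[1]))
```

Because reinforced actions are performed more often, they are more likely to be reinforced again: a rich-get-richer loop that manufactures a “ritual” out of noise.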

You turn on the nightly news. You see a burning car in a foreign country. Ambulances carry away bloody bodies while angry men scream at the camera. You change the channel. You hear a story of a beautiful blonde college girl gone missing on vacation. You change the channel again and listen to a story of a schoolbus accident in California.

If anyone makes an argument based on those examples, they are using the appeal to emotion fallacy. But what is the effect on you, the viewer? Even if you distrust these stories and believe they are sensational exaggerations of rare events, they still impact your impression of reality.

After seeing a vivid event, you believe that such events are more frequent than they really are. In fact, if I were a beautiful blonde college girl, I would be too terrified to go on vacation.

Nuance is supposed to be about subtle distinctions between ideas. It describes complex things rather than simple ones. And it is almost always wrong.

Practitioners of nuance presume that more subtle and complicated answers are superior to simple and concrete ones. In reality, nuance relies on logical fallacies that lead to wrong answers.

Speculating about others’ motives is a dangerous action. It leads us to wrong conclusions while ignoring the empirical consequences.

When someone disagrees with us on an important issue, we often assume they have an evil motivation. When someone agrees with us, they are kind and intelligent people.

Shankar Vedantam’s recent article “Disagree about Iraq? You’re Evil” shows how pervasive these beliefs are.

Overcoming Bias discusses Semantic Stopsigns.

Consider the First Cause Paradox. We know that time began with the Big Bang. So what came before that? The physical laws may have been different beforehand; if so, what were they? “God” is not an answer. It is a semantic stopsign, a point where inquiry stops.

When we see actions and consequences, we infer motivation and intent. We seek to explain why things happen by suggesting there was a reason behind events. We do this even when this makes no sense. We falsely assign negative motives to individuals who were not responsible for complicated events. We assign intent to natural events through religion. Worse yet, our biases guide us to assign greater value to speculative motives than concrete facts.

Humans evolved for fitness, not intelligence. We’re built to make If/Then judgments to survive and reproduce. For this reason we are fooled by randomness. We see patterns and correlations where there are none.

There are two methods of viewing actions. We can judge the motivations of the actions or the consequences of the actions.

Consequences produce empirical evidence which can be independently analyzed and judged. We can only speculate about another person’s motivation.

Attacking someone’s motivations is a classic political attack, but it is usually an ad hominem fallacy. Why do we place more value on a person’s motivations than on the consequences of their actions?

Hindsight bias distorts our view of the world. Everything is obvious and expected in retrospect, but no one has the ability to foresee such results. This cognitive bias limits our appreciation for real science.

Myers’s Exploring Social Psychology challenges this bias and shows us how confusing reality may be. (via Overcoming Bias)

Penelope Trunk argues that it does not matter that journalists misquote people and get basic facts wrong. Truth is only a narrative, and journalists are telling their own stories.

The problem? Journalism should be a science, not an art. Art is the artificial creation of a mind; journalism should seek to discover external reality. Good journalism has no narrative. It collects facts so historians and social scientists can see patterns and formulate scientific theories. Artistic narratives obscure the truth.