Archive for May, 2007

ITConversations has an interview with Nassim Nicholas Taleb about his new book The Black Swan (which I am sadly too swamped to read right now, but I will post about it when I have). Taleb discusses how rare and unforeseen phenomena (which he calls “Black Swans”) shape much of the world in which we live.

ITConversations also has a presentation Taleb gave at PopTech 2005 which is well worth a listen.


Bruce Schneier started out as a cryptography expert, but he came to realize that most security problems are caused by everything but the military-grade encryption technology he studied. Now he writes about real-world security, and much of this concerns how people overreact to rare security events and underreact to common ones:

I tell people that if it’s in the news, don’t worry about it. The very definition of “news” is “something that hardly ever happens.” It’s when something isn’t in the news, when it’s so common that it’s no longer news — car crashes, domestic violence — that you should start worrying.

Schneier has a post about why society feels obliged to do something in the wake of a terrible tragedy, even when the net effect is to make the problem worse:

But that’s not the way we think. Psychologist Scott Plous said it well in The Psychology of Judgment and Decision Making: “In very general terms: (1) The more available an event is, the more frequent or probable it will seem; (2) the more vivid a piece of information is, the more easily recalled and convincing it will be; and (3) the more salient something is, the more likely it will be to appear causal.”

So, when faced with a very available and highly vivid event like 9/11 or the Virginia Tech shootings, we overreact. And when faced with all the salient related events, we assume causality. We pass the Patriot Act. We think if we give guns out to students, or maybe make it harder for students to get guns, we’ll have solved the problem. We don’t let our children go to playgrounds unsupervised. We stay out of the ocean because we read about a shark attack somewhere.

The availability heuristic is a common cognitive bias in which people overestimate the likelihood of an event if they have previously been asked to visualize it:

In one experiment that occurred before the 1976 US Presidential election, participants were asked simply to imagine Gerald Ford winning the upcoming election. Those who were asked to do this subsequently viewed Ford as being significantly more likely to win the upcoming election, and vice versa for participants that had been asked to imagine Jimmy Carter [Carroll, 1978]. Analogous results were found with vivid versus pallid descriptions of outcomes in other experiments.

This problem seems to be becoming more pronounced in Western society because we no longer need to actively imagine the event. The rise of sensationalistic 24-hour news networks that provide saturation coverage of the most unusual tragedies is priming the pump for the availability heuristic in their viewers, and this leads to a public outcry that politicians feel obliged to act upon.
