Archive for the ‘data mining’ Category

An interesting article on accidents in the aviation and medical industries where people notice deadly problems but fail to convey their seriousness to other team members, whether out of politeness or deference to a more senior teammate. For example:

Korean Air Flight 801, almost the same exact situation as Air Florida. In trying to warn the captain of severe weather problems that would eventually lead to the deaths of 228 of the 254 people on board, the first officer says, “Don’t you think it rains more? In this area, here?” and “Captain, the weather radar has helped us a lot”…

Captain, the weather radar has helped us a lot?! What are these people doing? They’re hinting at the impending problem, in hopes that the guy who’s a little busy with the whole “flying an airplane” or “trying to bring 99 planes circling Kennedy airport in for a landing” thing is going to catch on, read their mind, and solve the problem for themselves.

The official term for this is “mitigated speech,” and Malcolm Gladwell provides a fascinating account of how it has affected the airline industry in his book Outliers. He defines it as “any attempt to downplay or sugarcoat the meaning of what is being said” and explains that “we mitigate when we’re being polite, or when we’re ashamed or embarrassed, or when we’re being deferential to authority.”

Read Full Post »

Airline pilot Patrick Smith has a great post on the pointlessness of the current wave of airline security. It seems every time I fly there is some new hoop to jump through, all in the name of safer travel. But does any of it really make air travel any safer, or is it just designed to make people feel safer?

How we got to this point is an interesting study in reactionary politics, fear-mongering and a disconcerting willingness of the American public to accept almost anything in the name of “security.” Conned and frightened, our nation demands not actual security, but security spectacle. And although a reasonable percentage of passengers, along with most security experts, would concur such theater serves no useful purpose, there has been surprisingly little outrage. In that regard, maybe we’ve gotten exactly the system we deserve.

Unfortunately, much of the legislation since 9/11 has been driven by fear and the need to be seen doing something about the problem, to the point that efficacy has not really been the top priority. It is easy to say that it is mostly harmless and makes people feel safer, but security is always a set of tradeoffs. Resources spent checking passenger shoes have to be weighed against the opportunity cost of whatever else those resources could have been used for. I’m still somewhat amazed that there has been no visible effort to adopt the kind of behavioral passenger screening that is common in Israeli airports. To be fair, the immediate addition of locking cockpit doors was a simple, effective, common-sense move that would most likely prevent a similar event from occurring in the future.

Nassim Nicholas Taleb has written about how this type of legislation comes to pass; there is almost no incentive to pass effective legislation, but every incentive to pass reactionary legislation:

Assume that a legislator with courage, influence, intellect, vision, and perseverance manages to enact a law that goes into universal effect and employment on September 10, 2001; it imposes the continuously locked bulletproof doors in every cockpit (at high costs to the struggling airlines)—just in case terrorists decide to use planes to attack the World Trade Center in New York City. I know this is lunacy, but it is just a thought experiment (I am aware that there may be no such thing as a legislator with intellect, courage, vision, and perseverance; this is the point of the thought experiment). The legislation is not a popular measure among the airline personnel, as it complicates their lives. But it would certainly have prevented 9/11.

The person who imposed locks on cockpit doors gets no statues in public squares, not so much as a quick mention of his contribution in his obituary. “Joe Smith, who helped avoid the disaster of 9/11, died of complications of liver disease.” Seeing how superfluous his measure was, and how it squandered resources, the public, with great help from airline pilots, might well boot him out of office. Vox clamantis in deserto. He will retire depressed, with a great sense of failure. He will die with the impression of having done nothing useful. I wish I could go attend his funeral, but, reader, I can’t find him. And yet, recognition can be quite a pump. Believe me, even those who genuinely claim that they do not believe in recognition, and that they separate labor from the fruits of labor, actually get a serotonin kick from it. See how the silent hero is rewarded: even his own hormonal system will conspire to offer no reward.

Now consider again the events of 9/11. In their aftermath, who got the recognition? Those you saw in the media, on television performing heroic acts, and those whom you saw trying to give you the impression that they were performing heroic acts. The latter category includes someone like the New York Stock Exchange Chairman Richard Grasso, who “saved the stock exchange” and received a huge bonus for his contribution (the equivalent of several thousand average salaries). All he had to do was be there to ring the opening bell on television—the television that is the carrier of unfairness and a major cause of Black Swan blindness. Everybody knows that you need more prevention than treatment, but few reward acts of prevention.

Bruce Schneier has done a great job in pointing out a lot of the security theatre that has occurred in the past few years and has even interviewed the head of the TSA on the subject:

BS: This feels so much like “cover your ass” security: you’re screening our shoes because everyone knows Richard Reid hid explosives in them, and you’ll be raked over the coals if that particular plot ever happens again. But there are literally thousands of possible plots.

So when does it end? The terrorists invented a particular tactic, and you’re defending against it. But you’re playing a game you can’t win. You ban guns and bombs, so the terrorists use box cutters. You ban small blades and knitting needles, and they hide explosives in their shoes. You screen shoes, so they invent a liquid explosive. You restrict liquids, and they’re going to do something else. The terrorists are going to look at what you’re confiscating, and they’re going to design a plot to bypass your security.

That’s the real lesson of the liquid bombers. Assuming you’re right and the explosive was real, it was an explosive that none of the security measures at the time would have detected. So why play this slow game of whittling down what people can bring onto airplanes? When do you say: “Enough. It’s not about the details of the tactic; it’s about the broad threat”?

KH: In late 2005, I made a big deal about focusing on Improvised Explosive Devices (IEDs) and not chasing all the things that could be used as weapons. Until the liquids plot this summer, we were defending our decision to let scissors and small tools back on planes and trying to add layers like behavior detection and document checking, so it is ironic that you ask this question—I am in vehement agreement with your premise. We’d rather focus on things that can do catastrophic harm (bombs!) and add layers to get people with hostile intent to highlight themselves. We have a responsibility, though, to address known continued active attack methods like shoes and liquids and, unfortunately, have to use our somewhat clunky process for now.

Read Full Post »

Bruce Schneier started out as a cryptography expert, but he came to realize that most security problems are caused by everything except the military-grade encryption technology he studied. Now he writes about real-world security, and much of that writing is about how people overreact to rare security threats and underreact to common ones:

I tell people that if it’s in the news, don’t worry about it. The very definition of “news” is “something that hardly ever happens.” It’s when something isn’t in the news, when it’s so common that it’s no longer news — car crashes, domestic violence — that you should start worrying.

Schneier has a post about why society feels obliged to do something in the wake of a terrible tragedy, even if the net effect is to make the problem worse:

But that’s not the way we think. Psychologist Scott Plous said it well in The Psychology of Judgment and Decision Making: “In very general terms: (1) The more available an event is, the more frequent or probable it will seem; (2) the more vivid a piece of information is, the more easily recalled and convincing it will be; and (3) the more salient something is, the more likely it will be to appear causal.”

So, when faced with a very available and highly vivid event like 9/11 or the Virginia Tech shootings, we overreact. And when faced with all the salient related events, we assume causality. We pass the Patriot Act. We think if we give guns out to students, or maybe make it harder for students to get guns, we’ll have solved the problem. We don’t let our children go to playgrounds unsupervised. We stay out of the ocean because we read about a shark attack somewhere.

The availability heuristic is a common cognitive bias in which people overestimate the likelihood of an event that comes easily to mind, for example because they have previously been asked to visualize it:

In one experiment that occurred before the 1976 US Presidential election, participants were asked simply to imagine Gerald Ford winning the upcoming election. Those who were asked to do this subsequently viewed Ford as being significantly more likely to win the upcoming election, and vice versa for participants that had been asked to imagine Jimmy Carter [Carroll, 1978]. Analogous results were found with vivid versus pallid descriptions of outcomes in other experiments.

This problem seems to be becoming more pronounced in Western society because we no longer need to actively imagine the event. The rise of sensationalistic 24-hour news networks that provide saturation coverage of the most unusual tragedies primes the pump for the availability heuristic in their viewers, and this leads to a public outcry that politicians feel obliged to act on.

Read Full Post »