A Day That Does Not Live in Infamy
Monday, December 7, 2009

[Note: Since today is Pearl Harbor Day I am posting this Monday instead of Tuesday. I’m also interrupting the “Health Care – Something’s Missing” sequence. I plan to return to that series on Friday.]
The fighter planes, dive bombers, and torpedo planes dropped out of the clouds over the 2800-foot Koolau Range. Surprise was complete. Strafing fighters knocked out the U.S. planes before they could get off the ground. Bombers and torpedo planes sank every warship in Pearl Harbor. Then the attackers returned to their carriers and escaped.
You almost certainly never heard of that attack. No, it did not happen on December 7, 1941. It took place on February 7, 1932. The attackers were under the command of U.S. Admiral H. E. Yarnell, and of course the bombs and other weapons were simulated, part of a war game.* That day does not live in infamy – but it should.
The sad fact is that Yarnell’s operation offered some lessons that our naval authorities missed. The Japanese attack in 1941 was a near duplicate of that simulation. Both came on a Sunday when defenses were down, both used the same pattern of attack, and both even used clouds over the same mountain range to conceal their approach. However, the Japanese bombs and torpedoes were real, and in 1941 the ships were sunk in reality instead of in simulation.
What was the reaction of the admiralty? Some officers saw the lesson and wanted to incorporate it into naval operations. They were overruled. The navy remained organized around battleships and cruisers, with aircraft carriers as a stepchild. Worse, decision-makers ignored the possibility that the Japanese might copy Yarnell’s plan.
The Japanese did not ignore it. Their spies were watching and reports quickly made their way to Tokyo. Those reports may have played a part in Yamamoto’s planning (though there were other sources he may also have used). What is clear is that U.S. involvement in World War II would have started very differently had our authorities learned from Yarnell’s operation and taken measures to defend against such an attack in a real war. The total lack of preparation for air attack made it easy for the Japanese.
That day in 1932 should live in infamy because of the U.S. refusal to learn the obvious lessons. Furthermore, it should live in infamy as a reminder to each of us that we are subject to similar blind spots. That is part of being human.
The fact is that we do not see the world as it is; the world is much too complicated for that. Our minds filter what we perceive so that we see only what seems important to us. A group of Harvard psychologists demonstrated this by showing a video of basketball players passing the ball. They asked people to count the number of passes made. During the video either a woman with an umbrella or a man in a gorilla costume would walk through the action. Only half noticed the gorilla, and only 65% noticed the woman. Those distractions were not necessary to the task, so many people filtered them out.
No, we don’t see the world as it is; we see a model of that world. Our mind creates that model by paying selective attention to what we regard as important. That is the only way we can make sense of this world. So it has always been, and so it will always be unless we somehow become omniscient. The difference between success and failure is not whose model contains the greatest amount of information. It is not even necessarily whose model is most accurate overall. No, that difference is whose model is most accurate in the characteristics relevant to the issue at hand.
Air power was not an important part of the model held by U.S. Navy decision-makers, so they filtered out Yarnell’s success. Their model of warfare had been effective during World War I, but the world had changed. Following the standard procedure of using the time between wars to learn how to fight the last war better, U.S. commanders ignored important information. Failure to incorporate that information into their model led to disaster.
So what does a failure from 77 years ago have to do with us today? The answer is that human nature has not changed. We still see only our own model of the world, and human nature still militates against changing that model. Each of us has a model of the world that is correct in some regards, incorrect in others, and simply silent about still other parts of the world. This affects how we live our individual, family, and work lives. It also affects how we vote and how those we elect govern us. Parents, employees, managers, politicians: all see their own model of the world, not the real world itself. Those models are all imperfect. The effectiveness of their decisions depends on how those imperfections fit with the important aspects of those decisions. If the errors in the model matter to their decisions, they will have no choice but to make bad decisions.
Can we overcome this problem? Not completely, but we can do better. Laurence Gonzales’s book, Deep Survival, points out that survivors are people who are willing to recognize the imperfections in their model of the world and to change that model to fit new information. Those who refuse to do this may get by, but they are setting themselves up for disaster when their model does not match important aspects of reality. Thomas Sowell makes a similar point about political life in his book, The Vision of the Anointed. He points out that many with the “unconstrained vision” simply refuse to admit that they might be wrong. The problem is not that their model is faulty; that happens to everybody. Their problem is that they do not adapt their model of the world to the available information. Like the navy brass after 1932, they continue down the path to disaster.
What can we do about all this? Perfection is not available to humans, so we have to do the best we can. That means recognizing that our models of the world are imperfect and always will be. However, it also means continually improving those models by seeking and accepting new information. We can also insist that politicians do the same. Only in that way can we improve how we see the world, with consequent improvement in our decisions and our lives.
*My source is Edwin Muller’s article, “The Inside Story of Pearl Harbor,” Reader’s Digest, April 1944, reprinted in Secrets & Spies, Reader’s Digest Association, 1964. Several shorter but more readily available accounts can be found by a web search for “1932 Pearl Harbor Attack.”
If you like my blog, please tell others.
If you don’t like it, please tell me.