Strategically Planned Errors
July 25, 2011
Joseph Hallinan's Why We Make Mistakes (New York: Broadway Books, 2009) is a delightfully quick read that outlines the various reasons why human beings make mistakes. I wish I had read this book when I was in school. I might have learned about the fallacy of sticking with the first answer chosen on exams (people who change their answers generally improve their scores). Just imagine the ACT scores I could have had! It's hard not to feel humbled when one sees oneself falling prey to the various traps Hallinan describes.
Academics often forgo humility in their pursuit and creation of the strategic plan. Taking anywhere from three months to a year to produce, the strategic plan is supposedly the best forward thinking of a college or university's leadership. And yet I suspect many of these plans are created without recognizing the mistakes being made along the way. And those mistakes can so easily add up to complete disaster.
For instance, Hallinan describes a phenomenon that has always driven me crazy, especially about Americans. As a general rule, "we all think we are above average" (Chapter 10). (Golf is the only place where Americans are honest about their abilities.) When Hallinan says that "corporate executives routinely display overconfidence in their judgments about the thing they know best: their businesses" (165), he might as well be talking about many an academic executive. There's a reason the executive is paid so much money; he's valuable because he is an expert. And he believes that myth.
So, the strategic plan starts with this overconfidence factor. Add to that the five-year projection at the heart of almost every strategic plan. In academia, five years can seem a long way off. However, as Hallinan notes, "when the consequences of our decisions are far off, we are prone to take bigger gambles; but when consequences are more immediate, we often become more conservative" (98). It's easy to predict significant enrollment growth or an improved national ranking when you are talking half a decade away, even if all the information from the last year indicates otherwise.
A third element of the strategic planning process is reviewing the past to determine the future of the institution. A nice idea, but "one of the most significant sources of human error is hindsight bias . . . knowing how things turned out profoundly influences the way we perceive and remember past events" (64). This hindsight can lead to overvaluing successes and over-criticizing failures. We added a program in education and got 100 new students. If we add a second program in education, we can get even more. Uh, not necessarily.
Then, once we have the strategic plan completed, we roll it out, announcing it, reviewing it, summarizing it over and over to different "constituencies." The mere act of that repetitious presentation will likely lead to more mistakes. First, literally, is what's first. "The key to anchoring is the first number. People tend to process information in the order in which it is presented. And the best place to be in that order is first" (105). If you've ever been part of a strategic plan discussion, or frankly any discussion of a topic with multiple sections, then you know how item one gets lots of discussion, while item eight gets little. Let's hope item eight isn't seriously flawed, as no one is going to catch it.
Second, people eventually pay less attention, most importantly the executive team (even a sincere and transparent one) that led the development of the plan: "As something becomes familiar, we tend to notice less, not more. We see things not as they are but as (we assume) they ought to be" (113). With each presentation, familiarity potentially breeds disinterest, and obvious errors become all the easier to overlook. There's a reason every composition teacher worth his or her salt preaches that students should set papers aside for several days before editing or revising them.
Eventually, some part of the strategic plan may fail. Despite all of the factors above that could have led to a flawed plan, we assume the errors were made at the bottom of the food chain. As Hallinan says, "identifying the source of an error also requires knowing where and how to look. After something goes wrong, we tend to look down—that is, we look for the last person involved in the chain of events and blame him or her for the outcome" (191). We didn't make the enrollment goals we anticipated! Well, then those recruiters must be doing something wrong.
Ultimately, it is in the talking about strategic planning that we commit a core mistake. Hallinan, paraphrasing scholar Barbara Tversky, argues that "the purpose of conversation isn't to convey the truth--it's to create an impression" (131). The examples are about storytelling, but they fit the general nature of dialogue, especially dialogue about strategy. Most subsequent discussions revolve around impressions of the plan, not really the plan itself.
So, the errors pile up, and we delude ourselves about the accuracy of our effort and process. As Hallinan says very profoundly, "in effect, we really do come to believe our own untruths" (132). Amen.