KNOWING WHEN AND HOW TO STOP

© 1998 John Perkins

Introduction

Excitation or inhibition, the fundamental states of our nervous system's cells (Pavlov, 1966: 354), also describes in a gross binary way the decisions we face singly, in pairs, in groups, as a nation, and as a species. Poetically, Hamlet ponders his fate in his "to be or not to be" soliloquy; more prosaically, we do the same: does the team stay in town, or leave? Do I stay in this relationship for the indefinite future, or do I get out now?

Scholars who study decision making like to present the process as one of stages (Poole and Hirokawa, 1986; Janis and Mann, 1977). Each stage typically has some tasks which need to be completed before the decision-making group moves to the next stage. After leaving a stage, people will not return to it as a matter of routine, but will make incremental adjustments to their original decision as experience accrues (Janis and Mann, 1977: 75). Experience and research confirm that decisions must be made in an imperfect world; that is, the deciders will not have all of the information, nor will they be able to perfectly predict the behavior of others; sometimes they cannot even predict their own behavior (Staw, 1989; Of Auctions, 1989).

Knowing when, and how, to stop has been the shadow side of decision making. I personally began to understand this during my time as an anti-war activist in the 1980s. As I watched government policy as described publicly in the media, I understood that the governments which had nuclear weapons had no serious criteria for stopping-and thus no intention of stopping.

I remember discussing this point with a fellow activist. She commented that when she attended public discussions on nuclear power plants, she would always ask what the utility's plans were if it had to close the plant. The utilities had started something without thinking through what they would do if they had to end it. They never had a plan, and seemed confused by the question.

The third source of my reflections on stopping comes from my work with elementary students in the South Bronx on issues of decision making and self-esteem. For one of our exercises, I would ask them to describe "something you used to say 'yes' to that you now say 'no' to, and something you used to say 'no' to that you now say 'yes' to." We would role-play the switch and have fun with people resisting being persuaded.

I then turned to the serious problem of their upcoming adolescence. By a show of hands, all of them felt confident they would not take illegal drugs. But the reality was that many of them would. A one-time experiment might, or might not, be anything to get too upset about. "But," I asked, "do you have what it takes to say 'yes' once, then 'no' for the rest of your life? It's not the first time that makes you an addict-it's the fifth, the thirteenth, the sixty-fifth time. And remember, when you try it, it might give you an experience of sights, sounds, colors, and moods you've never had before, and could never have without the use of illegal drugs. Can you then say 'no' to that drug?" I reminded them that the next six to eight years of their lives would probably be the hardest, and that the decisions they made as fifth and sixth graders about what they would do, or not do, would affect those years, and the rest of their lives. I said that once they reached junior high school, life and choices would come at them fast and furiously. I ended by telling them we live in a culture which is great at encouraging them to start things, but nearly silent about how to stop or prevent things.

With these personal experiences alerting me, I included this study as a unit in my program, focusing at the organizational level. Knowing how and when to stop has basically two discontinuous moments of truth: deciding to stop before going public and deciding to stop once the decision is public. Closing what Meyer and Zucker (1989) call permanently failing organizations-deciding to end projects or institutions which once experienced success-becomes a case of public decision making.

In the case of decisions which some of the deciders themselves will later view as patent folly, the best policy appears to be to stop early and privately. But deciders make decisions with incomplete information and only guesses as to how others might behave. Section One, Folly on the March, will take up this problem in greater detail.

Publicity about a decision, or publicly implementing one, increases the number of participants and observers, and thus the difficulty of stopping increases substantially. The decision becomes rigid and resistant to change (Schulman, 1980: 43). This is because stopping "in the middle" or "before the end" brings added difficulties: how to manage one's own "face" or public reputation (Bohannan, 1993; Ting-Toomey, 1994); how to persuade involved others that stopping what you once supported makes more sense now; and how to deal with incurred psychological, symbolic, material, and social costs. This complex area of decision making will be considered in Section Two: In Too Far to Quit?

In the last section of the paper, Can Folly be Prevented?, I will propose some tentative recommendations and reflect on my experience of researching this paper.

Section One: Folly on the March

When Barbara Tuchman looked at the March of Folly (1984) through history, she established some criteria for what she would consider folly. Her three criteria can help distinguish between foolishness and an honest mistake. To fit her definition of folly, a course of action:
(1) must have been perceived as counter-productive in its own time, not by hindsight;

(2) a feasible alternative course of action must have been available; and

(3) the policy in question must be that of a group, not of an individual ruler, and should persist beyond any one political lifetime (p. 5).

Taking up Tuchman's third criterion, Irving Janis described in 1972 many of the difficulties posed for the "group" in his ground-breaking book, Groupthink. The groupthink hypothesis asserts that "the existence of certain antecedent conditions within groups of decision makers result in defective decision-making processes, which in turn are linked to poor policy outcomes" (Schafer and Crichlow, 1996: 415). Schafer and Crichlow, in their quantitative review of this theory, simplified Janis's groupthink model to three stages: antecedent conditions, defective decision-making processes, and poor policy outcomes.

From an exhaustive review of Janis's works, Schafer and Crichlow developed a refined list of operational definitions for ten antecedent conditions:

1. Group insulation: Decision makers isolate themselves from others not in the immediate decision making circle.

2. Lack of tradition of impartial leadership: The president has not had a history of conducting impartial decision-making processes, which thereby limits open discussion of a wide range of alternatives.

3. Lack of tradition of methodical procedures: The president has not established a tradition of using methodical procedures in the decision-making process in terms of information search, routine and systematic decision-making meetings, and analysis of pros and cons.

4. Group homogeneity: A lack of disparity exists in the social background and ideology of the members of the decision-making group.

5. Perceived short time constraint: The group suffers under perceived temporal limits that affect its ability to consider policy options fully.

6. Low self-esteem caused by recent failure: Recent political or military defeat weighs on the minds of members of the decision-making group and affects current decisions.

7. High personal stress: The crisis being dealt with causes great anxiety, whether because of the stakes involved and the perceived chances of success or because of unpleasant policy options.

8. Overestimation of the group: The group operates under an illusion of invulnerability or a belief in its inherent morality.

9. Closed-mindedness: The group relies on collective rationalizations, stereotypes of the out-group, or guiding metaphors or analogies.

10. Pressure towards uniformity: One or more of the following exists: self-censorship, an illusion of unanimity, direct pressure on dissenters, self-appointed mind guards (pp. 418-419).

Schafer and Crichlow investigated the same 19 Cold War crises that Janis and his colleagues had used to test the groupthink hypothesis in 1987 (Herek et al., 1987). They used bivariate analysis to correlate each of the ten conditions with the level of information-processing errors. Four of the ten antecedent conditions proved statistically significant and in the predicted direction: lack of tradition of impartial leadership, lack of tradition of methodical procedures, overestimation of the group, and closed-mindedness. A fifth condition, pressure towards uniformity, came very close to significance, accounted for fifteen percent of the variance on its own, and was in the expected direction as well (p. 423). When these five variables were lumped together, Schafer and Crichlow found almost a one-to-one correlation between faulty structural environments and the number of information-processing errors (p. 425).
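To make the method concrete, here is a minimal sketch, in Python, of the kind of bivariate test described above: rate each antecedent condition for each crisis, count the information-processing errors, and correlate the two. The ratings and error counts below are invented placeholders, not Schafer and Crichlow's actual codings, and the variable names are my own.

    # Hypothetical bivariate check of antecedent conditions against
    # information-processing errors across 19 crisis cases.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_cases = 19  # the 19 Cold War crises re-examined by Schafer and Crichlow

    conditions = [
        "group insulation", "lack of impartial-leadership tradition",
        "lack of methodical-procedures tradition", "group homogeneity",
        "perceived short time constraint", "low self-esteem from recent failure",
        "high personal stress", "overestimation of the group",
        "closed-mindedness", "pressure towards uniformity",
    ]

    # Placeholder ratings (1-5) of each condition in each case, and a
    # placeholder count of information-processing errors per case.
    ratings = rng.integers(1, 6, size=(n_cases, len(conditions)))
    errors = rng.integers(0, 8, size=n_cases)

    for name, column in zip(conditions, ratings.T):
        r, p = pearsonr(column, errors)  # bivariate correlation and p-value
        flag = "*" if p < 0.05 else " "
        print(f"{flag} {name:42s} r={r:+.2f}  p={p:.3f}  r^2={r*r:.2f}")

With real codings in place of the random placeholders, a condition such as closed-mindedness would show up, as it did for Schafer and Crichlow, as a significant positive correlation with the error count.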

None of the three situational variables-short time constraint, high personal stress, and recent failure-proved significant. In fact, short time constraint and high personal stress seemed to reduce information processing errors. In other words, some contextual conditions result in more vigilant decision making.

An Example of Groupwisdom


Interestingly, two of the four significant antecedents to groupthink contain the phrase "lack of tradition." The American Heritage Dictionary defines tradition as:
1. The passing down of elements of a culture from generation to generation, especially by oral communication.
2.a. A mode of thought or behavior followed by a people continuously from generation to generation; a custom or usage. b. A set of such customs and usages viewed as a coherent body of precedents influencing the present.

This teases out an idea: not only must the group learn to remain humble (to overcome overestimation) and learn to tolerate ambiguity (to overcome closed-mindedness), it must continuously examine and deepen its own culture of decision making, authority, membership, argument, etc. In other words, in groupthink mode the group lacks a tradition of inquiry into its own modes of behaving in normal and stressful situations.

Campbell and Nash in A Sense of Mission (1992) describe how some improper payments by one of Johnson & Johnson's overseas operations prompted its president, James Burke, to launch a series of Credo Challenge Meetings. Collins and Porras (1994) also cite Johnson & Johnson as one of their "visionary companies" because of the impact these meetings had on the decision-making processes in the company.

In 1975, Johnson & Johnson had annual sales of about $12 billion and had developed a management culture resistant to centralization (Campbell and Nash: 138). General Robert Wood Johnson had published The Johnson & Johnson Credo in 1945 for the enlightenment of other business people. The entire document fits easily on a single page. It lists four valued relationships, each given a full paragraph which defines the relationship and establishes Johnson & Johnson's obligations to it. Table 1 (below) summarizes the 1975 revised Credo.1

After much debate, including reservations from the chairman and CEO, who felt the Credo should be above "challenge," President Burke decided to hold company-wide discussions about the value of the Credo in 1975. The Credo was then thirty years old. With the Credo Challenge Meetings Burke began a "long-range program to challenge every manager to make an informed commitment to the Credo's way of doing business" (p. 143). In other words, he began what might turn into a new tradition. Indeed, Credo Challenge Meetings are still held for new managers (p. 146).

Since 1975, over 1200 managers have participated in Credo Challenge Meetings, gathering for two days in groups of 25. These meetings made a point of always discussing how the Credo should be implemented in the management of the Johnson & Johnson company. Interestingly, during these discussions, Burke's boss, the chairman/CEO, sometimes had to excuse himself because he "breathed too heavily on the process" (p. 145). In the language of antecedent conditions to groupthink, he could not maintain leadership impartiality.

Burke called the results "A turn on. A genuine happening" (p. 145). He learned that managers felt that balancing all of their responsibilities required discussion because of the difficulty of the task, and that most managers felt an intense commitment to preserving the Credo. Another result of the Credo Challenge Meetings is that managers facing a decision pause to reflect upon what the Credo would say to do, and discussions based on the Credo have become habitual (p. 148). In essence, Johnson & Johnson added a step which precedes Janis's antecedents of decision making: continually discuss organizational values and seek personal commitments to them. Schafer and Crichlow's diagram can be modified to show how Groupwisdom might operate.

Table 1. Elements in Johnson & Johnson's 1975 Revised Credo

Customers (doctors, nurses, patients, etc.): We must strive to reduce our costs in order to maintain reasonable prices. Customers' orders must be serviced promptly and accurately.

Suppliers: Must have an opportunity to make a fair profit.

Employees: We must respect their dignity and recognize their merit. They must have a sense of security in their jobs. Compensation must be fair and adequate, and working conditions clean, orderly, and safe. Employees must feel free to make suggestions and complaints. We must provide competent management, and their actions must be just and ethical.

Communities where people live and work: We must be good citizens-support good works and charities and bear our fair share of taxes. We must maintain in good order the property we are privileged to use, protecting the environment and natural resources.

Stockholders: Business must make a sound profit. We must experiment with new ideas. Research must be carried on, innovative programs developed, and mistakes paid for. When we operate according to these principles, the stockholders should realize a fair return.

(adapted from Campbell and Nash, p. 141)


Why discuss this in a paper about failures of group decision making? Because studying only failures may not point to where success lies. The Credo discussions prepared Johnson & Johnson to succeed dramatically in making value-based decisions under great time pressure and uninvited public scrutiny.

In 1982, Johnson & Johnson's Tylenol brand pain-killer held 37 percent of the domestic market. It led the market. Ironically, it had been launched the same year that the Credo Challenge Meetings began. On September 30, 1982, seven people died in Chicago from taking cyanide-tainted Tylenol. Shortly after the news reached Johnson & Johnson, they recalled 93,000 bottles and sent almost half a million messages to doctors, hospitals, and retailers warning them about the tainted capsules. Within a week they would recall or replace 22 million capsules. On the day after the poisonings, Johnson & Johnson canceled all advertising for the brand, except for one ad asking for the public's continued trust in Johnson & Johnson (pp. 151-152).

They set up a toll-free number-staffed by employees volunteering their time-for customers to get a refund without submitting proof of purchase. One weekend the toll-free number handled over a million calls, but many of those were good wishes from the public.

Everyone-the media, the government, and employees-watched how Johnson & Johnson would respond. At the crisis's conclusion, one employee said:

Tylenol was the tangible proof of what they said at the Credo Challenge Meetings. You came away saying, "My God! You're right: we really do believe this. It's for real. And we did the right thing."

Actually, the whole organization had spent seven years preparing itself to act on principle. It had prepared itself to respond quickly and congruently to bad news.2 With their masterful handling of the crisis, Johnson & Johnson returned Tylenol to the shelves with tamper-resistant packaging. Backed by a massive marketing effort, the brand regained 95 percent of its previous market share and once again led the field (p. 154).

Responding in a crisis is just one type of decision making a group is likely to face. Far more difficult to gauge, and to respond to successfully, are situations where the problem takes a long time to become salient. I turn to those types of situations next.

Section Two: In Too Far to Quit?

Great is the art of beginning,
but greater is the art of ending.
-Henry Wadsworth Longfellow

Anyone attempting to stop something once it becomes public faces a frustrating task. By public I mean the tangible proofs of a decision, whether begun or just proposed. This includes major projects such as the closing of factories, missions to land a person on the moon, and even setting a date for a wedding.
Decision making under public scrutiny has been described by different authors as escalating situations, permanently failing situations, and large-scale policy making. In escalating situations efforts to recoup losses simply increase the losses (Staw and Ross, 1989). In permanently failing organizations (Meyer and Zucker, 1989) what one group perceives as failure other groups interpret as acceptable, or even as successful. According to Schulman (1980), large-scale situations, whether successful or failing, create institutional and personal commitments which resist being changed.

Escalating Situations

Escalating situations often turn on the very first commitment to a course of action. In other words, once "go" gets the nod, decision makers don't look back. Note the care General George C. Marshall, as Secretary of State, took in 1948 to point out to Congress how difficult it would be to extricate the country from a military commitment to back the Chinese Nationalists: "It would involve this government in a continuing commitment from which it would practically be impossible to withdraw" (Neustadt and May, 1986: 248-249).

Unfortunately, "who lost China" became the shibboleth of the fifties, which may have prevented the bureaucracy from clearly pointing out the hazards of escalating in Vietnam. Still, George Ball did attempt to warn President Johnson of the dangers of escalating in Vietnam:
The decision you face now is crucial. Once large numbers of U.S. troops are committed to direct combat, they will begin to take heavy casualties in a war they are ill-equipped to fight in a noncooperative if not downright hostile countryside. Once we suffer large casualties, we will have started a well-nigh irreversible process. Our involvement will be so great that we cannot-without national humiliation-stop short of achieving our complete objectives. Of the two possibilities I think humiliation will be more likely than the achievement of our objectives-even after we have paid terrible costs (quoted by Staw and Ross: 1989: 216, quoting from The Pentagon Papers).
Staw and Ross's (1989) article "Understanding Behavior in Escalation Situations" begins with this quote from Ball. They describe escalating situations as "situations in which losses have resulted from an original course of action, but where there is the possibility of turning the situation around by investing further time, money, and/or effort." In our daily living we face decisions in escalating contexts, too: what to do with a stalled career, falling investments, or a faltering relationship (Staw and Ross, 1989: 216).
Some insights into escalating situations might come from looking at gambling. Though a gambler might play dozens of spins of the wheel at the roulette table in an evening, after the initial decision to play at all, subsequent bets are not genuine decisions, merely a continuation of that first commitment. Dostoevsky described this moment before the plunge in The Gambler:
I confess that my heart was pounding in my breast and that I didn't feel at all cool and detached; probably I had felt for a long time already that I would leave Roulettenberg a different man and that something was about to happen which would radically and irrevocably change my life. I felt that it was bound to happen (in Leonard, 1989: 39).
One can see in another Dostoevsky passage what Janis called overestimation:
I believe I had something like four thousand gulden in my hands within five minutes. That's when I should have quit. But a funny feeling came over me, some sort of desire to challenge Fate, an uncontrollable urge to stick my tongue out at it, to give it a flip on the nose (p. 38).
Note the "funny feeling" Dostoevsky's gambler gets. Here literature points to a facet of escalation some research misses: there exists something in the emotion-action-results process of persistent gambling which feeds a hunger beyond material success or failure. The gambler, confronted with losing at the wheel, dice, horses, cards, etc. confronts reality's mismatch to their intense feelings of confidence. He or she attempts to regain both the confident feeling of precognition and external success by increasing his or her betting. The gambler continues until all is lost.

From a review of research in the laboratory and the "real world," Staw and Ross have uncovered four classes of determinants-project, psychological, social, and organizational-for this persistence in losing situations (1989: 216). Each of these determinants possesses, naturally, both rational and meta-rational dimensions. Though the research into these determinants has been uneven, the existence of multiple determinants of commitment and persistence leads Staw and Ross to view escalation as multidetermined. Persistent pursuit of a failing course may be most likely in situations characterized by a series of small-impact variables, each seemingly insufficient to force a change-as, say, the loss of a few dollars on a particular bet out of an original stake of several hundred dollars. They add, "A slow and irregular decline may not only make a line of behavior difficult to extinguish (in the reinforcement theory sense), but may allow the forces for persistence to grow over time" (p. 219).

"Slow and irregular decline" can also be named permanent failure. In their book on Permanently Failing Organizations, Meyer and Zucker (1989) researched the literature on low performing organizations and presented four case studies:

The decline of the Los Angeles Herald Examiner, the only paper actually founded by William Randolph Hearst and once the flagship of the Hearst publishing empire-the paper endured a nine-year strike from 1967 to 1976, and the family remains divided on whether to sell it or to attempt a comeback;

A worker-owned meat-packing company whose workers picketed their own firm-it declined for 15 years under various organizational schemes before going bankrupt;

The attempt of the Los Angeles Archdiocese to sell off an under-enrolled Catholic boys' high school without anticipating the reactions of the staff or the Latino alumni, ninety percent of whom had gone on to college and successful careers-the Archdiocese reversed its decision after eighteen months of controversy and in the face of a well coordinated "Save Cathedral High" campaign; and

The divergent motivations of steel mill owners when contrasted with those of their employees and the communities in which the mills operated-workers and communities established new quasi-governmental institutions which could take over mills by exercising powers of eminent domain with the stated purpose of protecting the jobs, and communities, dependent on the mills (pp. 31-40).
These cases share some characteristics identified by Meyer and Zucker: "anticipated future benefits, multiple goals, and decline" (p. 80). Anticipated future benefits means that some people involved believe that continuing the organization, even in the face of disappointing performance or long-term decline, may still bring benefits in the future. Competing groups contest how to frame the benefits (and the decline) and to whom the benefits accrue, which leads to multiple goals. In the case of the closing steel mills, for example, stockholders viewed closing uncompetitive mills as a sound business decision, while communities viewed the same news as "bombshells" (p. 40). Naturally, given the divergent ways in which parties measure success, whether an organization is even "failing" and in a state of decline becomes a matter of opinion. For example, in the case of the high school, measured financially it had ceased to pay its way, but measured educationally it continued to be an outstanding success story. Who gets to name the measures of performance?

To the three common qualities identified by Meyer and Zucker, I humbly submit three more. First, none of the organizations had an explicit and public process for closing down operations. In the incorporation papers for organizations, the initial founders must describe what they will do should the organization have to dissolve. When an organization is one week old, this can be answered with a phrase; when it is several decades old, more is needed.

Second, three of the four cases had a wide range of people who advocated for a right to participate in the decision, even though they were not "owners." These people often recruited others of wider social or political influence to the cause of keeping the organization operating. Meyer and Zucker called these secondary interests "dependent actors" (p. 24) and offered this description of the role:
Workers, who receive solely wage or salary compensation; the community, which receives only the side benefits of firm operation in the form of employment, purchasing power, and access to goods and services provided by the firm; and the organizations using the firm's products as inputs, requiring the firm's services, or selling their products to the firm, are in a position of dependency and therefore have the motivation to support the maintenance of the firm apart from its efficiency and resulting performance. This motivation is relatively constant, hence it weakens the relation of performance to persistence (pp. 93-94).
Third, the people initially identifying the organization as being in decline failed to understand the emotional-symbolic connections others had with the organization. Organizations take on significant symbolic meaning for people whose lives they have previously touched, as well as for current dependent actors, as can be seen in the Herald Examiner and Catholic high school cases. Failing to understand this connection leaves those seeking to suspend operations baffled by the emotion and energy people put into keeping the organization "alive." They also misstep significantly right at the beginning of their effort to close by not informing dependent actors well enough in advance and by not making sure people can find a way to ritualistically say good-bye to-while somehow remaining emotionally in touch with-their "symbol" (Bridges, 1991: 31). They unwittingly step into the quagmire of changing behaviors and expectations about large-scale operations, to which I turn next.

Large-Scale Decisions

The political scientist Paul Schulman makes a case in Large-Scale Policy Making (1980) that large-scale projects need special handling, almost a new way of being conceptualized. The way forward, or back, may not be smooth and continuous. He notes that
large-scale policy pursuits are beset by organizational thresholds or "critical mass" points closely associated with both their initiation and subsequent development... The large-scale policy objective typically confronts psychological, technological, organizational, and administrative barriers over which it must "leap" discontinuously if it is to establish and sustain itself (p. 28).
And I will add that these discontinuous leaps reappear when a large-scale project needs to reverse itself and shut down operations. Essentially, large-scale projects generate formal and informal agreements among lots of people, who then participate in doing the work of the project. The Nobel Laureate economist Kenneth Arrow has neatly summarized the dilemma:
The problem is that agreements are typically harder to change than individual decisions. When you have committed not only yourself but many others to an enterprise, the difficulty of changing becomes considerable. If it is done on a conscious level, we have all the formalities involved in persuading others to change their minds. What may be hardest of all to change are unconscious agreements, agreements whose very purpose is lost in our minds (1974: 28).
The discontinuities exist externally as well as within individual decision makers-and within involved decision-making groups. A change, even a small one, begins to feel to the decision maker like-and may well be interpreted by others as-a personal error and failure, even though he or she decided and acted in good faith for the benefit of the organization. This results in overprotection and rigidity in decision-making processes by those with power, because even small changes come to mean admissions of error (Schulman, p. 50). Paradoxically, people identified with the decision come to prefer persisting in the error, sometimes at enormous cost, to admitting a mistake in judgment. Persisting in folly, as Barbara Tuchman points out, displays a certain lack of courage:
Persistence in error is the problem... there is always the freedom of choice to change or desist from a counter-productive course if the policy-maker has the moral courage to exercise it. He [or she] is not a fated creature blown by the whims of Homeric gods. Yet to recognize error, to cut losses, to alter course, is the most repugnant option in government (p. 383).

Section Three: Can Folly be Prevented?

Can folly be prevented? Maybe not, but its incidence might be reduced. Barbara Tuchman thinks that individual leaders need better and longer training, and that the public needs to peer into their character before allowing them to assume the reins (a pun with reigns) of office (p. 384). What can we do while waiting for this enlightened educational policy, and for the enlightened populace which will nurture it? Plenty, I think. I will call my recommendations anticipatory conditions.

The leading anticipatory condition must be the continuous practice of wide-ranging discussions of an organization's values and moral beliefs. This keeps decision makers conscious of how their actions might support or undermine reputed values. Clearly, the example provided by Johnson & Johnson shows the value of talking about morals. It is noteworthy that moral issues raised by Senator Fulbright and Arthur Schlesinger did not get discussed by Kennedy's advisors before the Bay of Pigs invasion in 1961 (Janis, 1972: 157).

Anticipatory conditions two through six are the active, positive rephrasings of the significant antecedents Schafer and Crichlow identified. These will be listed later.

One of the fascinating features of groupthink is how partial the decision makers' thinking becomes, perhaps as a way to cope with their anxiety by limiting the scope of what they will think about. Neustadt and May in Thinking in Time (1986) have identified this tendency to "ignore whatever seems not to fit and to define the problem as one calling for solutions they [politicians] have ready" (p. 235). Tuchman called this trait wooden-headedness, and commented that for Philip II of Spain "no experience of the failure of his policy could shake his belief in its essential excellence" (Tuchman: 7).

Neustadt and May provided several remedies for this malaise. My anticipatory condition number seven is Neustadt and May's suggestion that decision makers make time lines of their own organization, of their competitors' organizations, and of the current situation. They recommend leaving nothing of note out, and weaving into the time lines the details of personal and organizational history (Neustadt and May: 238).

To balance this literal mapping of the history of the people and issues before the group, they suggest-and I take as my eighth anticipatory condition-making deliberate use of analogies: that is, making a worksheet which lists possible historical analogies to what is perceived as the current challenge, and then noting where each analogy has likenesses to-and also differences from-the current problem (p. 273). A spontaneous example of this from Robert Kennedy helped avert a pre-emptive strike during the Cuban Missile Crisis: he called a surprise attack on Cuba a "Pearl Harbor in reverse" (emphasis added, Janis, 1972: 157).

A ninth anticipatory condition will also be taken from Neustadt and May: decision makers need to think through their assumptions about cause and effect. For example, when deliberating on whether to approve the Bay of Pigs invasion, Kennedy or his aides needed only to have written on a yellow pad the completion to this sentence: "For the objective of bringing Castro down, a landing at the Bay of Pigs is the best option because..." Even if they had sidestepped basic beliefs ("truths"), they would have encountered key "if-then" presumptions. Those could have been tested-or discarded-on sight (p. 238). Janis points out, with Arthur Schlesinger, that some of their assumptions about the military operations might have been challenged by simply looking at a map of Cuba after the plan was changed from a landing at Trinidad to a landing at the Bay of Pigs:
Schlesinger acknowledges that he and the others attending the White House meetings simply overlooked the geography of Cuba: "I don't think we fully realized that the Escambray Mountains lay 80 miles from the Bay of Pigs, across a hopeless tangle of swamps and jungle." This oversight might have been corrected if someone in the advisory group had taken the trouble to look at a map of Cuba (1972: 29).
My tenth, and last, anticipatory condition is deciding before acting how and when one will want to voluntarily stop the activity. This is to be done for every option under discussion. In escalating situations, where tempting possibilities and disregarded dangers lurk at every turn, a pre-decision as to when to stop can prevent great loss (Of Auctions, 1989; Staw, 1989).
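To illustrate what such a pre-decision might look like in practice, here is a minimal sketch in Python-my own illustration, not drawn from the sources cited-of a stopping rule written down before the first commitment and then checked mechanically each round, so that sunk costs never get a vote. The names and thresholds are hypothetical.

    # A pre-committed stopping rule: the criteria are fixed before acting,
    # and each round is judged against them rather than against sunk costs.
    from dataclasses import dataclass

    @dataclass
    class StoppingRule:
        max_total_loss: float      # stop once cumulative losses reach this amount
        max_losing_streak: int     # stop after this many consecutive losing rounds

        def should_stop(self, total_loss: float, losing_streak: int) -> bool:
            return (total_loss >= self.max_total_loss
                    or losing_streak >= self.max_losing_streak)

    def pursue(outcomes, rule):
        """Walk through round-by-round gains and losses; quit when the rule fires."""
        total_loss, losing_streak = 0.0, 0
        for i, outcome in enumerate(outcomes, start=1):
            total_loss -= outcome  # losses add to the total, gains subtract
            losing_streak = losing_streak + 1 if outcome < 0 else 0
            if rule.should_stop(total_loss, losing_streak):
                return f"stopped at round {i}, net {-total_loss:+.0f}"
        return f"completed all rounds, net {-total_loss:+.0f}"

    # Usage: a slow, irregular decline that would otherwise invite escalation.
    print(pursue([-5, 3, -10, -4, -6, -20, -40], StoppingRule(25, 3)))

The point of the sketch is the order of operations: the limits are chosen when the option is still only under discussion, and the later question is never "have we invested too much to quit?" but only "has a pre-agreed condition been met?"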

Of course, decision makers must always seriously consider doing nothing. But action not required now may be required later. This means setting up a monitoring system for detecting the needed information (see Neustadt and May, p. 238). To summarize my ten anticipatory conditions:
1. Host on-going discussions about the moral and ethical values of the organization.

2. Create a tradition of impartial leadership.

3. Create a tradition of methodical procedures.

4. Keep the group's estimation of itself limited, even understated. Stay humble.

5. Practice tolerance for ambiguity as the group struggles to make sense of all of the available information. Available information includes the feelings, hunches, and metaphorical reasoning of members of the group.

6. Continually strive to see the situation in new ways in order to avoid becoming closed-minded.

7. Make a time line of the history of the group, of any opposing groups, and of the situation. Weave in personal details.

8. Make deliberate use of analogies by listing them along with how they are similar to-and different from-the current situation.

9. Probe thinking for cultural, personal, and institutional assumptions about cause and effect.

10. For every suggested decision option, discuss how and when the group would want to voluntarily stop.
Of course, these are just suggestions. There exists no royal road. As Neustadt and May put it after discussing their time-line:
This brings us to the absolute frontier of our considerations and experiments up to now. We have no more to offer than a general attitude, a cast of mind, an outlook, not a method (p. 237).

Final Words

Despite the positive tone of my recommendations, I could add many more. This tells me that the question of knowing how and when to stop remains an open one. If I can assert certainty about anything I have learned on this topic, it is that all proposals must be put forward as provisional and subject to later, or better, periodic review and revision. Public discourse about decisions and large-scale efforts may always need to include reminders of the provisional nature of our planning, and to point out specifically the types of new data or changes in circumstances which would trigger a major reconsideration. I also have learned that large-scale change involves the changing of a lot of minds, so continuous and open dialogue and evaluation of both the criteria and the data of performance will pay dividends later.

Knowing how and when to stop will be different for every specific instance. One map cannot be drawn for how to traverse this territory, for each instance has different players, obstacles, histories, and anticipated futures. Still, as Kenneth Arrow points out:
There are moments in history when we simply must act, fully knowing our ignorance of possible consequences, but to retain our full rationality we must sustain the burden of action without certitude, we must always keep open the possibility of recognizing past errors and changing course (p. 29).
Though I began my research for this paper with characteristic enthusiasm, believing that, at last, I would be able to learn from the mistakes of others and distill for myself a few "rules" for knowing when and how to stop, I leave this project with many more questions and plausible lines of research.

Bibliography


Texts

Arrow, Kenneth. (1974). The Limits of Organization. New York: Norton.

Bennis, Warren. (1989). Why Leaders Can't Lead. San Francisco: Jossey-Bass.

Bennis, Warren. (1993). An Invented Life: Reflections on Leadership and Change. Reading, MA: Addison-Wesley.

Bergler, Edmund. (1967). The Psychology of Gambling. In Herman, Robert D., editor, Gambling. New York: Harper and Row.

Bridges, William. (1991). Managing Transitions: Making the Most of Changes. Reading, MA: Addison-Wesley.

Bruteau, Beatrice. (1979). The Psychic Grid: How We Create the World We Know. Wheaton, IL: The Theosophical Publishing House.

Campbell, Andrew, and Nash, Laura L. (1992). A Sense of Mission: Defining Direction for the Large Corporation. Reading, MA: Addison-Wesley.

Dauten, Dale A. (1980). Quitting: Knowing When to Leave. New York: Walker and Company.

Farnham, Alan. (1994). It's a Bird! It's a Plane! It's a Flop! Fortune, May 2, 1994, pp. 108-110.

Fearnside, W. Ward, and Holther, William B. (1959). Fallacy: The Counterfeit of Argument. Englewood Cliffs, NJ: Prentice-Hall.

Herek, G. M., Janis, I. L., and Huth, P. (1987). Decision Making During International Crises: Is Quality of Process Related to Outcome? In Journal of Conflict Resolution, 31: 203-226. Cited in Schafer and Crichlow (1996).

Janis, Irving L. (1972). Victims of Groupthink. Boston: Houghton Mifflin.

Janis, Irving L., and Mann, Leon. (1977). Decision Making. New York: Free Press.

Keynes, John Maynard. (1977, originally 1932). The World's Economic Outlook. In Desaulniers, Louise, editor. Highlights from 125 Years of the Atlantic. No City: Atlantic Subscriber Edition, pp. 334-338.

Kolb, Deborah M., and Bartunek, Jean M., editors. (1992). Hidden Conflict in Organizations: Uncovering Behind-the-Scenes Disputes. Newbury Park, CA: Sage Publications.

la Brecque, Mort. (1980). On Making Sounder Judgments: Strategies and Snares. Psychology Today, June 1980, pp. 33-42.

Leonard, Linda S. (1989). Witness to the Fire: Creativity and the Veil of Addiction. Boston: Shambhala.

Lord, John, with Wold, Jeffrey. (1992). Song of the Phoenix: The Hidden Rewards of Failure. Stockbridge, MA: Berkshire House.

Meyer, Marshall, and Zucker, Lynne G. (1989). Permanently Failing Organizations. Newbury Park, CA: Sage Publications.

Neustadt, Richard E., and May, Ernest R. (1986). Thinking in Time: The Uses of History for Decision-Makers. New York: The Free Press.

Of Auctions, Dilemmas, and Bloodletting: Model of Escalation Behaviour. (1989). The Lancet, December 23/30, 1989: 1487-1488.

Pavlov, Ivan P. (1966, originally 1932). Essay on the Physiological Concept of The Symptomatology of Hysteria. In Marks, Robert W., editor. (1966). Great Ideas in Psychology. New York: Bantam, pp. 353-375.

Poole, Marshall, and Hirokawa, Randy Y. (1986). Communication and Group Decision Making. Newbury Park, CA: Sage Publications.

Schafer, Mark, and Crichlow, Scott. (1996). Antecedents of Groupthink: A Quantitative Study. In Journal of Conflict Resolution, 40(3):415-435.

Schulman, Paul R. (1980). Large-Scale Policy Making. New York: Elsevier.

Slovic, Paul, Fischhoff, Baruch, and Lichtenstein, Sarah. (1980). Risky Assumptions. Psychology Today, June 1980, pp. 44-48.

Staw, Barry M., and Ross, Jerry. (1989). Understanding Behavior in Escalation Situations. Science, 246 (October 13, 1989): 216-220.

Teger, Allan I. (1980). Too Much Invested to Quit. New York: Pergamon.

Thompson, James C., Jr. (1977 originally 1968). How Could Vietnam Happen: An Autopsy. In Desaulniers, Louise, editor. Highlights from 125 Years of the Atlantic. No City: Atlantic Subscriber Edition, pp. 558-567.

Ting-Toomey, Stella, editor. (1994). The Challenge of Facework: Cross Cultural and Interpersonal Issues. Albany, NY: State University of New York.

Tuchman, Barbara. (1984). The March of Folly: From Troy to Vietnam. New York: Knopf.

Valentin, Erhard K. (1994). Anatomy of a Fatal Business Strategy. Journal of Management Studies, 31(3):359-382.

Tapes

Bohannan, Paul, et al. (1993). Face Saving Devices as a Mode of Conflict Settlement. Portland, OR: National Conference on Peacemaking and Conflict Resolution.

Internet

Forsyth, Donelson R. (1996). Lecture Notes: Decision Making in Groups. Available 12/23/96 at http://www.vcu.edu/hasweb/psy/psy341/decide.html.

Francis, Donald P. (1996). A Voice for the Public's Health. Available 12/23/96 at http://www.gene.com/AE/TSN/SS/bkgnd_paper.html.

Healing Mission: William Foege, MPH Harvard 1995, Reflects on a Career Devoted to Reducing Human Suffering. (1996). Available on 12/23/96 at http://biosun1.harvard.edu/hphr/94fall/healing.html.

Johnson, Jo-Ann. (1996). Workers Avert Plant Closings, Save Jobs. Available 12/28/96 at http://www.americannews.com/story/ans-14.html.

NOTES

1 Comparing this version with the 1945 original shows that a fifth category has been deleted: "Our third responsibility is to our management. Our executives must be persons of talent, education, experience, and ability. They must be persons of common sense and full understanding" (Collins and Porras, 1994: 58). I have not seen any references to why it was deleted.

2 Note that without this preparation and deep organizational understanding of its ideology, Johnson & Johnson might have reacted as Bristol-Myers did to a problem of Excedrin being tampered with in the Denver area: Bristol-Myers recalled tablets only in Colorado and did not alert the public (Collins and Porras: 81).

