THE SOCIAL ENGINEERING SOLUTION TO THE MURDER IN THE MILGRAM EXPERIMENT


Eugen Tarnow, Ph.D.

18-11 Radburn Road

Fair Lawn, NJ 07410

etarnow@avabiz.com


Abstract

            Society's power to make us obey allows for peaceful existence, economic prosperity and efficiency, but it also amplifies faulty decisions into catastrophes. In 1963 Stanley Milgram showed that the vast majority of humans exhibit excessively obedient behavior in the presence of an authority and can easily be made to encourage or tolerate real torture and murder.

            In this advocacy paper, the overdue issue of how to limit excessive obedience is addressed. Stress is placed on eliminating the Milgram Prediction Error, i.e. the discrepancy between what we think we will do and what we actually do in situations of authority. Barriers and dynamics in our society that keep us from breaking, and even reinforce, our habit of obeying excessively are discussed. For example, society does not know what the strong situations are and therefore cannot put up a defense against them; the law does not punish excessively obedient behavior; and the teaching of ethics is hampered by illusions of its effectiveness.

            A solution to the problem of excessive obedience is sketched, involving experiential training; mappings of authority fields, rules and strong situations; and policy changes.


1.      Introduction

Thou shalt not follow a multitude to do evil . . .

(Exodus, 23:2, suggestion from about four thousand years ago)  

 

            The Milgram obedience experiment has become quite famous over the last forty years (for reviews, see Milgram, 1974 and also Miller, 1986) - if one mentions the experiment at a party, some of the participants will vaguely remember it.  But while making for a good topic of conversation over a beer, it is a finding that has yet to produce a single useful action.  In fact, it did just the opposite: it provoked other researchers to kill the messenger and declare the experiment unethical.  It is thus not surprising that over time the result has not improved: the experiment yielded the same horrendous obedience rate in 1985 (Meeus and Raaijmakers, 1995) as in 1963 (Milgram, 1974).  But, as we shall see, it may be that simply telling people about the experiment is not enough anyway; the behaviors it challenges are just too difficult to change.

The Milgram obedience experiment reveals what physicists would call an instability in our society towards limitless obedience to authority.  That is, while our society is quietly humming along, a catastrophe may lurk around the corner once too many people start to obey a bad set of directives.  Much has been written about the role of excessively obedient behavior in world events such as the Holocaust (Arendt, 1970; McKellar, 1951; Miller, 1986), the My Lai massacre, the treatment and disappearance of people during the military regime in Argentina (Kelman and Hamilton, 1989), and the NASA space shuttle disaster (Feynman, 1990).  If one accepts the description of these events in terms of excessive obedience, then they serve as an additional motivation for reexamining the problem in the Milgram experiment; if not, this paper will not argue one way or the other.

2.      The Experiment Described

In the Milgram experiments, a subject, the Teacher, is asked by the Experimenter to give electrical shocks to a confederate, the Learner. The stated purpose of the experiment is to understand how punishment affects memory recall. The Learner, with a stated heart problem, fakes increasing discomfort and, as the fake electrical shocks increase to dangerous levels, suddenly becomes quiet, which can reasonably be interpreted as his being dead (Mantell (1971) conducted a replication of the Milgram experiment in Germany and interviewed the subjects afterwards; many claimed that they believed the Learner had died or at least lost consciousness). Milgram found with this simple experiment that most people can be made to seriously injure and "kill" by verbal orders.  Even though the subjects may feel intuitively that they are doing something terrible, the forces of obedience are overpowering.  Milgram also discovered that predictions by psychiatrists, graduate students and faculty in the behavioral sciences, college sophomores, and middle-class adults of the rate of inflicting maximal injury during one of the experimental conditions were consistently much smaller (0-1%) than the actual rate (65%; see Milgram, 1974, p. 31).  This is referred to as the Milgram Prediction Error (Tarnow, 2000a).

The murder in the Milgram experiment occurred because the Experimenter was able to use his authority to limit the Teacher’s options for thought and behavior in a purposively designed, deceptive and gradually presented “strong” situation and because our society has created individuals who are much too easy to command.  The Experimenter was able to limit the subjects’ interpretations of the experiment to the idea that it was a reasonable study in learning.  Not one of a thousand Teacher subjects acted on the alternative interpretation that it was a dangerous experiment and called the police or freed the Learner (Zimbardo, 1974; Milgram had alerted the local police department beforehand because he expected such calls (Alexandra Milgram, private communication)).  Likewise, the Teacher subjects assigned the responsibility for their actions to the Experimenter and concentrated on performing the task at hand in the most efficient way possible.  They no longer saw the choice of disobedience, but only the choices with which obedience could be improved. Some went as far as to assign responsibility for the Learner's death to the Learner (Milgram, 1974).          

The murder is not committed by subjects who enjoy killing others.  Martin, Lobb, Chapman and Spillane (1976) found that high obedience rates can also be obtained if the result is self-immolation.

3.      Defining Excessively Obedient Behavior

            Excessive obedience can be defined on an individual level as well as from a larger perspective.

3.1.  When is Obedience Excessive – The Larger Perspective

Would a manager like her staff to be totally obedient?  At first glance one would think so.  In theory, the manager bears the responsibility for a department, and in order to run it properly she needs everybody's help.  But what if she gives an erroneous order?  Then it would probably be better for her staff not to carry out the order.  Thus it is evident, at least for the sake of error correction, that the degree of obedience should not always be 100%.

Let’s take the specific example of a manager and her staff:  the airplane captain and the first officer.  Up to 20% of all airplane accidents may be preventable by optimizing the “monitoring and challenging” of captain errors by the first officer (Tarnow, 2000a).  The following is a real-life example from an accident review of the National Transportation Safety Board.

On December 1, 1993, Express II Airlines Inc. / Northwest Airlink Flight 5719 had a problem (NTSB, 1994a).  The captain was a particularly intimidating superior and his first officer was a beginner pilot.  The captain committed several errors during the flight; these errors were evident to the first officer, who nevertheless failed to challenge them.  In particular, the first officer failed to call out the need to execute a missed approach (the plane was too high up and too close to the airport to land). According to the voice recorder, the first officer made only one attempt to challenge the landing (I quote the voice recorder transcript without correcting the language):

            First Officer:      just .. you just gonna stay up here as long as you can?

            Captain:            yes.  guard the hor- I mean an speeds one hundred.

At the point where the plane is scraping the trees, the following dialogue occurs:

Captain:            did you ah click the ah airport lights .. make sure the co-common traffic advisory frequency is set. [sound of seven microphone clicks]. click it seven times?

First Officer:      yup yeah I got it now. [momentary sound of scrape lasting for .1 secs]

According to the NTSB (NTSB, 1994a), the crash was caused by several factors, among them the failure of the first officer to monitor and alert the captain of the erroneous descent.  Had the first officer been less obedient, it is likely that he, the captain and the other people on the plane would be alive today.

3.2.  When is Obedience Excessive – The Individual Perspective – The Milgram Prediction Error

            From the individual point of view, excessive obedience can be defined as behavior for which the obedience level is higher than predicted by the individual, i.e. when the Milgram Prediction Error (Tarnow, 2000a) is significant.  This definition does not imply anything about moral values (it should apply to atheists as well as to people of various religious affiliations).

4.      Dynamics Facilitating Excessive Obedience

            Presented here are eight barriers and dynamics which keep us from breaking, and even reinforce, our habit of obeying excessively.

4.1.  The Milgram Prediction Error

The Milgram Prediction Error erects one barrier to the elimination of excessive obedience: it keeps the consequences of excessive obedience from our awareness.  If we do not recognize that there is a problem, fixing it is not going to be a priority.  Thus it is a lot easier to find people to protest whoever is the current president than to ask Congress to form a committee and pass laws about strong situations.  When the verdicts come in from trials that involve strong situations, the newspapers rarely point out that most other people would have done the same thing and that societal obedience or the strong situation is the problem.  Instead we are happy to conclude that the convicts are different from us and that we would never have done what they did.

The Milgram Prediction Error is part of a larger social illusion of the effectiveness of ethical teachings.  Milgram wrote that “the force exerted by the moral sense of the individual is less effective than social myth would have us believe.  Though such prescriptions as ‘thou shalt not kill’ occupy a pre-eminent place in the moral order, they do not occupy a correspondingly intractable position in the human psychic structure.  A few changes in newspaper headlines, a call from the draft board, orders from a man with epaulets, and men are led to kill with little difficulty---Moral factors can be shunted aside with relative ease by a calculated restructuring of the informational and social field.” (Milgram, 1974, pp 6-7).  Teaching ethics by simple instruction is ineffective in strong situations; the Teachers had no doubt received such instruction (“it is wrong to kill”) and did not expect to punish the Learner as severely as they did.

In the field of scientific authorship (with which the author happens to be familiar), Eastwood, Derish, Leash and Ordway (1996) found that training in research ethics correlated with an individual’s belief that it influenced the conduct of scientific research and publishing, and that it heightened his sensitivity to misconduct.  However, training in ethics was actually uncorrelated with willingness to commit unethical or questionable research practices in the future, and was positively correlated with a tendency to award honorary authorship.  The intention to award honorary authorship also increases dramatically for those who have first-hand experience with inappropriate authorship (having been asked to list an undeserving author, having been named as an author together with an undeserving author, or having been unfairly denied authorship).  The authors concluded that “despite the respondents’ own standards in this matter, their perception of the actual practice of authorship assignment in the research environment has fostered a willingness to compromise their principles.”

During a stint as a trainer in a hospital I noticed an example of the social illusion.  As an icebreaker, my boss and I would ask the individuals of a hospital department to state something about their values.  Almost everybody would then subscribe to the golden rule.  Even with my limited knowledge I knew that this did not describe the behavior of several of the people involved, but nobody objected, no one laughed.  I have noticed another example, which occurs when wars and catastrophes are discussed in religious settings.  Inevitably, the problem is described by seemingly well-meaning people as belonging to those bad people and, of course, nobody present would ever do anything like that.  One gets a cozy, thankful feeling: “I am so lucky to belong to this group of people that will make sure I am always safe”.  I sit there and wish for Stephen Katz to come and present the toilet situation in Auschwitz to shake up the compliant crowd.  I sense that the social penalty for pointing out the social illusion is severe, and I keep quiet.

The Milgram Prediction Error asks us to face two tough truths:

·        Milgram's finding that anybody is likely to seriously injure the Learner means that we are not safe from our neighbors.  This presumably also makes it very difficult to discuss in groups since it points out the fallibility of the group members and therefore of the group itself.

·        That we injure the Learner against our later judgment means that we ourselves cannot be trusted.

Just as the obedience rate in the Milgram experiment stayed constant, the Milgram Prediction Error, our non-anticipation of the result, had not changed by 1985 either: Meeus and Raaijmakers (1995) performed an obedience experiment involving “administrative violence,” depriving someone of his job.  The predicted obedience rate was 10%; the actual rate was 95%.

4.2.  The Limited Perspective Problem

In the Milgram experiment not one of a thousand Teacher subjects came up with an interpretation alternative to the Experimenter's and, for example, called the police or freed the Learner (Zimbardo, 1974). This limited perspective of the Teacher has been investigated by many researchers.

Milgram developed the theory of the "agentic" state to explain his experimental results (Milgram, 1974). It is a hypnotic state in which one assigns all responsibility for one's actions to the supervisor and concentrates on performing the task at hand in the most efficient way possible. One no longer "sees" the choice of noncompliance, but only the choices with which compliance can be improved. The theory of the agentic state explains the tendency to assign responsibility for the Learner's death to the Learner (Ibid.): after the task has become all-important to the Teacher, the Learner is perceived to be one of the variables left that can be optimized; thus the Teacher wishes the Learner to try his best. The Learner's death is self-inflicted because he refuses to do so. Likewise, the Nazi concentration camp guards stopped thinking about the horrors they were perpetrating and concentrated on the ease of execution of their victims. The guards posted signs saying "work brings freedom" (Vrba, 1964, p. 90), screamed at the victims to go faster, faster (Ibid., pp. 132-4), made them believe that executions were medical checks (Ibid., p. 144), etc.

A short note by Kohlberg suggests that the limited perspective problem is related to the subjects' moral development (Kohlberg, 1969).

Milgram, and Kelman and Hamilton, also refer to the limited perspective problem as the "narrowing of the cognitive field" (Milgram, 1974, p. 38) and as "dehumanization" and "neutralization."

4.3.  The Halo Effect of the Obedient

There is a halo effect that favors excessive obedience over dissent: a person who obeys has much of society's validation behind him, and society has had a long time to "beautify" his behavior (uniforms, monetary rewards, etc.). Thoreau, a pioneer in civil disobedience, remarked in 1849 about the obedient majority: "They will wait, well disposed, for others to remedy the evil, that they may no longer have it to regret." (Thoreau, ed. 1980, p. 226). The obedient majority can look around, see others behaving just like it, and have its behavior reinforced.

Dissent, on the other hand, often becomes ugly. Ziemke wrote, in the context of the denazification of Germany: "the man who was individualistic enough to have stood out against the Nazis was probably not going to fit in easily with the Americans either" (Ziemke, 1975, p. 381), i.e. the Americans would content themselves with dealing with the Nazis or the obedient subjects rather than with those who dissented.  It is the dissenter who has to pay the price for the dissent: "a gnawing sense that one has been faithless" (Milgram, 1974, p. 164).

4.4.  Lack of Knowledge About Strong Situations

Strong situations occur daily, and we need to know what they are in order to decrease excessive obedience rates. Only a few examples of studies can be found in the literature: unknown doctors ordering nurses to inject an unknown medicine (Hofling et al., 1966), and bureaucratic orders to disturb a test-taking potential employee (Meeus and Raaijmakers, 1986).

4.5.  Lack of Knowledge about Rules

            Milgram wrote: "Obedience, because of its very ubiquity, is easily overlooked as a subject of inquiry" (Milgram, 1974, p. xi). Twenty years later, it might be that psychologists studying obedience have missed an important level of analysis, the rule, perhaps because of its ubiquity: there are rules to create peace, to uphold standards, to increase efficiency, to spare people's feelings, etc. The corresponding field of rules (similar to Milgram's binding factors; Milgram, 1974, p. 148) has never been mapped out. Without knowledge of what we are obeying, we cannot lessen excessive obedience.  Authority benefits from rules being elusive, and may perpetuate this situation because it has more experience of the situation and is more powerful.

4.6.  The Penalty of Breaking Rules is not Well Defined

We often do not know what the consequences of breaking rules are, traffic and criminal laws excepted. Indeed, there might not be a fixed penalty. Imagine that one wants to enter a particular university library and insists on breaking one rule: the rule that one has to possess a library card. Thus one walks in past the guard. Here are some possible consequences:

·         Nobody sees you and there is no consequence.

·         The guard does not mind. Breaking the rule costs little.

·         The guard minds, and tells you to get yourself a card, it takes only five minutes anyway. You insist that you want to enter the library and break this one rule. The guard calls security. Two bouncers enter and demand that you leave the premises. You explain the situation to them as you did to the guard. They think you are crazy and perhaps dangerous and they pull a gun on you.

 

In each case it is interesting to ask which rule was actually broken: the rule about the card, or the rule about obeying the guard?  It seems difficult to separate the infringements from each other.

            It is likely that mental barriers to upholding and breaking rules are set by impressions from childhood (Milgram, 1974, p. 136; Zimbardo, 1974). Since our childhood authorities may be more influential and important than our adult authorities, the costs of breaking rules may be erroneously valued.

4.7.  Excessive Obedience is Easy to Create, Hard to Get Rid of

            Axelrod writes that "it is the ability to identify and punish defectors that makes the growth and maintenance of norms possible" (Axelrod, 1985). The norm for excessive obedience is much easier to maintain than the norm against it. It is difficult to identify a person who excessively obeys, since he typically does not stand out from the crowd. A dissenter, on the other hand, stands out clearly and can easily be penalized.  It has also been pointed out that the power of authority is increased manyfold in the presence of an obedient group (Tarnow, 2000b).

            Identification of the excessive obedience culprit is also difficult because the overall responsibility is inherently an issue involving two or more people: The power to make decisions and the power and knowledge to carry them out are often separate. Organizational life with large bureaucratic organizations exacerbates these points since hundreds of people could be involved with a crime in large and small ways. Milgram also found that observers and participants in the experiment have different views (Milgram, 1974).

            The punishment for excessive obedience is very elusive. The consequences for the perpetrators of the My Lai massacre were minimal. Only one person, Lt. Calley, was convicted, and he served only three years under house arrest for twenty-two premeditated murders. In France, the blood bank CNTS was found to have infected people with HIV through negligent testing of the blood supply. As a result, more than 100 hemophiliacs contracted the virus. Three physicians from CNTS were sentenced to up to four years in prison (Science 258, 735 (1993)). Presumably, many more people had knowledge of the crime and will never be tried. While the law states that obedience is not an excuse for committing a crime, juries tend not to convict (Isenman, 1990).  Social psychologists using the Milgram experiment can also manipulate the courts into not enforcing responsibility (Davison et al., 1993).

            Indeed, the Milgram obedience experiments present an unsolved legal paradox. Since almost everyone would commit a crime in strong situations, it does justice to the criminal not to convict him (see also Le Bon, ed. 1982, pp. 163-165). On the other hand, the absence of a conviction does not serve the victim, nor does it protect society from future crimes.

4.8.  Ethics - Turning the Other Cheek

            "Turning the other cheek" is a heuristic that often lends credence to excessive obedience because it can be construed as obedient behavior that further strengthens the obedience field.

5.      A Social Engineering Approach to Excessive Obedience

5.1.  Combating Excessive Obedience - Samples from History.  Madison, Denazification and Civil Rights

            In the late eighteenth century, James Madison stressed the vulnerability of our society to the violence of "factions." Madison wrote that once these factions acquire momentum, "neither moral nor religious motives can be relied on as an adequate control." Under these circumstances people become followers and lose their independent judgment. Madison proceeds to make the argument that the large representative democratic government proposed in the Constitution would guard against the violence of factions: the representative form refines public opinion and filters it through the eyes of the prominent citizens elected. Furthermore, the size of the Union proposed would make it less likely that a majority would have a detrimental motive: ". . . communication is always checked by distrust in proportion to the number whose concurrence is necessary." (Madison, ed. 1961, Federalist Paper #10.)

            Madison assumes with his latter assertion that a larger number of people would automatically lead to a better averaging effect. However, this is probably incorrect (see, for example, Janis, 1972; on the diffusion of responsibility and the actual lack of averaging in groups, see Sherif, 1961, and Tarnow, 1996).  In particular, if the levels of obedience and conformity are high, collective effects can result which make individuals powerless to follow their own convictions (Tarnow, 1996).

            Madison's thoughts are one example of an attempt to deal with the effects of excessively obedient behavior. A second example is the "denazification" of postwar Germany. The American purpose was to eliminate "nazism-fascism, German militarism, the Nazi hierarchy, and their collaborators" (Ziemke, 1975, p. 108). Since much of Nazism was the idea of obeying one's superiors, and Hitler in particular, to the extreme, this seems relevant to our inquiry. An ambivalence about the mission (Ibid., p. 428), a perceived lack of qualified German administrators who had not also been (or still were) Nazis (Ibid., p. 381), and a dearth of theories of why Nazism came to power (Ibid., p. 108) were detrimental to the denazification effort (another description of the ineffectiveness of the denazification effort in Germany can be found in Hilliard, 1997). Excessive obedience behavior, important to the rise to power of Nazism, had not been eliminated as late as 1971, when a replication of the Milgram study was performed in Munich, West Germany (Mantell, 1971).

            A third example is the civil rights movement in the U.S., which has worked towards protecting minority societal participation through the introduction and enforcement of laws. It has combated some excessive obedience behavior by vigorously upholding the right to dissent against the government.  However, this movement appears to have created excessively obedient behavior of its own with its overemphasis on anti-Israel polemics, as if removing Israel and killing Jews would make peace in the world.

5.2.  How to Combat Excessive Obedience:  A List of Suggestions

            Let us discuss some of the possible ways of decreasing excessive obedience.

·         Experiential education.  It is likely that learning by instruction is ineffective (just as teaching about authorship ethics actually created more inappropriate authorship).  Rather, experiential education is more appropriate because of the presence of the social pressures involved in "doing the right thing" (indeed, Milgram interviewed his subjects after the experiment and many felt they had learned something important).  Since we know that playacting of the Milgram experiment can give close to the same result as the experiment itself, this seems an appropriate endeavor (Meeus and Raaijmakers, 1995, found that with sufficient intensity, role playing of the Milgram experiment gives the same result as the original).  At each step of a strong situation the participants would be taught to see the full perspective of choices available to them. Spectators could learn by viewing the experiences that it is imperative to accept the dissenter who may emerge, the somewhat different type of person she might be, or has to be, and to accept the unattractiveness that accompanies dissent.

·         Once excessive obedience is more widely understood, we can catalog the strong situations. Zimbardo emphasized the need "for more knowledge about those conditions in our everyday life where, despite our protest - 'I would never do what they did' - we would, and we do" (Zimbardo, 1974). Mapping of work situations that are strong for individuals can be done by undercover order-giving (Hofling et al., 1966; Meeus and Raaijmakers, 1986; Tarnow, 2000a). For example, imagine an organization where pleasing the bosses is more important than the work output; excessive obedience is pervasive. At random times, each of the managers could be asked to give what the board of directors considers a nonsensical/unethical order. If the unethical order is obeyed, the situation is too strong. If a situation is found to be too strong, it should be pointed out, and discouraged. The regular occurrence of obedience-testing questions will serve to create a norm for what orders can be given, and to encourage critical evaluations of future orders (a specific example is obedience optimization in the airplane cockpit and other high-risk workplaces; see Tarnow, 2000a).  Human resource departments could assist in making lists of strong situations and post them in a conspicuous place.

·         Axelrod (1985) studied the emergence of norm systems and found a necessary criterion for the viability of a new norm system: the ability of the agent to modify the unwanted behavior. It makes little sense to punish a person unless they or others are given the power to behave differently in the future. To help make awareness of excessive obedience a norm, and to decrease excessive obedience, the law may be useful by eliminating strong situations and by increasing our individual armament against social pressures. In the former case, laws may need to regulate the size and communication structure of groups. A meeting between an employee and two managers, for example, is a situation that may be questionable. Large bureaucracies create strong obedience fields and the existence of these could be questioned on this ground. In the latter case excessive obedience must be identified and punished often enough for it to disappear as a norm. If a group of people was involved, partial individual responsibilities should be assessed and clear rules for distribution of punishment made. If the assigning of responsibility becomes impossibly difficult, then the proper legal actor must be the full group. If the situation is somewhere in between, one can assign the responsibility to both the executor of the crime and to the people responsible for the obedience "field."  The legal arena may also be useful to remove excessive obedience once a social policy has been adopted that defines and enlightens citizens of the dangers of excessive obedience.

·         The structure of the communication flow in an organization needs to be considered. For example, if a group of people were sitting in a circle (or in a row in a movie theater), and were only allowed to talk to their nearest neighbors, we can speculate that dissent would be relatively easy: it only costs you the opinions of at most two people. However, if you are in a workplace with privacy-inhibiting cubicles, dissent would be much more costly.

·         The mapping of the conscious and unconscious contracts and "covenants" that exist in the workplace needs to be performed. Efforts should be made to simplify the contracts and covenants so that individuals are not overwhelmed. It should be apparent to everybody when no contract exists (Hobbes stressed "the silence of the law" as being important for liberty (Hobbes, ed. 1960, p. 143)).

·         The real consequences of not entering into a contract, or of disobeying a rule, need to be understood. To illustrate, one could construct a "social crime" table. This table would show the temptation levels and the actual breaking rates, to give us a sense of how strongly social rules are enforced.

·         We need to understand the functions of the rules around us. The addition of rules can serve many hidden purposes: authorities can "fix" problems, inefficiency can be hidden, the obedience field is strengthened, and the breaking of even minute rules can lend credence to firing individuals. Sometimes we may ask ourselves whether we want to communicate the existence of a rule, and thereby strengthen it, if we do not agree with it.

·         An alternate way to diminish excessive obedience is to encourage dissent (also emphasized by Kelman and Hamilton, 1989). While the U.S. Constitution guarantees some individual rights against the Government, the Constitution does not apply to private organizations; in these there are no First Amendment rights and few privacy rights. Since private organizations are by virtue of their contribution to the popular mindset essential to the problem of excessive obedience, the lack of comprehensive civil rights in the private sector, in addition to those related to discrimination, should be reconsidered. Whistle-blowing was recently encouraged by the government under special circumstances but it could be further encouraged in other arenas. The legal rights people do have in organizations may be taught more vigorously: People who have little idea about the laws that rule them are not empowered to insist on their rights.

·         Since "turning the other cheek" can enhance the obedience field, it needs to be taught more carefully. It should be properly contrasted with the opposite heuristic, tit-for-tat, which Axelrod found to be the most robust "ethics" in a computer tournament of the iterated prisoner's dilemma. Tit-for-tat can be error-correcting, and some generalizations to human behavior support the algorithm (Axelrod, 1984). It may be that "turning the other cheek" should be thought of as a way to correct tit-for-tat, not replace it.
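The contrast between the two heuristics can be made concrete with a minimal simulation. The sketch below is illustrative only (it is not from Axelrod's tournament code); it assumes the standard prisoner's-dilemma payoffs reported in Axelrod (1984) and pits tit-for-tat and unconditional cooperation ("turning the other cheek") against an unconditional defector:

```python
# Illustrative sketch: tit-for-tat vs. unconditional cooperation in an
# iterated prisoner's dilemma, using the standard Axelrod (1984) payoffs.
# "C" = cooperate, "D" = defect.

PAYOFF = {  # (my move, opponent's move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def turn_other_cheek(opponent_history):
    """Cooperate unconditionally, whatever the opponent did."""
    return "C"

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Return the total payoffs of the two strategies over repeated rounds."""
    hist_a, hist_b = [], []  # each strategy sees only the opponent's history
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)
        move_b = strategy_b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Against a defector, tit-for-tat is exploited only in the first round...
print(play(tit_for_tat, always_defect))       # -> (9, 14)
# ...while "turning the other cheek" is exploited in every round.
print(play(turn_other_cheek, always_defect))  # -> (0, 50)
```

Retaliation limits the cooperator's loss to a single round, while unconditional cooperation rewards the defector indefinitely; two tit-for-tat players, by contrast, cooperate throughout, which is the error-correcting robustness Axelrod observed.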

6.      Summary

In this article I have stressed the need for a solution to the societal instability pointed out by Stanley Milgram, explained what some of the barriers are to a solution and made a plausible sketch of what such a solution might look like.  From here it is the responsibility of policy makers and granting agencies to pick up the task.


7.      References

            Arendt, H. (1970). Eichmann in Jerusalem: A Report on the Banality of Evil. The Viking Press.

            Axelrod, R. (1984). The Evolution of Cooperation. Basic Books.

            Axelrod, R. (1985). "Modeling the Evolution of Norms," speech delivered at the American Political Science Association Annual Meeting, New Orleans, 8/29-9/1, 1985.

            Davison, A.J.; Higgins, N.C. (1993). "Observer Bias in Perceptions of Responsibility." American Psychologist 48, 584.

            Eastwood, S.; Derish, P.; Leash, E.; Ordway, S. (1996). "Ethical Issues in Biomedical Research: Perceptions and Practices of Postdoctoral Research Fellows Responding to a Survey." Science and Engineering Ethics 2: 89-114.

            Hilliard, R.L. (1997). Surviving the Americans. New York: Seven Stories Press.

            Hobbes, T. (1960). Leviathan. Basil Blackwell, Oxford.

            Hofling, C.K.; Brotzman, E.; Dalrymple, S.; Graves, N.; Pierce, C. (1966). "An Experimental Study of Nurse-Physician Relations." The Journal of Nervous and Mental Disease 143, pp. 171-180.

            Isenman, M.K. (1990). "Reviewing Crimes of Obedience. By Herbert C. Kelman and V. Lee Hamilton." The Michigan Law Review 88, 1474.

            Janis, I.L. (1972). Victims of Groupthink: A Psychological Study of Foreign-Policy Decisions and Fiascoes. Boston: Houghton Mifflin.

            Kelman, H.C., and Hamilton, V.L. (1989). Crimes of Obedience. New Haven: Yale University Press.

            Le Bon, G. (1982). The Crowd: A Study of the Popular Mind. Atlanta: Cherokee Publishing Company.

            Madison, J., in Rossiter, C. (1961). The Federalist Papers. Penguin Group, New York.

            Mantell, D.M. (1971). "The Potential for Violence in Germany." Journal of Social Issues 27, pp. 101-112.

            Martin, J.; Lobb, B.; Chapman, G.; Spillane, R. (1976). "Obedience Under Conditions Demanding Self-Immolation." Human Relations, pp. 345-356.

            McKellar, P. (1951). "'Responsibility' for the Nazi Policy of Extermination." Journal of Social Psychology 34: 153-163.

            Meeus, W.H., and Raaijmakers, Q.A. (1986). "Administrative Obedience: Carrying Out Orders to Use Psychological-Administrative Violence." European Journal of Social Psychology 16: 311-324.

            Milgram, S. (1974). Obedience to Authority: An Experimental View. New York: Harper and Row.

            Miller, A.G. (1986). The Obedience Experiments. New York: Praeger.

            National Transportation Safety Board (1994a). Controlled Collision with Terrain: Northwest Airlink Flight 5719, Hibbing, Minnesota, December 1, 1993. Washington, DC.

            Sherif, M. (1961). "Conformity-Deviation, Norms, and Group Relations." In Conformity and Deviation, ed. I.A. Berg and B.M. Bass. Harper and Row.

            Tarnow, E. (1996). "Like Water and Vapor: Conformity and Independence in the Large Group." Behavioral Science 41, 136-151.

            Tarnow, E. (2000a). "Toward the Zero Accident Goal: Assisting the First Officer Monitor and Challenge Captain Errors." Journal of Aviation/Aerospace Education and Research 10, no. 1.

            Tarnow, E. (2000b). "A Quantitative Model of the Amplification of Power Through Order and the Concept of Group Defense."

            Thoreau, H.D. (1980). "On the Duty of Civil Disobedience," in Walden and "Civil Disobedience." Penguin Group, New York.

            Ziemke, E.F. (1975). Army Historical Series: The U.S. Army in the Occupation of Germany. Center of Military History, U.S. Army, Washington D.C.

            Zimbardo, P.G. (1974). On "Obedience to Authority." American Psychologist 29: 566-567.


 Biographical Note

            The author, Eugen Tarnow, is a consultant with a degree in physics (Ph.D. M.I.T., 1989). His interests include groupware, training in customer relations, task efficiency, business vision statements, the performance of large and small work groups, and cockpit crews.

            The author thanks Carol Caruthers, Rafi Kleiman, Arianna Montorsi, Mats Nordahl, and Barbara Smith for critical readings of the manuscript; Steve Maaranen, and Nicklas Nilsson for useful discussions.

            Correspondence concerning this article should be addressed to Eugen Tarnow, 18-11 Radburn Road, Fair Lawn, NJ 07410, USA; etarnow@avabiz.com (E-mail).