[Editorial Note: This document is part of a collection of documents on brainwashing and related controversies. For a history of these controversies, and a discussion of the relevance of this document, see the paper by Massimo Introvigne "Liar, Liar": Brainwashing, CESNUR and APA. This is the draft version of the report examined by the reviewers appointed by the Board of Social and Ethical Responsibility for Psychology (BSERP) of the American Psychological Association (APA). The BSERP rejected the report by a Memorandum of May 11, 1987. The BSERP regarded it as "the final draft of the report, minus the reference list" (letter from Dorothy Thomas, executive assistant at BSERP, December 29, 1986). Figure 3 is not present in any circulated version of the report; the references are in the provisional form of this last draft.]

Report of the Task Force on Deceptive and Indirect Techniques of Persuasion and Control

Margaret Thaler Singer
University of California Berkeley
Harold Goldstein
National Institute of Mental Health
Michael D. Langone
American Family Foundation
Jesse S. Miller
San Francisco, California
Maurice K. Temerlin
Clinical Psychology Consultants, Inc.
Louis J. West
University of California Los Angeles


Abstract

Cults and large group awareness trainings have generated considerable controversy because of their widespread use of deceptive and indirect techniques of persuasion and control. These techniques can compromise individual freedom, and their use has resulted in serious harm to thousands of individuals and families. This report reviews the literature on this subject, proposes a new way of conceptualizing influence techniques, explores the ethical ramifications of deceptive and indirect techniques of persuasion and control, and makes recommendations addressing the problems described in the report.


Report of the APA Task Force
on
Deceptive and Indirect Techniques
of
Persuasion and Control

In recent years, cultic groups in the areas of religion, politics, and psychotherapy have generated considerable public criticism as a result of the harmful consequences of the techniques such groups use to recruit, persuade, and control their members. Many of these techniques are highly, though often subtly, manipulative and deceptive. The casualties of the indiscriminate and unethical use of such techniques frequently wind up in the clinical or counseling psychologist's office.
The American Psychological Association has long involved itself with the ethical aspects of psychological techniques and practices, e.g., the APA's Task Force on Behavior Modification. Deceptive and indirect techniques of persuasion and control, however, have not been adequately examined; nor have the ethical principles pertinent to their application been well defined.
Therefore, the Board of Social and Ethical Responsibility in Psychology (BSERP) instituted in the fall of 1983 a Planning Committee on the Use of Coercive Psychological Techniques. (In 1984 the American Bar Association established a similar group, the Personal Litigation Subcommittee on Cults.) The Planning Committee concluded that the importance of the issue under study, especially considering the unsophisticated understanding of influence processes demonstrated by the media and the general public, called for the establishment of an APA Task Force on Deceptive and Indirect Techniques of Persuasion and Control.

Given these assumptions, the research to be reviewed later in this report, and the professional and research experiences of committee members, it seemed clear that individuals can be induced to make uninformed, personally detrimental decisions while under the illusion that their decisions are voluntary and to their benefit. In order to increase understanding of this phenomenon, the Committee charged the Task Force to:

In order to ensure that this report is of reasonable length, the Committee recommended that the Task Force concentrate its efforts on certain controversial areas in which deceptive and indirect techniques of persuasion and control are extensively employed. Hence, this report will not concern itself with brief, single-episode situations in which one person deceives another (scams, street hustles, pigeon drops, bank examiner bunco schemes, flimflam sales techniques). This is the world of deceptions with which the criminal justice system contends daily.


Neither will the deceptions, persuasions and coercions used by psychopathic personalities and criminals be the center of attention. Rather, this report focuses on systematically organized efforts to influence and control behavior through the use of deceptive and indirect techniques in religious and psychotherapy cults and large-group awareness trainings (or what Cushman, 1986, calls "mass marathon psychology organizations").

Historical Background

During this century a series of events demonstrated that individual autonomy is much more fragile than was commonly believed. The Russian purge trials of the 1930s manipulated men and women into falsely confessing to crimes and falsely accusing others of having committed crimes (Mindszenty, 1974). The world press expressed bewilderment and amazement at the phenomenon, but, with few exceptions, soon lapsed into silence (Rogge, 1959). The late 1940s and early 1950s saw the effects of the revolutionary universities in China and the subjugation of an entire nation to a thought reform program which induced millions to espouse new philosophies and exhibit new behaviors (Chen, 1960; Hinkle and Wolff, 1956; Hunter, 1951; Lifton, 1961; Meerloo, 1951; Sargant, 1951, 1957, 1973; Schein, 1961).
Next came the Korean conflict, in which United Nations prisoners of war were subjected to an indoctrination program based on methods growing out of the Chinese thought reform program, combined with other social and psychological influence techniques. At that time the term "brainwashing" was introduced into our vocabulary: "a colloquial term applied to any technique designed to manipulate human thought or action against the desire, will, or knowledge of the individual" (Encyclopedia Britannica, 1975).
After a few years, public interest in human influence and manipulation subsided. In academia, however, valuable though sometimes controversial research was conducted. Asch's (1952) conformity studies, Milgram's (1974) shock experiments, and Zimbardo's (Zimbardo, Ebbesen, & Maslach, 1977) prison role-play experiment are merely some of the many studies that delineated social-psychological influences on group behavior (see Cialdini, 1984, for a recent review).
As this academic work was proceeding, other significant events began to recapture the public's interest in influence processes. Charles Manson's diabolical control over a group of middle-class youths shocked the world during the early 1970s (Atkins, 1978; Bugliosi, 1974; Watkins, 1979). Soon after, in 1974, the Symbionese Liberation Army, a small California terrorist group, kidnapped Patricia Hearst and manipulated and controlled her behavior (Hearst, 1982). By the mid-1970s, thousands of families in the United States were puzzled and alarmed about the influence a vast array of new gurus, messiahs, and mind-manipulators had over their offspring. Then, on November 18, 1978, Jim Jones led 912 followers to death in a Guyanese jungle (Reiterman and Jacobs, 1982). Jim Jones's final hours of domination brought the concepts of influence, persuasion, thought reform, and brainwashing to the attention of the world.
In the post-Jonestown world, thousands of families who had relatives in various cultic groups began to be heard. The government responded to this pressure by conducting hearings on "mind-control" techniques in religious cults (Dole, 1977; Fraser, 1978; King, 1979; Lasher, 1979; Lefkowitz, 1974). Although these hearings ultimately led to some prosecutions of wrongdoing (e.g., Rev. Moon's tax evasion case), concerns about interfering with freedom of religion constrained authorities considerably (see Delgado, 1978, 1984, and Lucksted and Martel, 1982, for reviews of legal issues in this area). Frustrated families, then, had to learn to fend for themselves or seek help elsewhere. Many turned to mental health professionals.
The first wave of families seeking help were, for the most part, describing sudden and frightening personality changes in relatives who had become involved with various religious and philosophical cults (cf. Addis, Schulman-Miller, & Lightman, 1984; Clark, Langone, Schecter, & Daly, 1981; Langone, 1983; Singer, 1978, 1979, 1986; West and Singer, 1980). But as Singer (1979a, 1979b, 1986) and West and Singer (1980) noted, throughout history the ever-present, self-appointed messiahs, gurus, and pied pipers appear to adapt to changing times. Thus, as persuaders moved into new realms, more families began to seek consultations on situations that involved what, for lack of a better term, we call "cultic relationships." Cultic relationships, which are characterized by a state of induced dependency, can be found in certain pseudo-growth groups, pseudo-therapy groups, commercial groups, the mainline religious fringe, religious cults, and other undue influence situations.
Some of the larger, more powerful cultic groups have branches in many countries, extensive property holdings, subsidiary organizations with special names for special purposes, and a growing degree of influence. Besides the governmental investigations noted earlier, international concerns about the detrimental effects of certain cults upon the well-being of their members, families, and society in general have led to a recent national conference on the problem in Germany, a nationally televised debate in Spain, a Canadian government study (Hill, 198x), a French government investigation (Vivien, 1985), a resolution by the European Parliament (Cult Observer, 19xx), and numerous conferences in the United States and other countries.


Although estimates vary, there appear to be approximately 3,000 cultic groups in the United States alone. Most of these groups are relatively small, but others have tens of thousands of members and incomes of many millions of dollars a year. Based on observations of turnover, membership estimates of former members, and a few studies (Bird & Reimer, 1982; Zimbardo & Hartley, 1985), it seems probable that at least ten million Americans have been at least transiently involved with cultic groups during the past decade and a half.
The elderly and the very young are not excluded from cultic groups, as demonstrated by the membership of the Peoples Temple and the demography of the dead at Jonestown. However, persons between the ages of 18 and 30 are especially subject to recruitment. A recent study of students in the San Francisco area found that half were open to accepting an invitation to attend a cult meeting, and approximately 3% reported that they already belonged to cultic groups (Zimbardo & Hartley, 1985).
Mental health professionals who have studied the matter believe that no single personality profile characterizes those who join cultic groups (cf. Ash, 1985). Many well-adjusted, high-achieving individuals from intact families have been successfully recruited by cults. So have persons with varying degrees of psychological impairment. To the extent that predisposing factors exist, they may include one or more of the following: naive idealism, situational stress (frequently related to the normal crises of adolescence and young adulthood, such as romantic disappointment or school problems), dependency, disillusionment, an excessively trusting nature, coming from enmeshed families, or ignorance of the ways in which groups can manipulate individuals.
This report does not explore the personality factors which make some individuals especially susceptible to cultic enticements (see Ash, 1985; Clark, Langone, Schecter, & Daly, 1981; Schwartz, 198x; Zerin, 198x; and Zimbardo & Hartley, 1985 for discussions of personality variables). Nor does the report analyze the structure of the groups or situations in which systematic manipulations of social and psychological influence techniques bring about sometimes radical changes in behavior and belief. Instead, the report concentrates on psychological influence techniques and their consequences, as exemplified in cults and large-group awareness trainings.

Cults

Definitional Issues

Some scholars shun the word cult, preferring instead the term new religion, presumably because of the negative connotations of cult (Bromley & Shupe, 1981; Kilbourne & Richardson, 1984; Robbins & Anthony, 1981). Although this view is appealing in certain ways (because some of the several thousand known "cults" seem essentially harmless), it is misleading. New religions are not essentially like old religions, except for their newness. There are many other differences, not the least of which is the presence in the older religions of institutionalized mechanisms of accountability. Furthermore, if all groups called cults are termed new religions, what happens to the term cult? Is it banished from the English language? Is a group like the Manson family a new religion? And what about non-religious cults, e.g., the Symbionese Liberation Army, or the growing number of nameless psychotherapy cults to be described later?
Webster's Third New International Dictionary (Unabridged, 1966) provides several definitions of cult, among which are: (1) a religion regarded as unorthodox or spurious; (2) ... a system for the cure of disease based on the dogma, tenets, or principles set forth by its promulgator to the exclusion of scientific experience or demonstration; (3) ...a great or excessive devotion or dedication to some person, idea, or thing...the object of such devotion...a body of persons characterized by such devotion.
These definitions clearly describe many cultic groups, characterized by extremist tendencies of one form or another, for which the term new religion would be inappropriate.


Furthermore, such cults often develop, as noted earlier, in diverse social areas, including politics, psychotherapy, religion, education, and business. Hence relying exclusively on a term such as new religion results in a disregard for non-religious cults, an attitude of deviance deamplification toward extremist cults, and a tendency to gloss over critical differences between cultic and non-cultic groups. Representative of this point of view is an article by Kilbourne and Richardson (1984), which argues that psychotherapy and religious cults ("new religions" in the authors' terminology) are "functionally equivalent." Kriegman and Solomon (1985) criticize the logic of this position:

Cultic groups have caused concern because they tend, unlike psychotherapy, to be totalist in character and to exploit rather than fulfill needs. The following definition illuminates the differences between groups that are merely unorthodox or innovative and groups that are cultic:

Totalist cults, then, are likely to exhibit three elements to varying degrees: (1) excessively zealous, unquestioning commitment by members to the identity and leadership of the group; (2) exploitative manipulation of members; and (3) harm or the danger of harm. Totalist cults may be distinguished from new religious movements, new political movements, and innovative psychotherapies (terms that can be used to refer to unorthodox but relatively benign groups), if not by their professed beliefs then certainly by their actual practices. The term cult as employed henceforth in this report is intended to mean "totalist cults" as defined above (and as defined by contemporary usage, which has emphasized the negative connotations of the word cult).

Religious Cults

Types of religious cults. Sociologists have proposed a variety of classification systems for religious cults. Wilson (1976) distinguishes groups according to whether they teach that salvation is gained from knowledge stemming from a mystic source, the liberation of the self's own powers, or affiliation with a saved community. Campbell (1979) also suggests a tripartite classification, consisting of illumination (mystical), instrumental (self-adjustment), and service-oriented cults. Anthony and Robbins (1983) divide groups into dualistic and monistic movements, the former tending to be concerned with changing a morally defective world, and the latter advocating inner spiritual transformation. Wallis (1984) proposes a typology in which groups are divided into world-rejecting, world-affirming, and world-accommodating new religions.
Although these sociological models have merit, their utility is closely tied to the sociological theories on which they rest. A less ambitious, though perhaps more immediately satisfying, approach is to look at religious cults and new religions in light of the religious traditions to which they are historically tied. Thus, one may speak of "Bible-based" (Christianity), Eastern (Hindu, Buddhist, Taoist, Sufi/Moslem), satanic (Satan worship), and eclectic (drawing on several traditions) groups. Those who take this approach tend to distinguish between new religions and cults according to their behavior and practices, rather than doctrinal variations from the traditions to which they are linked. The definition of cult proposed earlier implies such a typology in that new religions depart in doctrine or practice from the various religious traditions but do not exhibit the totalist tendencies which characterize cults.

Harms associated with religious cults. Much of the controversy surrounding cults revolves around the contention that some cultic groups harm individuals and society. Although it has been argued that such accusations are mere "atrocity tales" designed to stigmatize deviant groups (Bromley, Shupe, & Ventimiglia, 1983), considerable evidence indicates that many so-called "atrocity tales" are factual. Although the quantity and quality of the evidence varies and is far from one-sided, it seems clear that many religious cults have seriously harmed the physical or psychological well-being of members or their families (Langone & Clark, 1985). There are also reports detailing concerns about legal, political, and economic abuses perpetrated by certain groups, most of which are religious in nature (Antidefamation League, 19xx; Boettcher, 1980; Casgrain, 1986 (Kropveld); Delgado, 1977, 1982, 1985; Dole, 1977; D'Souza, 1985; Grafstein, 1984; Hill, 1980; McLeod, 1986; Rudin, 1979/80; Spero, 1984; Williams, 1980).
No reliable data exist which would permit a comparison of the frequency of physical or psychological harm in religious cults and in mainstream society. Furthermore, because of the number and variety of cults and their tendency to be suspicious of outsiders, it is possible that such data will never exist. Nevertheless, if ten to fifteen cultic groups were randomly selected from current lists (Hulet, 19xx) of cults and studied systematically and in depth, reasonably confident generalizations might be made. In the absence of such a costly study, however, conclusions must be based on anecdotal reports and investigations of groups that have caught researchers' attention for one reason or another.
The most striking examples of cult-related harm involve death, violence, and child abuse. Jonestown (Reiterman & Jacobs, 1982) certainly is the preeminent example, although it is not the only instance of cultic mass suicide in history (Robbins, 198x). Law enforcement officials report increasing numbers of ritual killings (of animals and humans) apparently linked to small satanic cults (Baird, 1984; Boston Globe, 1984; Gallant, 1985; Groh, 1984; Kunhardt & Wooden, 1985; McFadden, 1984). The Centers for Disease Control found that members of Faith Assembly in Indiana had a maternal death rate 100 times the state average and a perinatal death rate 3 times the state average (MMWR, 1984), while the Fort Wayne News-Sentinel documented the deaths of 65 people after they or their parents followed the sect's teachings (News-Sentinel, June 1, 1984). A lawyer investigating Synanon, a drug "rehabilitation" community turned religion, was bitten by a rattlesnake that cult members had placed in his mailbox (Anson, 1978). White supremacist groups have been linked to murders and death threats (Maddis, 1985; Ridgeway, 1985). Three top disciples of guru Bhagwan Shree Rajneesh were indicted for attempted murder, conspiracy to commit murder, and first-degree assault, all charges relating to the leaders' attempts to control members of the commune (Cult Observer, March/April 1986). So many children were reportedly beaten at the Northeast Kingdom Community Church in Vermont (Grizzuti-Harrison, 1984) that state authorities removed 112 children (Burchard, 1984), a move that was overturned by a controversial judicial decision (Boston Globe, June 23, 1984). Sixty-two youths were removed from the House of Judah following the death of a child (New York Times, July 9, 1983). And countless other cases of beatings and medical neglect of children in religious cults have been reported (Gaines, Wilson, Redican, & Baffi, 198x; Landa, 1985; New York State Assembly, 1979; Markowitz & Halperin, 1984).
Hundreds of newspaper and magazine articles, as well as legislative hearings (references), have recounted the experiences of former religious cult members and their families (Scharff, 1985, is one of the best such reports), while a number of books have been authored by parents (Adler, 1978; Allen, 1982; Ikor, 198x; Yanoff, 1981) and ex-cult members (Edwards, 1979; Kemperman, 1981; Mills, 1979; Underwood & Underwood, 1979). Most accounts published in the popular press, especially during the late 1970s and early 1980s, tell the stories of deprogrammed ex-members, who describe how cultic manipulation and exploitation made them unhappy yet unable to leave their group. Deprogramming, which generally consists of intensive interactions over a period of three to seven days, provides cult members with information about their group and manipulative influence techniques in order to help them make an informed decision about continued group affiliation. Although in popular usage the term deprogramming often implies abduction, most contemporary deprogrammings occur with the unpressured consent of the cult member. Exit counseling has come to be the most commonly used term for describing this process, although some professionals (e.g., Langone, 1983) prefer the term reevaluation counseling because it does not presuppose a goal of exiting.

Clinical reports about the experiences of cultists (Addis, Schulman-Miller, & Lightman, 1984; Clark, 1978, 1979; Etamed, 1979; Galper, 1982; Lazarus, 19xx; Levine, 1979; Maleson, 1981; Schwartz, 1983; Schwartz & Zemel, 1980; Singer, 1978, 1979, 1985; Spero, 1982; West & Singer, 1982) are generally consistent with published personal accounts, although the latter, as is to be expected, tend to be one person's story, abbreviated, and relatively unanalytical. Many cultists and ex-cultists come for professional help at the urging of their families, who have usually already consulted clinicians.
The overwhelming majority of cultists examined by clinicians appear to have been stressed (e.g., by romantic breakup or academic difficulties) prior to joining the cult, such stress magnifying whatever other susceptibilities they may have had to manipulative influences. Most converts were subjected to an array of deceptive and indirect techniques of persuasion and control. Although clinicians differ in their explanations of what happened to these cultists, there is general agreement that most underwent major, frequently sudden, changes in personality (which usually triggered parental alarm). In taking on the cult's values, beliefs, attitudes, and practices, most of these cultists experienced considerable conflict, which was usually suppressed (often through the use of chanting or other dissociative techniques), especially when the prescribed changes were sharply opposed to the convert's upbringing. (Some clinicians, cf. Ash, 1985, believe that many cultists experience a prolonged dissociative state while under the group's influence. See also DSM-III 300.15, Atypical Dissociative Disorder.) These changes and tensions appear to have motivated families to intervene (e.g., through deprogramming) or to have caused converts to leave the group, usually in turmoil (some converts experienced psychotic breaks, which often led to their ejection from the cult).
Ex-cultists from clinical samples had been in the cult an average of two to three years. Their readjustment to life in the mainstream world is generally not easy, with many ex-cultists exhibiting significant levels of depression, anxiety, guilt, anger, distrust, interpersonal problems, and a form of dissociation called "floating," which is somewhat analogous to drug flashbacks (Ash, 1985).
Skonovd (1983), providing many quotations from interviews, describes a variety of reasons why people leave cults. Within the network of activist parents and ex-members, approximately one-third of former cultists left on their own, i.e., without involuntary deprogramming (Conway & Siegelman, 1979; Eden, 197x; Langone, 1984; Schwartz, 1983; Solomon, 1981). Informal reports and several studies (Barker, 198x; Galanter, 1983; Levine, 1985), however, suggest that the voluntary departure rate for the broader cult population is probably considerably higher. Two studies (Solomon, 1981; Wright, 1983) found a correlation between reported negativity toward the cult and exposure to the counter-cult network. Although the authors of these studies were inclined to attribute the former cultists' negativity to their contact with the counter-cult network, it seems more probable that a self-selection process occurred: those parents and cultists who were most adversely affected by the cult affiliation were most likely to seek help and, consequently, most likely to come into contact with the counter-cult network.
The results of a number of research studies are consistent with clinical reports. Conway and Siegelman (1985), who drew their sample from the counter-cult network, from which most clinical cases come, found significant correlations between participation in cult rituals and various indicators of distress. Spero (1982), who treated 65 cultists in psychotherapy for an average of 15 months, found two basic profiles: "a) significant constriction in cognitive processes with a clearly defined preference for stereotype or b) a manic denial of depressive trends, also featuring deficiencies in optimum psychological differentiation, exceedingly quick response times, emotionally labile rather than constricted responses, and featuring unrealistic and idealistic object-relational themes" (p. 338). Galanter (1983) found that 36% of 66 former members of the Unification Church "reported that they had had serious emotional problems after leaving" (p. 984). Deutsch (1975), who interviewed 14 devotees of an American guru, concluded that all appeared to have psychiatric disorders. In a clinical study in which Rorschachs were given to four Unification Church members, Deutsch and Miller (1983) found evidence of hysterical and dependency features. Schwartz (1983), in the only identified survey of parental responses to a child's cult involvement, reported findings that agreed with clinical descriptions. Parents described themselves as "numb, rejected, opposed, skeptical, disappointed, angry, disapproving, devastated, guilty, 'damned mad,' stunned, ashamed" and "baffled" (p. 5).
Other research studies suggest that the level of harm associated with religious cults may be less than clinical reports indicate. Levine and Salter (1976) and Levine (198x) found little evidence of impairment in structured interviews of over 100 cultists, although Levine and Salter did note some reservation about "the suddenness and sharpness of the change" (p. 415) that was reported to them. Ross (1983), who gave a battery of tests, including the MMPI, to 42 Hare Krishna members in Melbourne, Australia, reported that all "scores and findings were within the normal range, although members showed a slight decline in mental health (as measured on the MMPI) after 1 1/2 years in the movement and a slight increase in mental health after 3 years in the movement" (p. 416). Ungerleider and Wellisch (1979), who interviewed and tested 50 members or former members of cults, found "no evidence of insanity or mental illness in the legal sense" (p. 279), although members showed elevated Lie scales on the MMPI. In studies of the Divine Light Mission (Buckley and Galanter, 1979) and the Unification Church (Galanter, Rabkin, Rabkin, and Deutsch, 1979; Galanter, 1983), the investigators found improvement in well-being as reported by members, approximately one-third of whom had received mental health treatment before joining the group.

Methodological considerations. All of these studies suffer from serious methodological flaws, including sample selection (e.g., help-seekers vs. "volunteers" from groups that have clearly tried to woo academicians, cf. Dole and Dubrow-Eichel, 1981), the use of measuring instruments with inadequate psychometric development (e.g., Galanter's General Well-Being Schedule and Neurotic Distress Scale), motivated distortion on the part of subjects (e.g., the Lie Scale elevations in Ungerleider & Wellisch, 1979 and the "moderate attempt for both men and women to 'look good'" [p. 418] reported by Ross, 1983), inability of some standard measuring instruments to detect subtle psychopathology, such as dissociation (Ash, 1985), and bias, confusion over terms, overgeneralization, unwarranted causal inferences, and inadequate methods of collecting data (Balch, 1985).

The brainwashing/deprogramming controversy. How much of the harm associated with cults is causally related to group practices? Why, for instance, should one consider "child abuse and cults" a meaningful topic of study, but not "child abuse and Methodists" or "child abuse and sociologists"? Many would answer that cults, unlike Methodists or sociologists, tend to be very controlling and characteristically use disturbingly subtle manipulations:
deliberate attempts to manipulate someone else's behavior seem exploitative when they are covert. One can always imagine that the victim might have resisted had the attempt been more overt or had informed consent been solicited (Andersen & Zimbardo, 1985, p. 197).
This emphasis on harm-producing manipulation has given rise to the brainwashing/deprogramming controversy. On one side are former cultists, their parents, and many journalists. These people compare cult conversion to Korean War indoctrination and Chinese thought reform programs, often referred to in the popular press as brainwashing. These writers note the similarities of isolation, group pressure, debilitation, induced dependency, etc. They frequently support deprogramming, and their terminology often determines the language with which the layman conceptualizes this problem. On the other side of the controversy are cultists, their public relations experts, and a few sociologists and other writers. This group attacks the validity of the brainwashing model (partly because the term in their minds implies the use of coercion and physical threats, and partly because of a fixed disregard of the active, orchestrated, and often deceptive recruiting programs used by cultic groups), which they frequently set up as a straw man whose demolition is apparently aimed at dismissing ex-members' claims of exploitation (Shuler, 1983).
In the middle are many clinicians and other professionals who have studied cults. They recognize that the threat of physical coercion found in Korean War brainwashing is rarely present in cult conversions, that brainwashing represents one end of a continuum of influence (Langone, 1985) and is not mysterious, and that the individual's personality and actions play a significant role in his or her conversion. These professionals, however, do not gloss over the many distinctions between cultic and more traditional conversions. Their positions on deprogramming vary markedly, depending upon their ethical evaluations of the procedure.
Perhaps the question "Are cultists brainwashed?" should be replaced with two questions, addressing, respectively, the individual and social levels. With regard to the individual, one could ask, paraphrasing Bergin and Strupp (197x): "To what extent has this particular group's use of deceptive and indirect techniques of persuasion and control harmed this particular person or family at this particular time?" On the social level one could ask: "To what extent does this particular group - or class of groups - misuse deceptive and indirect techniques of persuasion and control?"

Summary. In summary, it seems that the only confident conclusion one can draw from the many studies of religious cults is that a large variety of people join diverse groups for many reasons and are affected in different ways. Nevertheless, considerable numbers of persons join cults in large part because of their vulnerability to the deceptive and indirect techniques of persuasion and control used by cults. Furthermore, a significant percentage of cultists is clearly harmed, some terribly so. Many cultists, however, appear to be unharmed or even positively affected, although some scholars (e.g., Ash, 1985) argue that most of these are disturbed in subtle ways.

Psychotherapy Cults

Literature review. Temerlin and Temerlin (1982), Hochman (1984), Singer (1986), and West and Singer (1980) each pointed to a new phenomenon: the psychotherapy cult. Cultic therapists use varying combinations of coercive, indirect, and deceptive psychological techniques in order to control patients. In so doing, these therapists violate ethical prohibitions against forming exploitative and dual relationships with clients, misusing therapeutic techniques, and manipulating therapeutic relationships to the advantage of the therapist. Therapy cults may arise from the distortion and corruption of long-term individual therapy (Temerlin and Temerlin, 1982; Conason and McGarrahan, 1986), group psychotherapy (Hochman, 1984), large-group awareness trainings, human potential groups, or any of a variety of groups led by non-professionals (West & Singer, 1980; Singer, 1983, 1986).


These cults were formed when professionals deviated from an ethically based, fee-for-service, confidential relationship with clients and brought clients together to form cohesive, psychologically incestuous groups. Leaders were idolized rather than transferences studied and understood. Instead of personal autonomy being built, patients were led into submissive, obedient, dependent relationships with the therapists. Their thinking eventually came to resemble what Hoffer (1951) saw in "true believers" and what Lifton (1961) termed totalistic. That is, the clients were induced to accept uncritically their therapists' theories, to grow paranoid toward the outside world, to limit relationships and thinking to the elite world created by the cult-producing therapist, and to devote themselves selflessly to their therapists. The groups varied in size from 15 to 75 members. Often members had been in groups from 10 to 15 years.
The authors concluded that membership in a psychotherapy cult was an iatrogenically induced negative effect of psychotherapeutic techniques and relationships being used in unethical ways.
Hochman (1984), writing about a now defunct school of psychotherapy, The Center for Feeling Therapy, also spoke of the many iatrogenic symptoms he found in former clients and patients who had been members of this group, which had evolved into a therapy cult. He wrote:

This group lasted approximately ten years, and consisted of 350 patients living near one another, sharing homes in the Hollywood district of Los Angeles. Hundreds more were non-resident outpatients, and others communicated with "therapists" (some were licensed; others allegedly were patients assigned to be therapists) by letter. Maximum benefit supposedly came only to residents, and patients were led to see themselves as the potential leaders of a therapy movement that would dominate the 21st century. The leaders promulgated a "theory" which maintained that individuals function with "reasonable insanity," but that if individuals learned to "go 100%" in five areas -- expression, feeling, activity, clarity, and contact -- they could put aside their "old image" and become "sane," which was defined as the "full experiencing of feelings." This latter, ambiguous objective was purported to be the attainment of the next stage in human evolution. Thus, therapy cults use a technique also commonly seen in religious cults (Singer, 1983): the inhibition of critical thinking by encouraging the use of thought-stopping clichés.

Legal cases. A number of civil suits and hearings of the California Department of Consumer Affairs Board of Medical Quality Assurance have grown out of the activities occurring in the Center for Feeling Therapy. The following are illustrative but not exhaustive: State of California: Psychology Examining Committee Case 392, L-33445 v. Binder; State of California as cited, v. Corriere, Gold, Hart, Hopper, and Karle, Case L-30665, D.3103 through 3107; State of California as cited v. Woldenberg, No. D-3108, L-30664; Hart et al. v. McCormack et al., Superior Court of the State of California, for the County of Los Angeles, No. 00713; Raines et al. v. Center Foundation, Superior Court of the State of California, County of Los Angeles, No. 372-843 consolidated with C 379-789; Board of Behavioral Science Examiners, No. M 84, L 31542 v. Cirincione, Franklin, Gold, and Gross.
In these legal cases, defendants were charged with extreme departures from the standards of psychology, the standards of medicine, and the standards of psychotherapeutic care. The State alleged that the staff, while purporting to be providing psychotherapy:


Timnick (1986), calling the Center "a once trendy 'therapeutic community,'" reported that the above legal hearings have "become the longest, costliest and most complex psychotherapy malpractice case in California history" (p. 3). In this case, more than 100 former patients filed complaints of fraud, sexual misconduct, and abuse. Already, civil cases have been settled for more than six million dollars on behalf of former clients. Testimony at hearings depicted the group as a psychotherapy cult using deceptive, manipulative, and coercive techniques to retain and control clients. The welfare of the patients was subordinated to the welfare of the therapists. Treatment plans and goals were subverted to benefit the therapists and the Center financially and personally. Instead of following the usual standard of practice, in which patients are helped to achieve greater independence and self-direction, the therapists instituted a systematic social influence process and an enforced dependency that was cultic and controlling.

Non-professional cults. In contrast to the above psychotherapy cults directed by trained professionals, Singer (1983, 1986) reported the rise of cult-like groups conducted by non-professionals. The latter appear to have familiarized themselves with many of the tactics, techniques, and methods of both individual and group therapies, and have used these techniques to recruit, control, and maintain followers. Defining a cultic relationship as "those relationships in which a person induces others to become totally or nearly totally dependent on him or her for almost all major life decisions, and inculcates in these followers a belief that he or she has some special talent, gift, or knowledge" (1986, p. 270), Singer described a series of groups in which non-professionals using various psychological techniques were subverting the welfare of individuals who had been led, often through deceptive methods, into believing the leaders had psychological or other special skills, talents, or knowledge.
In one instance, a diet cult was described in which a man and a woman who led the group implied that they had special scientific methods for weight control. Followers were persuaded to leave their regular work, turn over large sums of money and property to the leaders, cut off all ties with family and former friends, and move to an isolated town to live in the leaders' domicile. There, each day, four to five hours of hypnosis and self-hypnosis exercises, plus many periods of hyperventilation interspersed with lectures and demonstrations of how to "speak in voices and hear in voices," were carried out, ostensibly to change the personality of the followers. No actual weight control programs existed. Only the pseudo-therapy and pseudo-growth programs were operative.
Earlier, Singer (1983) reported on the experiences of individuals in various groups run by non-professionals who used intense confrontational techniques, encounter-group-type exercises, and enforced and prolonged self-revelation processes. Some groups encouraged participants to move in with the leader; others did not. But all seemed, in one way or another, to encourage dropping former friends and relatives and becoming psychologically, socially, and otherwise dependent upon the group. Several long-standing groups reportedly are run not only by non-professionals but by persons with criminal records. The promises of the promoters lead persons entering these groups to think that they will receive various psychological and mental benefits from participation. Whether benefits accrue is not possible to ascertain, because neither the groups nor sufficient numbers of their former members have been studied. Additionally, because these groups often are on the fringe of legality, leaders are not likely to allow open scrutiny of their operations.
That individuals have been damaged psychologically and in other ways appears evident. That psychological and social manipulations by means of indirect, deceptive, and coercive methods are occurring is also evident.

Large-Group Awareness Training

Historical Background

The Human Potential Movement bloomed in the 1950s and 1960s. Sensitivity and encounter groups spread rapidly, promising increased communication, intensified experience, and expanded consciousness. Business, educational, and other groups were sold sensitivity training programs, some conducted by psychologists, but most led by non-professionals who used the processes and techniques developed by psychologists. There soon appeared the commercially packaged large-group awareness trainings (LGATs), which combined a number of the encounter and sensitivity techniques with various sales, influence, indoctrination, and behavior control techniques.
Most existing commercial LGATs grew out of a format developed in the early 1960s by William Penn Patrick, who labeled his venture Leadership Dynamics Institute (LDI). This was the first of what has become a smorgasbord of commercially sold LGATs.
Church and Carnes (1972) describe the original LDI program as an encounter group training session costing $1,000, in which persons "were held virtual prisoners for four days of living hell during which members of the class were beaten, deprived of food and sleep, jammed into coffins, forced to perform degrading sexual acts, and even crucified" (p. 178). Purportedly, this commercial encounter group would make persons "better leaders and executives." The seminar was supposed to rid people of their "hang-ups," teach total obedience, and motivate participants to persuade other persons to take the training. Patrick, who headed Holiday Magic Cosmetics, Mind Dynamics, LDI, and other pyramid sales organizations, decreed that attending an LDI "seminar" was required for anyone wishing a management position with Holiday Magic. Attendees were kept in the dark about what they would experience at these seminars, as "graduates" were pledged not to reveal their experiences. The venture ended amidst multiple lawsuits in California courts.
Some changes have occurred in subsequent LGATs, while certain features have remained. In most of the new groups, attendees continue to pledge secrecy and push the product on friends and acquaintances within a pyramid sales structure. The status of "graduates" of the LGATs is generally dependent upon the number of recruits they bring in. Eliciting the most criticism, however, is the extensive use of deceptive, indirect, and even coercive techniques of persuasion and control at all levels of the organization, including the training.

Review of the Literature

There is an extensive array of accounts of these "trainings," most notably about est (Erhard Seminars Training). Hundreds of journalistic reports exist. The following are but a sample of articles or books on est: Bartley, 1978; Benziger, 1976; Brewer, 1975; Bry, 1976; Fenwick, 1976; Frederick, 1974; Greene, 1976; Hargrove, 19xx; Hoyt, 1985; Leonard, 1972; Rhinehart, 1976; Tipton, 1982. Descriptions of Lifespring, another well-known LGAT, are found in Haaken and Adams (1983) and Cushman (1986), while Actualizations is described by Martin (1977).


The term training is misleading if the consumer thinks the title refers to skill-building groups (Rudestam, 1982). The LGATs are not skill training events, but instead resemble intense indoctrination programs. In the LGATs an authoritarian leader, now minus the Marine Corps swagger stick of LDI days, persuades the consumers who purchase attendance to believe that their lives are not working, that they have caused every dire event that has happened to them, and that salvation depends upon accepting the belief system being offered, learning to talk in the jargon of the trainer, and remaining connected with the organization by becoming unpaid volunteer helpers who recruit other customers for the organization. The formats, stripped of the individual jargon characterizing each particular commercial group, remain essentially as outlined above (Baer & Stolz, 1978; Cinamon & Farson, 1979; Fenwick, 1977; Gross, 1978; Tipton, 1982; Zilbergeld, 1983).
Finkelstein, Wenegrat, and Yalom (1982) consider Lifespring, Actualizations, and est to be examples of "intensive large-group awareness trainings." They describe these groups as being characterized by "the commercial, non-professional use of potentially powerful tools for personal growth," which "often evoke powerful emotions" (p. xx). These authors discuss the "trainer's extraordinary demeanor ... (his/her) air of absolute authority ... no affect, even when he excoriates the trainees ... repeatedly referring to them as 'assholes' ... devalues their accomplishments with the repeated assertion that their lives 'do not work'" (p. xx).
Finkelstein et al. (1982) report on est's "Truth Process," an event occurring on the second day of the training. During this exercise trainees lie on the floor, eyes closed, meditating on an individual problem they have selected.


At a mid-week meeting following the first weekend, "trainees report on their experiences since the weekend, often to tell of dramatic improvements...and occasionally to complain of deterioration in their mood" (Finkelstein et al., 1982, pp. 515-521).

Finkelstein et al. (1982) also note that nearly 450,000 persons in the United States have undergone one of the several commercial large-group awareness trainings. Yet the literature on these groups resembles that of the early encounter and human potential groups:


The reports of psychological harm resulting from LGATs appear in Fenwick (1976), Glass, Kirsch, and Parris (1977), Kirsch and Glass (1977), Simon (1977, 1978), Higgitt and Murray (1983), and Haaken and Adams (1983). While Fenwick, a psychologist, was a participant observer at an est training, Haaken and Adams, a psychologist and a sociologist respectively, were participant observers at a Lifespring training.
Fenwick called attention to the est training's selection and admission forms, in which persons were asked if they had been in therapy and, if in therapy now or recently, whether they were "winning." She voiced the opinion that some persons might, intentionally or because of incapacity, misrepresent their psychiatric status on such a form, or might feel "their medical or psychiatric history is not appropriately revealed to a private business offering them an 'educational' service" (p. xx). She concluded:


Fenwick further noted that the Lieberman and Yalom studies (19xx) of encounter groups indicated that "the people who experienced negative results in combination with the psychological casualties constituted about 19% ... or for close to one out of five people who participated in these group experiences, the results were harmful" (p. 166).


Cushman (1986) termed a number of groups, such as est, Lifespring, Psi World, Transformations, and Summit Workshops, "mass marathon psychology organizations." Like the many writers who have described the est trainings, he noted the highly coercive and authoritarian methods of control used in these groups. He called them restrictive groups because they depended upon strict milieu control, public rewards and punishments, and the pressuring of participants to enroll others and immerse themselves in the organizations as volunteers and companions of other graduates.
Despite the LGATs' promotion of themselves as educational experiences, the majority of the professionally trained writers (psychologists and psychiatrists) who have published comments on the groups consider them to be psychological in nature (Cushman, 1986; Hoyt, 1985; Fenwick, 1976; Glass, Kirsch, & Parris, 1977; Haaken & Adams, 1983; Higgitt & Murray, 1983; Kirsch & Glass, 1977; Paul & Paul, 1978; Simon, 1977, 1978). Glass et al. (1977) concluded that although est presents its programs as educational, they are in fact "quasi-therapeutic group experiences" (p. xx). Simon (1978) stated that "est has some powerful psychological effects on many of those who take the training... It is apparent from the progressive and regressive responses to est that some powerful change agent is at work here. It may be that... Werner Erhard has discovered an unconventional route to approach these psychotherapeutic goals" (pp. 686, 691).
Other legal cases. Space prohibits a complete description of all legal cases involving LGATs. However, there are currently more than 30 such cases, including... [to be added if deemed appropriate]

Conclusions

The preceding literature review suggests that most of the nationally known LGATs, and a burgeoning but as yet undetermined number of take-offs on them, are using powerful psychological techniques capable of stripping individuals of their psychological defenses, inducing behavioral regression, and promoting regressive modes of reasoning. Further, it appears that deceptive sales techniques are involved in promoting the trainings, since the secrecy surrounding the programs' sales promotions prevents consumers from obtaining full disclosure. Consumers are persuaded to purchase programs described as educational, while in actuality the programs consist of highly orchestrated, intense indoctrination processes capable of inducing marked psychological experiences. Consumers are not fully and adequately informed about the programs' intensity, the new philosophical formulations of reality that they imply, the potentially harmful consequences of some of the exercises to which participants will be exposed, the sometimes lurid psychological upset they will witness, or the fact that management is aware of at least some of the risks to which it subjects participants. Such practices run counter to American Psychological Association recommendations on the running of growth groups (American Psychological Association, 1973).

Analysis

As should be clear by now, criticism of cults and LGATs stems from the observation that such groups use deceptive and indirect (and sometimes coercive) techniques of persuasion and control to advance the goals of leaders, frequently to the detriment of members, their families, and society at large. The problems posed by such groups, then, have psychological and ethical aspects. The psychological aspect concerns the nature of behavior and attitude change techniques and their consequences. The ethical aspect of the problem concerns the appropriateness of such techniques in various situations. Preceding sections of this report have detailed many of the harmful consequences. This section explores the nature of influence techniques and ethical implications.


The Continuum of Influence: A Proposal

Figure 1, which delineates a continuum of influence, illustrates how deceptive and indirect techniques of persuasion and control differ from other techniques. At one extreme of the continuum lie nondirective techniques, such as reflection and clarification. At the other extreme lie physical restraint and punishment and pressured public confessions. The specific techniques listed in the figure have been grouped under four methods of influence: educative/therapeutic, advisory/therapeutic, persuasive/manipulative, and controlling/destructive. Further, the educative/therapeutic and advisory/therapeutic methods of influence are classified under the choice-respecting mode of influence, which emphasizes effectively communicating one's message, while the persuasive/manipulative and controlling/destructive methods are classified under the compliance-gaining mode, which emphasizes obtaining the desired response from the influencees.

Insert Figure 1 about here

 

According to this schema, a particular social influence interaction may be categorized with varying levels of precision, e.g., compliance-gaining mode, persuasive/manipulative method, foot-in-the-door technique. Furthermore, a particular environment may be evaluated according to the frequency with which social influence techniques occurring in that environment fall under the particular modes or methods of influence. If, for example, researchers developed a means of classifying discrete social influence interactions according to this scheme, they could observe an environment over time (e.g., a Moonie three-day workshop at Booneville, a time-limited psychotherapy group) and develop a profile of that environment (Figure 2 illustrates a hypothetical profile; an illustrative sketch of the idea follows the figure). One could speak, then, of "climates of influence" and could say, for instance, that a controlling/destructive climate of influence characterized Jonestown.

Insert Figure 2 about here
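
[Illustrative sketch; not part of the original report.] One way the proposed profiling might be made concrete is shown in the short Python fragment below. The mode and method labels follow the terminology of the text; the assignment of particular techniques to methods and the sample observation log are assumptions for illustration only, since Figure 1 is not reproduced in this draft.

    # Hypothetical illustration: tallying discrete influence interactions
    # into a "climate of influence" profile (cf. Figure 2).
    from collections import Counter

    # Modes and methods of influence, as named in the text.
    TAXONOMY = {
        "choice-respecting": ("educative/therapeutic", "advisory/therapeutic"),
        "compliance-gaining": ("persuasive/manipulative", "controlling/destructive"),
    }

    # Assumed technique-to-method assignments, following examples in the text.
    TECHNIQUE_METHOD = {
        "reflection": "educative/therapeutic",
        "clarification": "educative/therapeutic",
        "foot-in-the-door": "persuasive/manipulative",
        "physical restraint": "controlling/destructive",
        "pressured public confession": "controlling/destructive",
    }

    def influence_profile(observed):
        """Return the proportion of observed interactions per method."""
        counts = Counter(TECHNIQUE_METHOD[t] for t in observed)
        total = sum(counts.values())
        return {m: counts[m] / total
                for methods in TAXONOMY.values() for m in methods}

    # A fabricated observation log for a hypothetical environment.
    log = ["reflection", "foot-in-the-door", "foot-in-the-door",
           "pressured public confession", "clarification"]
    print(influence_profile(log))

A profile weighted toward the persuasive/manipulative and controlling/destructive methods would, in the report's terms, indicate a compliance-gaining climate of influence.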

 

Influencer Goals and the Influence Continuum

Obviously, to some extent the ethical propriety of a particular influence mode, method, or technique and the goals of the influencer are interrelated. Figure 3 joins the influence continuum with an intent continuum reflecting the extremes of influencer-centered goals and influencee-centered goals. These two continua form four quadrants, which may be considered influencer attitudes (a schematic restatement follows the figure). When influencers' mode of influence is choice-respecting, they may have an inspirational attitude (seeking self-sacrificing action from influencees while carefully respecting their right and capacity to choose to accept or reject the appeal) or a self-development attitude (which characterizes, for example, ethical psychotherapy) toward influencees. If, on the other hand, influencers' mode is compliance-gaining, they may possess a caretaker or an exploitative attitude toward influencees, depending upon whether influencers use compliance-gaining tactics for their own benefit or that of influencees. An exploitative attitude toward influencees will nearly always be considered unethical (an exception being undercover police work, for example), while the ethics of a caretaker attitude will depend upon the influencee's capacity to make responsible choices. It is acceptable, for example, to take a caretaker attitude toward a young child or a mentally retarded adult, but not toward a reasonably well functioning adult in psychotherapy. Inspirational and self-development attitudes will nearly always be ethical, although exceptions exist, e.g., naively accepting a psychotically depressed person's wish to jump off a bridge, or tolerating childish self-indulgence when a natural disaster demands that every available adult help out.


Insert Figure 3 about here
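
[Illustrative sketch; not part of the original report.] At their extremes, the two continua reduce to a two-by-two mapping of influencer attitudes. The hypothetical Python fragment below merely restates the preceding paragraph's quadrant logic; the quadrant labels are the report's, while the function and its arguments are assumed for illustration.

    # Hypothetical restatement of the Figure 3 quadrants described above.
    def influencer_attitude(mode, goals):
        """Map an influence mode and goal orientation to an attitude."""
        if mode == "choice-respecting":
            # Seeking self-sacrifice vs. fostering the influencee's growth.
            return ("inspirational" if goals == "influencer-centered"
                    else "self-development")
        if mode == "compliance-gaining":
            # Compliance sought for the influencer's benefit is exploitative;
            # compliance sought for the influencee's benefit is caretaking.
            return ("exploitative" if goals == "influencer-centered"
                    else "caretaker")
        raise ValueError("unknown mode: " + mode)

    # Example: the report characterizes Jonestown as controlling/destructive,
    # i.e., compliance-gaining influence serving the leader's goals.
    print(influencer_attitude("compliance-gaining", "influencer-centered"))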

 

Ethical Issues for Psychologists to Consider

The preamble to the American Psychological Association's Ethical Principles of Psychologists (APA, 1981) begins with a declaration of respect for the "dignity and worth of the individual" (p. 633). It further directs the efforts of psychologists towards "increasing knowledge of human behavior and of people's understanding of themselves and others" (p. 633).
The ethical principles which follow articulate the concerns of professional psychologists for the utilization of this knowledge in the promotion of human welfare and the protection of fundamental human rights. Psychologists are ethically bound to use their skills only for purposes consistent with the clearly stated values of the Ethical Principles, and they are specifically enjoined not to "knowingly permit their misuse by others" (p. 633). Although respecting the prerogatives and obligations of other professions and institutions, we are, nonetheless, obligated to provide information which will serve the best interests of the public.
Traditionally, psychologists maintain a self-development attitude toward their clients, except when clients' mental competency is in question, in which case a caretaker

[52]

attitude is appropriate (e.g., restraining a suicidal person). During the past thirty years, however, psychology has witnessed the development of a profusion of new behavioral change techniques and an increased tolerance for experimentation with innovative individual and group psychotherapies. Few would deny that, on the whole, this change has benefited the profession and consumers. On the other hand, the interest in innovation has often blinded psychologists to the ethical aspects of these changes. Too often the focus has been on the effectiveness of new techniques, not on their ethical appropriateness.
Certainly, we are not the first to point out this danger. Indeed, some groups have taken action to limit abuses, especially in the field of behavior modification (APA, 19xx; AABT, 1977). However, much more work remains. Corsini (19xx), for example, lists 192 psychotherapies, most of which have been developed since 1960. Many of these innovations use some of the deceptive and indirect techniques of persuasion and control that have been the subject of this report. Rarely, however, have the ethical dimensions of therapists' influence over clients in these innovative psychotherapies been examined closely, even though some techniques, e.g., paradoxical instructions to clients and hypnosis without trance, depend for their effectiveness upon the client's lack of awareness and absence of free choice.

[53]

The dangers posed by the inappropriate use of deceptive and indirect techniques of persuasion and control within psychology pale before the dangers posed by their use outside of the profession. No professional qualifications are required to develop and market LGATs or "new" therapies, for example. It is not surprising, then, that horror stories abound.
Psychologists, according to their own ethical principles, are obligated to speak out about the abuse of psychological techniques outside of the profession, as well as within. Recommendations concerning this issue will be made in the final section of this report.

Ethical Issues for Nonpsychologists

Because psychologists have studied the techniques about which we are concerned, it is appropriate for them to offer opinions about how the use of these techniques may impinge on others, recognizing, of course, that the input of other professions is prerequisite to a full understanding of the issue.
Deceptive and indirect techniques of persuasion and control appear in politics, business, education, medicine, law, the media, the psychological services industry, and religion. Although our concern has been the latter two areas, the other areas should not be, and fortunately have not been, ignored. As early as 195x, Vance Packard decried the "hidden persuaders" in advertising.

[54]

Recent debates about the effects of violent TV programs and commercials on children are also significant, as are consumer protection laws aimed at unscrupulous sales practices. Liberals' objections to McCarthyite techniques of persuasion and control on the campuses of the 1950s are now echoed by similar accusations on the part of conservatives. Phenomena such as ABC-TV's "Viewpoint" programs (in which media professionals are questioned by their audience) and the rapid growth of organizations such as Accuracy in Media demonstrate that the public is becoming more aware of, and concerned about, how the media influence their audiences. Because of increased appreciation of how interaction with authority figures can neutralize critical thinking and reduce personal autonomy, informed consent has become fundamental to ethical practice in medicine and all other helping professions, including psychology. Research into how psychological factors can be used to manipulate and deceive juries (e.g., Orne, 198x) is having an impact on legal practice. And despite the traditional tolerance toward political rhetoric and machinations, recent revelations about the cultic nature of some political groups (e.g., the LaRouche movement; Mintz, 1985) raise questions about the ethics of deceptive and indirect techniques of persuasion and control in politics.
All of these problem areas revolve around the ethics of how one individual or group influences another. Yet in all of them,

[55]

the ethical question is often obscured by other factors: The doctor genuinely concerned about the medical needs of this particular patient at this particular time can easily lose sight of why informed consent is a necessary ethical procedure. The advertiser determined to sell children's toys may not appreciate the long-range consequences of his methods. And politicians intent on winning may forget that the Constitution they swear to uphold is founded on the notion that a free yet orderly society must have rules to determine how its citizens influence one another.
The Constitution and the body of law issuing from it articulate the written rules governing social influence, e.g., libel law. Tradition determines the unwritten rules. These, in recent decades, have become increasingly ambiguous, largely because the traditions on which they rest have been questioned unceasingly. To some extent this phenomenon may reflect the waning of America's historical Protestant ethic and the triumph of pluralism. This change has negative as well as positive implications.

[56]

The growth of cults and LGATs, some advocating moral positions antithetical to the mainstream, contributes to the conflict to which Berger alludes. A pluralistic society wants to remain open and tolerant; yet an extreme moral pluralism may render it so open that it becomes vulnerable to those seeking to transform it into a totalist system. A key challenge to society, then, is arriving at an ethical consensus which allows for pluralism without inviting moral (used here as a synonym for ethical) anarchy.
This challenge poses a dilemma: What ethical rules shall govern how we attempt to influence one another while we struggle toward a consensus concerning a broad range of fundamental ethical issues, including the ethics of social influence? If, for example, person A believes that media manipulation is an ethical means of resolving disputes, while person B rejects this position, is it ethical for A to manipulate the media in an attempt to persuade B to change his or her position regarding the ethics of media manipulation?
The tolerance for manipulation in politics renders the public policy arena a dubious forum for a discussion of ethics. Perhaps, then, scholarly and professional disciplines, which are well grounded in rationalistic ethics, might best begin the debate in a nonpublic way. Psychology ought to play a key role

[57]

in this endeavor, for our field has studied the psychological mechanisms underlying persuasive processes. It is necessary to understand the full range of ways in which we persuade one another before we can rationally discuss the ethical boundaries of persuasion. And these boundaries must be defined before we can arrive at an acceptable consensus on other ethical issues. Otherwise, the debate will be riddled with accusations of cheating.
The study of deceptive and indirect techniques of persuasion and control in cults and large group awareness trainings charts the unethical extreme of persuasive processes, thereby serving as a frame of reference for one end of the influence continuum. A large body of social psychological research is applicable to an understanding of other parts of the influence continuum. More research is required, however. Furthermore, this huge body of data needs to be integrated in a way that will help clarify the ethical aspects of social influence. Then, perhaps, disputing groups within society can attempt to influence each other, within ethical boundaries, on broader ethical/moral questions (e.g., sexuality, the death penalty, abortion, drug-taking, criminal punishment) in the hope of coming to a working moral consensus without being sidetracked by "games" in which one side tries to manipulate, rather than reason with, the other.

Recommendations

[58]

Research

Recommendation. Psychologists should devote more effort toward understanding the mechanisms of action, effects, and ethical implications of social influence techniques, especially those that are deceptive and indirect.

Discussion. Much of the laboratory work on social influence is applicable to the understanding of behavioral changes observed in cults and LGATs. In many ways, the latter appear to be real-life examples of mechanisms discovered in the laboratory. They are, however, more complex and subtle than laboratory studies suggest. Clinical and first-person accounts of radical, sudden conversions, for example, often suggest the presence of hypnotic-like processes, rendered more effective, perhaps, by a particularly vulnerable subject. These processes are not as well understood as they should be.
Additional research ought also be directed toward the notion of an influence continuum (see Figure 1). If this proposal has merit, it will help clarify the ethical implications of influence techniques, which also require further study. Paradoxically, however, ethical constraints on experimentation may seriously impede a psychological understanding of the very influence processes that call for ethical analysis. Many of the classic experiments in social influence (e.g., Milgram, 19xx) would not be

[59]

performed today. It may be necessary, therefore, for psychologists to use less restrictive methodologies in studying these processes. Perhaps, for example, we should develop participant observation methodologies that are more sensitive to psychological subtleties than are the participant observation methods of sociologists and anthropologists.

Recommendation. The study of deceptive and indirect techniques of persuasion and control should include study of how such techniques can be resisted and neutralized, and how those harmed by such techniques can be provided more appropriate therapy.

Discussion. So much psychological effort has been devoted to benevolent attempts to help people change that we psychologists easily forget that change agents can have malevolent goals as well. If we are the experts on individual behavioral change, then we are responsible for studying how individuals can defend themselves against change agents. Precious little work has been done in this area, a notable exception being an article by Andersen and Zimbardo (1985). Once again, however, the messiness of this subject in the real world suggests that non-traditional methodologies may have to be used.
Treatment of cultists and their families, whose needs differ in many ways from those of other clients (e.g., the need for information

[60]

about cultic influence processes), also requires additional study, although some work has been done (Addis et al., 1985; Langone, 1985; Langone & Clark, 1984; Singer, 1986).

Professional Ethics and Education

Recommendation. The American Psychological Association ought to consider how future versions of APA's ethical code and ethical casebook material should be revised in light of the ethical implications of deceptive and indirect techniques of persuasion and control used in LGATs, innovative psychotherapies, and other settings.

Discussion. This Task Force's investigation clearly suggests that many individuals are harmed as a result of participation in certain LGATs or "therapies." Although only a minority of participants may be casualties, the harm to the few is serious. Furthermore, since many members of the American Psychological Association support groups that have been criticized, a close examination of their practices is called for in order to ensure fairness and objectivity.
We believe that such an investigation ought to look carefully at the ethical ramifications of the practices in question. If, for example, considerable risk is, in fact, associated with certain LGATs and innovative therapies, ought psychologists be admonished against endorsing or participating in them without first

[61]

producing convincing evidence that they are sufficiently beneficial to warrant the risk and that adequate, not merely nominal, safeguards have been taken to minimize the risk? Ethics demands such caution in research and traditional therapy. Why not in nontraditional forms of therapy and therapy-like processes as well?
Psychology must also be more forthright about the use of potent psychological techniques by nonprofessionals. Otherwise, accumulating abuses and successful tort cases within the "psychological services industry" may ultimately lead to regulations that could adversely affect ethical psychologists. In addition to increasing insurance costs, a proliferation of tort cases may also intimidate psychologists by causing them to fear (however unrealistically) that disgruntled clients will liken their experience to that of a person who won $500,000 in damages from an LGAT. Even baseless suits that can be summarily dismissed are often expensive, time-consuming, and frightening.
Notably, many evangelists have shown a keen interest in developing an ethical code to govern their behavior, mainly because they want to be able to distinguish themselves from the fanatics and phonies who thrive on the Christian fringe (cf. Cultic Studies Journal, Spring/Summer 1986). Ethical psychologists ought also pay more attention to distinguishing

[62]

themselves from the fanatics and phonies who thrive on the fringe of the "psychological services industry."

Public Policy

Recommendation. Because of the sometimes grave consequences of unethical application of deceptive and indirect techniques of persuasion and control, psychologists ought to direct more attention to educating the public about such techniques.

Discussion. This area offers a unique opportunity for those interested in prevention. All too often the harms from which we seek to protect young people (e.g., drugs, teenage pregnancy, delinquency) are actively pursued by the young persons we hope to help. Cocaine feels good.
Young people, however, do not seek to be manipulated and deceived. They may long for an easy way to fulfill painful needs. But, except perhaps in certain pathological cases, they do not want to be the objects of "mind games." Therefore, preventive efforts aimed at teaching them how "mind games" work may have much potential.
Preventive efforts ought also be directed toward adults. Psychologists and cooperating free-lance writers ought to write more articles and books on how to identify and resist deceptive and indirect techniques of persuasion and control. (Cialdini's

[63]

Influence, 1985, and Andersen and Zimbardo's brief article, "Resisting Mind Control," November 1980, are examples of this genre.) The profession ought also prepare guidelines for personnel managers in business and government in order to help them more effectively evaluate LGATs and other psychological programs. Lifespring, for example, failed to sell its training to the U.S. Air Force because of an ABC 20/20 investigation. Helping executives make informed decisions about programs within the psychological services industry ought not be a task left to journalists alone.

Recommendation. Because the growing volume of litigation related to adverse consequences of deceptive and indirect techniques of persuasion and control poses a potential threat to consumers and ethical psychologists, the American Psychological Association ought to consider advocating stricter regulations regarding nonprofessionally run programs that seek to change behavior through the systematic application of deceptive and indirect techniques of persuasion and control.

Discussion. Psychology obviously cannot exercise a monopoly over the use of psychological techniques, any more than physicians can exercise a monopoly over the intake of food. Nevertheless, we are obligated to speak out about abuses. And we are obligated at least to study the possibility of advocating

[64]

regulations when purportedly non-psychological programs systematically use specialized psychological techniques in ways that make ethical psychologists blanch.

[65]

Figure Caption
Figure 1. The Influence Continuum.

[66]

[Vertical axis: Increasing Levels of Influence]

Mode of Influence: Choice-respecting (emphasis on message)

  Method of Influence: Educative/Therapeutic
    - Reflection
    - Clarification
    - Discussion
    - Information giving
    - Directed questioning
    - Creative expression

  Method of Influence: Advisory/Therapeutic
    - Commenting on problem or alternatives
    - Suggesting ideas or solutions
    - Recommending solutions
    - Rational argument: message oriented
    - Hypnosis (some forms)

Mode of Influence: Compliance-gaining (emphasis on response)

  Method of Influence: Persuasive/Manipulative
    - Rational argument: compliance oriented
    - Emotional appeals
    - Compliance tactics: consistency, reciprocation, social proof, authority, liking, scarcity (see Cialdini, 1985)
    - Gross deceptions
    - Hypnosis (some forms)

  Method of Influence: Controlling/Destructive
    - Isolation from social supports
    - Selective reward/punishment
    - Denigration of self and of critical thinking
    - Dissociative states to suppress doubt and critical thinking
    - Alternation of harshness/threats and leniency/love
    - Control-oriented guilt induction
    - Active promotion of dependency
    - Debilitation
    - Physical restraint/punishment
    - Pressured public confessions
[67]

Figure 2. Hypothetical Influence Climates Profile.

[68]

[69]

Figure 3. Influencer Goals and the Influence Continuum.

[missing in the original]

 

Back to "Liar, Liar": Brainwashing, CESNUR and APA, our general discussion of the brainwashing controversy.
Back to the list of available documents on the APA brainwashing controversy.


Back to the CESNUR Page on Brainwashing and Mind Control Controversies


[Home Page] [Cos'è il CESNUR] [Biblioteca del CESNUR] [Testi e documenti] [Libri] [Convegni]

[Home Page] [About CESNUR] [CESNUR Library] [Texts & Documents] [Book Reviews] [Conferences]

Fri, Dec 10, 1999