Cultic Studies Review, Vol. 5, No. 2, 2006, Page 69
Zablocki's (2001) sociological theory of brainwashing builds upon the pioneering work of
Lifton (1961), who studied thought reform among U.S. POWs in Korea and Chinese students
and intellectuals on the mainland. Zablocki's theory is not about how people enter
charismatic groups, or cults, but "the process of inducing ideological obedience in
charismatic groups" (p. 160). He describes in detail the complex process that enables cultic
groups to build commitment and loyalty among members and, when it serves the leader's
interests, to devote enough resources to selected members to turn them into what he
calls "deployable agents"—that is, members who are uncritically obedient to leaders even in
the absence of external controls. Zablocki's "economic" perspective implies that members
will vary in their commitment to the group/leader because leadership must make resource-
allocation decisions concerning the building of commitment among different members.
Leaders, then, will not put effort into developing a deployable agent unless such a person
can deliver an objective that is worth the resources that the leader expends. Hence,
Zablocki says that there "is no reason to believe that all cults practice brainwashing any
more than that all cults are violent or that all cults make their members wear saffron robes"
(p. 196).
Zablocki's (2001) theory presumes that a necessary but not sufficient condition for
brainwashing to occur is ideological totalism, "a sociocultural system that places high
valuation on total control over all aspects of the outer and inner lives of participants for the
purpose of achieving the goals of an ideology defined as all important" (p. 183). Although
the resocialization process differs among groups, common elements include "a stripping
away of the vestiges of an old identity, the requirement that repeated confessions be made
either orally or in writing, and a somewhat random and ultimately debilitating alternation of
the giving and the withholding of 'unconditional' love and approval" (p. 187). The
resocialization process affects cognitive functioning and emotional networking, which in turn
lead "to the attainment of states of hyper-credulity and relational enhancement,
respectively" (p. 187). Because convictions function more as valued possessions than as a
means of testing reality, "a frontal attack on convictions, without first undermining the self-
image foundation of these convictions, is doomed to failure" (p. 188). The assault on
members' identity is compensated for by the payoff of feeling more "connected with the
charismatic relational network" (p. 188), which ultimately brings about an identification with
the group, an "imitative search for conviction" (p. 189), and "the erosion of the habit of
incredulity" (p. 189). A symbolic death and rebirth marks the completion of the
brainwashing process as "the cognitive and emotional tracks come together and mutually
support each other" (p. 189). With the brainwashing process complete, the individual
perceives the cost of exit to be sufficiently high that compliance with group demands
becomes a rational choice.
Lalich (2004) complements Zablocki's Lifton-based process model. Although she too is most
concerned with the deployability associated with the brainwashing concept, Lalich
approaches the brainwashing phenomenon by examining the complex interaction of the
processes of conversion and commitment. She views conversion, as does this paper, as a
worldview shift that usually occurs within a social context, which can enable converts to
sustain and strengthen their worldview shift. Lalich discusses four interlocking structural
dimensions that underpin the social dynamics of cultic groups:
1. charismatic authority
2. a transcendent belief system
3. systems of control
4. systems of influence