ICSA Today, Volume 11, Issue 1, 2020
Universal Social-Influence Processes
In understanding how and why people may be recruited into
cultic groups, I have found the work of Robert Cialdini particularly
helpful. In Influence: Science & Practice (1984), Cialdini describes
six social-influence processes: reciprocation, commitment and
consistency, social proof, liking, authority, and scarcity. In Pre-
Suasion: A Revolutionary Way to Influence and Persuade (2018),
he adds a seventh universal social-influence process: unity. He
proposes that these processes tend to produce a “click, whirr”
automatic response in people. These unconscious, automatic
reactions arose as survival responses in our early, physically
dangerous environments; they were usually correct and gave
us an evolutionary advantage. Sometimes, however, the
automaticity of these ingrained responses can be maladaptive.
In Pre-Suasion…, Cialdini maintains that, when humans in
our modern age find themselves overwhelmed by
information, fake news, stress, and insecurity, they tend to take
the same age-old mental shortcuts, unconsciously betting on
the odds, and often end up with the same unreflective
responses. In our digital, high-technology age, the
tendency toward this primitive-consent automaticity (based
on our evolutionary history) becomes potentially dangerous.
People often feel “overwhelmed” and stressed, unable to
fully understand or control the increasing flood of new and
revolutionary processes and inventions that are taking control of
our lives. We become cognitively and emotionally numbed out.
It is this overwhelmed state that triggers the old, unreflective,
automatic reflexes of survival. This uncritical mental state makes
us vulnerable and can be used (consciously or unconsciously) by
cultic and extremist leaders and groups to gain influence.
In Influence…, Cialdini explains how his initial six fundamental
psychological principles direct human behavior. They give the
tactics of influence their power. These influence processes are
used in all human groups, and are often consciously employed by
advertising firms, businesses, and political groups, for example.
He calls these principles weapons of influence. In Pre-Suasion…,
he mentions how the (now seven) principles and tactics may be
used by narcissistic leaders and their cultic and extremist groups
in recruiting new members:
Reciprocity. If you give me something (goods or services),
I am obliged to give you something back. This is the
basis of social give-and-take. It is deep-seated primate
behavior. Social bonding and hierarchy building are
ultimately based on this behavior. For example, the
Moonies (Unification Church) gave free flowers to people.
This was a successful fundraising and support-building
effort because people felt obliged to give them something
back: money or liking, or even some form of commitment.
The Hare Krishnas (ISKCON) gave passersby a free copy of
the Bhagavad Gita, and people tended to feel obliged to
give something in return. In my group, the guru gave me
personal recognition and hundreds of dollars in gifts so I
felt obligated to work hard for his mission.
Consistency. As one interpretation of the saying “In
for a penny, in for a pound” suggests, once an initial
commitment is made (even if it is relatively insignificant),
we have a tendency not to want to go against our original
decision. This inclination comes from an unwillingness
to admit we have made a mistake in the first place. In
psychological theory and research, this is called cognitive
dissonance, our tendency to resolve an apparent
contradiction (dissonance) by rationalizing or suppressing
the dissonant element. In one study, for example, when
the end of the world didn’t occur at the appointed
time, instead of followers feeling misled and becoming
disillusioned, the belief of many actually increased
because the cult leader explained that their “devotion” had
saved the world from destruction! This explanation was
consistent with and reconfirmed their previous belief.1 In
my group, males were required to wear white clothes to
spiritual functions. At first I thought this was silly. I felt like
an ice cream salesman! I was convinced by a friend that
this was only a cultural ingredient of the leader’s otherwise
very idealistic and spiritual path. Wanting to conform,
I concluded, “What the heck, no big deal.” I went along
with the requirement, and I began to feel more like one of
the group. This was just the first small step of many later
rationalizations that kept coming.
Social Proof. “Truth ‘R Us”—we are convinced by the
actions of trusted individuals, especially those like
ourselves, or those we admire: If they think something
is a good thing, well then, it probably is a good thing. As
primates, we tend to follow our group. I became a member
of a high-demand, cultic group in part because I was
impressed by the high quality and idealism of the leader’s
disciples—many of whom are still friends today, after 23
years in the group, and 16 years after having left it.
Liking. Friendly thieves—physical attractiveness (culturally
defined) attracts! Similarity in background, interests,
values, likes and dislikes, and so on, also attracts, as do
compliments, being recognized, and shared aspirations
and values. Like the other six principles, this influence
process is almost invisible.
Authority. Directed deference—a deep-seated, primate
tendency to trust authority (even blindly)—leads us to
respect those with the apparatus of authority: military or
police uniforms, priests’ collars, gurus’ robes, doctors’ white
jackets and stethoscopes, chief executives’ expensive suits
and ties, scientists’ and officials’ clipboards, and the like.
“Follow the group leader” has been successful evolutionary
logic for social animals for thousands of years. The classic
studies of this phenomenon are Solomon Asch’s laboratory
experiments (1956), Stanley Milgram’s Yale experiments
(1963), and Philip Zimbardo’s Stanford Prison Experiment
(ending in 1971). Asch’s experiments showed how human
perception could be influenced by external social factors;
Milgram’s experiments focused on how authority figures
(doctors, professors, psychiatrists, etc.) can influence and
persuade us to go against our own better moral judgments;
Zimbardo’s social-psychology experiment involved role-playing
between powerful prison guards and powerless prisoners,
and the insidious development of coercive control and
even brutality.
In my personal case, I was impressed by the fact that
our leader regularly met with famous world leaders,
celebrities, musicians, athletes, and religious leaders (e.g.,
Gorbachev, Mandela, Mother Teresa, Carl Lewis, Whitney
Houston, Carlos Santana [who actually became a disciple;
the others were classified as “admirers”], numerous Nobel
Prize winners, Bill Clinton and numerous other politicians,
Ravi Shankar, Leonard Bernstein, Pir Vilayat Khan, Mats
Wilander). (This was also a case of social proof.)
Scarcity. If we can be convinced that something (goods,
services, psychological or spiritual benefits, or even smiles
or tenderness) is in short supply, we will be influenced to
want to have it—again, this is deep-seated, evolutionary
behavior. For example, my group leader was sparing
and unpredictable in giving personal recognition and
praise. Once, after the group had spent all night framing
and hanging the leader’s pen-and-ink drawings at an art
gallery, the guru smiled at me. I thought, “Did you see that!
Guru smiled at me! It was all worth it!”
Unity. We is the shared Me. The ability to influence (and
change) others is often, and importantly, grounded
in shared personal relationships, which create a “pre-suasive”
context for assent (e.g., “OK, my dear friend, yes,
I’ll join…”). This term was coined by Cialdini, and it means
that the influencer lays the groundwork for an eventual
attempt at persuasion—before the actual influence tactics
(or “weapons of persuasion” [Cialdini, 1984/1993]) are
employed. The relationship of unity is not “Oh, that person
is like us” (although this also works, to a lesser degree);
rather, it is “Oh, that person is us.” Unity is about shared
identities. It’s about the categories that individuals use to
define themselves and their groups, such as race, ethnicity,
nationality, and family, and also political and religious
affiliations. A key characteristic of these categories is that
their members tend to feel one with, merged with, the
others. There is an overlapping of self and other identities
within we-based groups. In the group I was in for more
than twenty years, we all honestly felt that we were
brothers and sisters, and we shared our lives together. This
level of relationship resulted in an intense feeling of
oneness, belonging, and security.
My Personal Experience of Being Recruited/Joining
When I was recruited into a New Age, high-demand yoga
group, I was a graduate student in a foreign country without a
full-time job; I was largely unaware of my own personality-based
and evolutionary weaknesses. I was also totally unaware of
cultic groups, narcissistic spiritual leaders, and social-influence
processes. This was in the early 1970s, before the exposés
concerning high-profile, cult-like groups such as the Branch
Davidians, Scientology, Aum Shinrikyo, Catholic Church sects
and pedophile priests, the Osho/Rajneesh group, the Unification
Church (Moonies), and others. It was a time often viewed as one
of relative innocence, reflected in the Beatles, Eastern meditation,
self-improvement, and progressive social change. I had started
practicing Transcendental Meditation and was spiritually/
philosophically seeking. As a high-school student I listened to
Bob Dylan; Muddy Waters; Joan Baez; Doc Watson; Lightnin’
Hopkins; John Lee Hooker; Peter, Paul, and Mary; Pete Seeger;
Judy Collins… Although I came from a conservative, middle-class,
Republican family, I was relatively open-minded.
A relative of my wife had become a disciple of a serious, celibate,
and “God-Realized Master,” an Indian guru living in New York City
(NYC) who was recognized and accepted at the United Nations
headquarters. I resisted this relative’s well-intended and honest
efforts for about four years. However, when my wife accompanied
the relative on a spiritual pilgrimage to the guru’s NYC ashram
and was convinced to become a disciple, I was forced to make
a decision. At the time I made my decision, I was (ironically) at
a bathing-suit-optional beach, drinking a beer, and enjoying a
beautiful, sensual summer. I loved my wife and respected her
decision, so I eventually shaved off my beard and mustache,
cut off my ponytail, stopped drinking beer, and also became
a disciple on a very serious, spiritual path. We later moved to
Queens, NYC to be where the guru was.
As I am able to recognize now, many of the required values,
beliefs, and behaviors for this discipleship were selectively
concealed in the beginning. We were told we could merely
“reduce” our sexual activity, for example; we did not have to
totally renounce it. Later, however, we found out that celibacy
was a demand for single disciples. As novices, we were gradually
induced to conform as we became more solid and more
integrated into the group. By then, however, we had close and