and peers (social proof). Recruiters can structure the context
so that there appears to be even more uncertainty than the
troubled youths were experiencing previously. This environment
enables the recruiters to manipulate and coach young recruits
toward destructive behavior (unlike true leaders, who coach
toward prosocial behavior). The recruiters fake both their
expertise and the available social-support groups; as long as
the recruits' perception is that these elements are real, that
perception becomes their reality as victims in the radicalization
process.
An interesting example of how we might miss these social
forces arises when we discuss the radicalized youth of Molenbeek.
Frequently we blame government policies, lack of integration
efforts, or the youths' low social status for their radicalization.
But it turns out that the radicalized youth were actively
recruited. Philippe Moureaux, who served for two decades
as Molenbeek’s mayor, described this as “the paradox of
integration.” A less-integrated Turkish community has resisted
the promise of redemption through jihad offered by radical
zealots. Yet the members of a Moroccan community who are
more at home in French-speaking Brussels have seen some
of their young fall prey to recruiters such as Khalid Zerkani, a
Moroccan-born petty criminal who became the Islamic State’s
point man in Molenbeek (Higgins, 2016). This is a powerful
example of how, when they are uncertain, most people will
look at what others just like them are doing. Here, the proof of
a correct choice isn’t based on knowledge or logic or empirical
evidence; it's based on social evidence of what one's peers
and those in one's social network have decided to do. Cialdini
(2010) notes that deceptive recruitment or fraud of this sort is
hardly limited to one ethnic or religious group. For example,
Charles Ponzi, who gave his name to the infamous Ponzi
scheme that Madoff copied, was an Italian immigrant to the
United States who fleeced other Italians.
CM: Most people think that cult members are foolish
somehow, or very naïve. Do you think that is right?
RC: No, I don't think that is right. In fact, the title of the talk
I gave at the conference is "You Don't Have to Be a Fool to Be
Fooled." We are all taken in by these principles to purchase
products, to vote for candidates, to contribute to causes.
These principles reflect the way we work as human beings in
modern society; we can't simply abandon them. They are the
way we decide frequently who[m] to say Yes to and who[m]
to say No to. So true authorities are to be followed—it makes
sense to do so. It would be foolish to abandon the idea that
experts provide important diagnostic information about how
we should comport ourselves in a particular situation; we
can’t just abandon the idea that experts are a good source of
information. So we are all susceptible to them [the deceptive
users of these principles for their own motives].
CM: You mentioned cognitive overload in your lecture
yesterday, and said this was the one thing that made the cultic
experience a little bit different from the type of influence of
other groups, although the use of the principles remains the
same.
RC: Yes, in cults the environment is controlled in two ways.
One is that it is entirely full of cultic activity and ideas, and
there is really no time for anything else… The other way [is
that] all the principles can be used constantly. The one thing
that separates cultic groups from other influencers such as
advertisers, marketers, and fundraisers who use these tactics
is that the influence environment in cults is pervasive; it is
constant, and there is no escape from it. That's why in the
data we collected recently we find that former cult members
report more intensive exposure to each of the principles in my
book than members of any other group.
CM: You have a lot of knowledge about influence. Do you
think it is at all possible for you to be recruited into a business
cult, a pyramid cult, or a religious cult?
RC: This is a good question because when I did the research for
my book, one of the things I did was to infiltrate the training
programs of as many of the influence professions in society
as I could get access to. I learned how to sell automobiles,
portrait photography, insurance, encyclopedias, et cetera.
I also infiltrated some fund-raising organizations to see what
they did to be more successful—to train their recruits and be
more successful at getting donations and contributions for their
causes. But I never felt comfortable infiltrating cultic groups
because I read stories about journalists who, under the guise
of becoming a recruit so they could write their story, joined
cults, but they haven’t come out yet.
CM: Why do you think that is?
RC: That is because of the power of the principles of influence.
The cults are able to employ them in settings that they control
completely. They don’t control in a physical way members’
ability to leave, but they control [it] psychologically, by having
complete control over the information and communication
environments in which people in those groups exist.
They are surrounded by others who believe the same thing.
They are led by a charismatic figure who pulls attention and
focus in their direction and isolates members from all other
sources of information as to what constitutes the truth. And
before long, people are swept into a state of believing without
being critical because everything around them tells them
there is no need to be critical. Also, their ability to be critical is
undermined by things such as information overload, poor diet,
lack of sleep, and so on. All this makes it difficult for people to
step back from the situation and do the hard work of
counterarguing.
CM: Can you say something about your research?
RC: In our research we use what is called The Group
Psychological Abuse (GPA) Scale [developed by Chambers,
[Sidebar pull quote: "What the cults do is to take that
information overload, that stimulus saturation and cognitive
overload, and intensify it…"]