All language subtitles for Through.the.Wormhole.S07E01.What.Makes.a.Terrorist.480p
1
00:00:03,990 --> 00:00:06,020
We're at war,
2
00:00:06,030 --> 00:00:08,190
and the battle lines
have been redrawn.
3
00:00:10,860 --> 00:00:13,130
Terrorism can strike anywhere...
4
00:00:15,730 --> 00:00:18,840
...and everyone is a target.
5
00:00:18,840 --> 00:00:23,140
Science is peering into the dark
heart of terror networks
6
00:00:23,140 --> 00:00:29,010
to find out why terrorists
embrace unspeakable atrocities.
7
00:00:29,010 --> 00:00:31,580
Are they so different from us,
8
00:00:31,580 --> 00:00:33,620
or do we all have the ability
9
00:00:33,620 --> 00:00:36,890
to give our lives for a cause?
10
00:00:36,890 --> 00:00:41,060
Could we learn to
dehumanize our enemies,
11
00:00:41,060 --> 00:00:44,690
or could science show us
a way to stop terrorism...
12
00:00:47,000 --> 00:00:50,470
...and understand
what makes a terrorist?
13
00:00:56,240 --> 00:01:00,810
Space, time, life itself.
14
00:01:03,180 --> 00:01:07,520
The secrets of the cosmos
lie through the wormhole.
15
00:01:07,520 --> 00:01:10,520
Captions by VITAC...
www.vitac.com
16
00:01:10,520 --> 00:01:13,520
Captions paid for by
Discovery Communications
17
00:01:21,000 --> 00:01:22,770
When four planes dropped out
18
00:01:22,770 --> 00:01:27,140
of a clear-blue sky
on September 11th, 2001,
19
00:01:27,140 --> 00:01:30,070
it was shocking to realize
20
00:01:30,080 --> 00:01:33,510
we were all potential targets.
21
00:01:33,510 --> 00:01:37,810
Today, al-Qaeda and ISIS
claim the headlines,
22
00:01:37,820 --> 00:01:40,280
but it was once the IRA
23
00:01:40,290 --> 00:01:43,620
and Basque separatists
in Europe, or anarchists
24
00:01:43,620 --> 00:01:47,220
and the Ku Klux Klan
in the United States.
25
00:01:47,230 --> 00:01:49,030
Terrorists, like any group,
26
00:01:49,030 --> 00:01:51,260
have ideals they believe in...
27
00:01:51,260 --> 00:01:54,200
a cause to rally around.
28
00:01:54,200 --> 00:01:57,470
But for them, the cause
justifies the deliberate
29
00:01:57,470 --> 00:02:01,370
targeting of innocent civilians.
30
00:02:01,370 --> 00:02:04,470
How can they think that way?
31
00:02:04,480 --> 00:02:08,410
Can we peer inside
the mind of a terrorist?
32
00:02:18,120 --> 00:02:21,290
Anthropologist
Scott Atran has spent years
33
00:02:21,290 --> 00:02:24,390
trying to understand the enemy.
34
00:02:24,400 --> 00:02:26,260
He went to the front lines
of Iraq,
35
00:02:26,270 --> 00:02:28,770
embedding himself
with American allies
36
00:02:28,770 --> 00:02:31,740
to meet terrorists face to face.
37
00:02:31,740 --> 00:02:35,710
This is the foremost
position on the front lines.
38
00:02:35,710 --> 00:02:38,940
I'm 10 kilometers west
of Makhmur.
39
00:02:38,940 --> 00:02:41,780
So, we went to Iraq in 2016
40
00:02:41,780 --> 00:02:43,950
to follow up on a battle.
41
00:02:43,950 --> 00:02:46,580
Some 500 Kurdish
soldiers had cornered
42
00:02:46,590 --> 00:02:50,150
fewer than 100 ISIS fighters.
43
00:02:50,160 --> 00:02:52,260
When all appeared
to be lost to them
44
00:02:52,260 --> 00:02:54,260
and the last 15 were retreating,
45
00:02:54,260 --> 00:02:58,900
7 of them blew themselves up
to cover their retreat.
46
00:02:58,900 --> 00:03:02,100
The Kurdish forces, seeing this,
47
00:03:02,100 --> 00:03:04,330
decided it just wasn't
worth trying
48
00:03:04,340 --> 00:03:05,700
to hold on to the territory,
49
00:03:05,700 --> 00:03:07,600
even though
they vastly outnumbered
50
00:03:07,610 --> 00:03:09,540
the Islamic State fighters.
51
00:03:09,540 --> 00:03:12,540
Scott wanted to know
how such a small group
52
00:03:12,540 --> 00:03:16,280
was able to hold off
a much more formidable force.
53
00:03:16,280 --> 00:03:19,650
He sat down to interview
captured ISIS fighters
54
00:03:19,650 --> 00:03:22,020
and put them through
a battery of tests
55
00:03:22,020 --> 00:03:24,350
to assess the strength
of the group.
56
00:03:28,130 --> 00:03:30,230
Yes.
57
00:03:30,230 --> 00:03:33,460
Scott discovered there
was an incredible bond
58
00:03:33,470 --> 00:03:35,600
between these fighters.
59
00:03:35,600 --> 00:03:39,000
They would do anything
for their cause.
60
00:03:41,570 --> 00:03:42,910
Scott has been studying
61
00:03:42,910 --> 00:03:45,840
the dynamics of groups
for years.
62
00:03:45,840 --> 00:03:48,580
Humans are
the preeminent group animal.
63
00:03:48,580 --> 00:03:52,250
That's what allows us
to build things like that.
64
00:03:52,250 --> 00:03:53,780
It's a collective effort.
65
00:03:53,790 --> 00:03:54,920
They have to coordinate,
66
00:03:54,920 --> 00:03:57,090
and they have to take risks
for one another.
67
00:03:57,090 --> 00:04:00,290
In fact, they have to be able
to sacrifice for one another.
68
00:04:00,290 --> 00:04:03,860
One World Trade stands
as a testament to America's
69
00:04:03,860 --> 00:04:07,900
resolve after September 11th.
70
00:04:07,900 --> 00:04:10,870
When the towers
came down in 2001,
71
00:04:10,870 --> 00:04:14,540
it was the iron workers
from Local 40, amongst others,
72
00:04:14,540 --> 00:04:18,910
who rebuilt the site
a decade later.
73
00:04:18,910 --> 00:04:24,180
Terrorists and iron workers
seem like polar opposites to us.
74
00:04:24,180 --> 00:04:25,880
But to an anthropologist,
75
00:04:25,880 --> 00:04:28,620
both groups share
a crucial trait.
76
00:04:32,260 --> 00:04:35,890
These New York City
iron workers risk their lives
77
00:04:35,890 --> 00:04:40,030
every day for each other
maneuvering these massive beams
78
00:04:40,030 --> 00:04:43,600
while balancing hundreds
of feet in the air.
79
00:04:43,600 --> 00:04:46,040
Just set it down here.
80
00:04:46,040 --> 00:04:49,170
They have an extremely
close group dynamic.
81
00:04:49,170 --> 00:04:53,240
Today, Scott is going to measure
how strong it is,
82
00:04:53,250 --> 00:04:55,950
administering to them
the same test
83
00:04:55,950 --> 00:04:59,420
he has given
to terrorists in Iraq.
84
00:04:59,420 --> 00:05:02,020
We're interested
in how individuals relate
85
00:05:02,020 --> 00:05:05,290
to the groups they live with
and they work with.
86
00:05:05,290 --> 00:05:07,320
The test is simple.
87
00:05:07,330 --> 00:05:10,090
It involves five cards.
88
00:05:10,100 --> 00:05:12,660
Each card shows
a pair of circles.
89
00:05:12,660 --> 00:05:15,170
The small one represents me.
90
00:05:15,170 --> 00:05:18,340
The large circle
represents the group.
91
00:05:18,340 --> 00:05:20,470
In the first pairing,
the "me" circle
92
00:05:20,470 --> 00:05:23,770
and the "group"
circle are totally separate.
93
00:05:23,780 --> 00:05:27,040
But gradually,
they begin to overlap.
94
00:05:27,050 --> 00:05:29,450
In the fifth pairing,
the "me" circle
95
00:05:29,450 --> 00:05:32,520
is entirely contained
within the "group" circle,
96
00:05:32,520 --> 00:05:35,790
the two identities fully fused.
97
00:05:35,790 --> 00:05:37,590
And what I'd like you to do
98
00:05:37,590 --> 00:05:41,620
is pick the one that defines
the way you think you are.
99
00:05:41,630 --> 00:05:43,460
The iron workers
see their identities
100
00:05:43,460 --> 00:05:46,760
closely overlapped
by the group's.
101
00:05:46,770 --> 00:05:48,800
Scott expects this...
102
00:05:48,800 --> 00:05:51,530
it's a natural byproduct
of a tight-knit group
103
00:05:51,540 --> 00:05:54,600
in a highly risky profession.
104
00:05:54,610 --> 00:05:57,570
So, how do the iron workers'
answers to the test
105
00:05:57,580 --> 00:05:59,940
compare to those of terrorists?
106
00:06:03,450 --> 00:06:05,980
The terrorists' answers
were unlike those
107
00:06:05,980 --> 00:06:08,690
of any other group
Scott had studied.
108
00:06:08,690 --> 00:06:11,120
For them, the group
was all-consuming.
109
00:06:11,120 --> 00:06:14,360
The individual was nonexistent.
110
00:06:14,360 --> 00:06:16,960
Well, they did this
with guerrillas in Libya,
111
00:06:16,960 --> 00:06:18,960
where they actually
rubbed it out
112
00:06:18,960 --> 00:06:20,800
and blackened out
the "me" in the group,
113
00:06:20,800 --> 00:06:22,330
as if there were no difference.
114
00:06:22,330 --> 00:06:25,270
Well, then we say they're
completely fused with the group.
115
00:06:27,510 --> 00:06:30,910
What makes their
dedication so extreme?
116
00:06:30,910 --> 00:06:33,010
Scott says it has
to do with the values
117
00:06:33,010 --> 00:06:35,480
that hold the group together.
118
00:06:35,480 --> 00:06:38,050
Iron workers are loyal
to their brothers,
119
00:06:38,050 --> 00:06:41,320
but probably wouldn't go to work
if they weren't paid
120
00:06:41,320 --> 00:06:44,350
or their families
were being threatened.
121
00:06:44,360 --> 00:06:46,120
Not so for ISIS fighters,
122
00:06:46,120 --> 00:06:48,930
who see their group
as their salvation
123
00:06:48,930 --> 00:06:53,300
and an Islamic state
as absolutely non-negotiable.
124
00:06:53,300 --> 00:06:56,900
These values have become
sacred to them.
125
00:06:56,900 --> 00:06:59,640
Sacred values are
often religious values,
126
00:06:59,640 --> 00:07:01,600
because they're transcendent.
127
00:07:01,610 --> 00:07:04,070
You wouldn't compromise
or negotiate
128
00:07:04,080 --> 00:07:06,310
to trade off your children.
129
00:07:06,310 --> 00:07:08,450
Many of us wouldn't
negotiate or compromise
130
00:07:08,450 --> 00:07:11,610
to trade off
our country or our religion
131
00:07:11,620 --> 00:07:15,350
for any amount of money,
under any social pressure.
132
00:07:22,060 --> 00:07:24,390
They respond on
all other measures...
133
00:07:24,400 --> 00:07:27,760
willingness to fight, to die,
to sacrifice for one another,
134
00:07:27,770 --> 00:07:31,070
to torture,
to have their families suffer.
135
00:07:31,070 --> 00:07:35,140
And so, if you have a group
of people fused around
136
00:07:35,140 --> 00:07:36,710
a set of ideas they hold sacred,
137
00:07:36,710 --> 00:07:39,880
for which no negotiation
and compromise is possible,
138
00:07:39,880 --> 00:07:41,480
well, then you have
the most formidable
139
00:07:41,480 --> 00:07:44,610
fighting force possible.
140
00:07:44,620 --> 00:07:46,920
And ISIS is no exception.
141
00:07:46,920 --> 00:07:50,490
Scott's work has shown
how terrorists are single-minded
142
00:07:50,490 --> 00:07:54,060
and absolutely committed
to a cause.
143
00:07:54,060 --> 00:07:55,830
But how do they
take this behavior
144
00:07:55,830 --> 00:07:59,360
so far as to load a car
full of explosives
145
00:07:59,360 --> 00:08:02,300
and blow up a building
full of innocent people?
146
00:08:07,040 --> 00:08:11,940
How do we shut down our
ability to care for others?
147
00:08:11,940 --> 00:08:15,040
According to psychologist
Jay Van bavel,
148
00:08:15,050 --> 00:08:17,310
our brains have
a remarkable ability
149
00:08:17,320 --> 00:08:21,890
to dehumanize others
under the right circumstances.
150
00:08:21,890 --> 00:08:24,690
Jay has designed
a test to demonstrate
151
00:08:24,690 --> 00:08:28,390
how we recognize humanity
in the first place.
152
00:08:28,390 --> 00:08:31,830
People came in,
we took an image of a doll
153
00:08:31,830 --> 00:08:35,430
and morphed it with a human face
that looked very similar.
154
00:08:35,430 --> 00:08:38,030
And when does this face
look human to you?
155
00:08:41,310 --> 00:08:44,470
Jay asked participants
the same question.
156
00:08:44,480 --> 00:08:47,180
And right about now.
157
00:08:47,180 --> 00:08:48,680
Just past the halfway mark
158
00:08:48,680 --> 00:08:51,480
between the doll face
and the real face,
159
00:08:51,480 --> 00:08:54,150
most of us perceive
that the face is human
160
00:08:54,150 --> 00:08:55,690
and has a mind.
161
00:08:55,690 --> 00:08:57,520
Right... There.
162
00:08:57,520 --> 00:09:02,030
In Jay's field, it's called
"perception of mind."
163
00:09:02,030 --> 00:09:05,930
But he says it's not fixed.
164
00:09:05,930 --> 00:09:09,300
Jay adjusted the experiment,
and told his subjects
165
00:09:09,300 --> 00:09:12,540
what college the person
in the photo went to.
166
00:09:12,540 --> 00:09:15,370
Their own college,
an in-group...
167
00:09:15,370 --> 00:09:17,670
And right about now.
168
00:09:17,680 --> 00:09:20,040
...or a rival college
across town,
169
00:09:20,050 --> 00:09:21,740
an out-group.
170
00:09:21,750 --> 00:09:25,320
This rivalry had
a measurable effect.
171
00:09:25,320 --> 00:09:27,980
What we found
was that these students
172
00:09:27,990 --> 00:09:31,690
were able to see a mind behind
the in-group face much faster.
173
00:09:31,690 --> 00:09:33,690
And they were slower
to see the humanity
174
00:09:33,690 --> 00:09:35,990
in students
from another college.
175
00:09:35,990 --> 00:09:38,030
Right about now.
176
00:09:38,030 --> 00:09:41,500
These are subtle kinds
of dehumanization.
177
00:09:41,500 --> 00:09:44,270
How do we get into
the mind of a terrorist
178
00:09:44,270 --> 00:09:46,570
who can view a tower
full of civilians
179
00:09:46,570 --> 00:09:49,440
as a justifiable target?
180
00:09:49,440 --> 00:09:52,540
Jay believes it's easier
than you think.
181
00:09:52,540 --> 00:09:55,550
For a second experiment,
Jay divides subjects,
182
00:09:55,550 --> 00:09:57,950
arbitrarily, into two teams...
183
00:09:57,950 --> 00:10:00,980
the "rattlers"
and the "eagles."
184
00:10:00,990 --> 00:10:03,790
We told people that we
were keeping track of points,
185
00:10:03,790 --> 00:10:06,160
and the winning team
would walk away with the money.
186
00:10:06,160 --> 00:10:09,030
Jay's creating
the essence of conflict...
187
00:10:09,030 --> 00:10:12,300
two groups competing
for resources.
188
00:10:12,300 --> 00:10:15,870
So, now I'm gonna read both
teams a number of statements.
189
00:10:15,870 --> 00:10:18,330
He reads the group
short stories about people
190
00:10:18,340 --> 00:10:22,370
who supposedly belong
to neither team.
191
00:10:22,370 --> 00:10:26,840
Jane managed to get indoors
before it started to rain.
192
00:10:26,850 --> 00:10:30,250
These stories
are designed to measure empathy.
193
00:10:30,250 --> 00:10:32,550
If you empathize
with the person's story,
194
00:10:32,550 --> 00:10:34,820
you put your thumb up.
195
00:10:34,820 --> 00:10:37,450
If you don't care, thumb down.
196
00:10:37,460 --> 00:10:42,590
Brandon got soaked by
a taxi driving through a puddle.
197
00:10:42,590 --> 00:10:47,930
Eagles and rattlers give
equally empathetic responses.
198
00:10:47,930 --> 00:10:51,530
Then Jay changes the game.
199
00:10:51,540 --> 00:10:53,670
Andrew sat in gum
on a park bench.
200
00:10:53,670 --> 00:10:56,740
Andrew is a member
of the eagles.
201
00:10:56,740 --> 00:10:58,510
Serves him right.
202
00:10:58,510 --> 00:11:01,880
Even in an entirely
artificial setting,
203
00:11:01,880 --> 00:11:04,050
the results are profound.
204
00:11:04,050 --> 00:11:06,420
Once there's competition
added to the mix,
205
00:11:06,420 --> 00:11:08,220
conflict can
escalate very quickly
206
00:11:08,220 --> 00:11:09,790
towards the out-group.
207
00:11:09,790 --> 00:11:12,320
Team members
actually take pleasure
208
00:11:12,320 --> 00:11:14,690
in the other team's pain.
209
00:11:14,690 --> 00:11:17,230
Remember, they've only been
assigned to these teams
210
00:11:17,230 --> 00:11:19,830
for a matter of minutes.
211
00:11:19,830 --> 00:11:21,300
The thing
that's really remarkable
212
00:11:21,300 --> 00:11:23,430
is that there needs to be
no history of conflict
213
00:11:23,440 --> 00:11:25,440
or any stereotypes
towards the groups
214
00:11:25,440 --> 00:11:28,370
for them to start feeling
negative towards them.
215
00:11:28,370 --> 00:11:30,440
Right there.
216
00:11:30,440 --> 00:11:33,480
Jay's tests reveal
that dehumanization can happen
217
00:11:33,480 --> 00:11:35,480
with very little prodding.
218
00:11:35,480 --> 00:11:37,180
Whenever groups compete,
219
00:11:37,180 --> 00:11:40,120
the dehumanization process
begins.
220
00:11:42,190 --> 00:11:45,790
When that competition
includes whole cultures,
221
00:11:45,790 --> 00:11:48,020
the results can be deadly.
222
00:11:54,370 --> 00:11:57,930
If science has discovered
anything about terrorists,
223
00:11:57,940 --> 00:12:00,770
it's that we all have
the potential
224
00:12:00,770 --> 00:12:02,910
to think like one.
225
00:12:02,910 --> 00:12:07,110
All of our brains
are programmed to dehumanize.
226
00:12:07,110 --> 00:12:11,410
We all have values
that we can't compromise.
227
00:12:11,420 --> 00:12:14,480
So what stops us from thinking
228
00:12:14,490 --> 00:12:18,190
and acting in extreme ways?
229
00:12:18,190 --> 00:12:21,960
The answer to that lies less
in what you believe
230
00:12:21,960 --> 00:12:24,160
and more in who you know.
231
00:12:28,330 --> 00:12:30,530
After every big
coordinated terrorist attack
232
00:12:30,540 --> 00:12:34,140
in Europe, the U.S., or Asia,
233
00:12:34,140 --> 00:12:37,610
we wonder how it
could have happened again.
234
00:12:37,610 --> 00:12:39,140
How did a terrorist cell
235
00:12:39,140 --> 00:12:41,680
live among us for months
or years
236
00:12:41,680 --> 00:12:44,350
before deciding to strike?
237
00:12:44,350 --> 00:12:47,650
What's the glue
that binds them together,
238
00:12:47,650 --> 00:12:51,450
and how can we dismantle them
before they strike again?
239
00:12:53,790 --> 00:12:55,190
A café in central London
240
00:12:55,190 --> 00:12:57,890
may seem like
a strange place for a lab,
241
00:12:57,900 --> 00:13:00,130
but this is where
social psychologist
242
00:13:00,130 --> 00:13:03,100
Nafees Hamid is trying
to uncover
243
00:13:03,100 --> 00:13:05,700
the structure
of terror networks.
244
00:13:05,700 --> 00:13:09,870
My goal is to understand
the "how" of recruitment.
245
00:13:09,880 --> 00:13:12,440
If you start with how,
you understand
246
00:13:12,440 --> 00:13:15,040
the networks that are in play.
247
00:13:15,050 --> 00:13:16,950
Nafees is trying to figure out
248
00:13:16,950 --> 00:13:19,180
how terror networks function
249
00:13:19,180 --> 00:13:22,050
and how they can
be pulled apart.
250
00:13:22,050 --> 00:13:24,250
It's not a simple task.
251
00:13:24,260 --> 00:13:25,460
There's a lot of
theories out there,
252
00:13:25,460 --> 00:13:27,390
but there's very little data,
253
00:13:27,390 --> 00:13:29,290
and this is largely
because it's hard
254
00:13:29,290 --> 00:13:31,860
to get jihadis
to go into a laboratory.
255
00:13:31,860 --> 00:13:33,230
But Nafees,
256
00:13:33,230 --> 00:13:35,970
a Pakistani-American
living in London,
257
00:13:35,970 --> 00:13:39,040
tried something a little out
of the ordinary.
258
00:13:41,170 --> 00:13:43,570
He called them.
259
00:13:43,580 --> 00:13:45,280
Salaam alaikum, brother.
260
00:13:45,280 --> 00:13:47,780
How are you?
How's it going?
261
00:13:47,780 --> 00:13:51,010
I decided to just contact them
directly online...
262
00:13:51,020 --> 00:13:54,050
found them on social media,
on Twitter, on various accounts.
263
00:13:54,050 --> 00:13:56,450
Getting terrorists on the phone
264
00:13:56,450 --> 00:13:58,920
is not as hard as you think.
265
00:13:58,920 --> 00:14:00,590
I'm very honest,
and that's the key.
266
00:14:00,590 --> 00:14:02,590
There's so many people,
intelligence officers
267
00:14:02,590 --> 00:14:04,460
and so forth,
who are pretending to be people
268
00:14:04,460 --> 00:14:06,460
they're not to contact
these people,
269
00:14:06,460 --> 00:14:09,000
and they can usually
sniff those people out.
270
00:14:09,000 --> 00:14:11,270
How's the weather in Sham today?
271
00:14:11,270 --> 00:14:13,670
Nafees uses
these conversations to make
272
00:14:13,670 --> 00:14:17,510
social network analysis models
of terrorist groups.
273
00:14:17,510 --> 00:14:20,610
He's not tracking
"likes" on Facebook.
274
00:14:20,610 --> 00:14:24,010
He's looking at terrorists'
real-life social networks...
275
00:14:24,020 --> 00:14:27,180
friends, family,
and acquaintances.
276
00:14:27,190 --> 00:14:30,320
He's talked to members
of ISIS, al-Qaeda,
277
00:14:30,320 --> 00:14:32,520
groups all over the world.
278
00:14:32,520 --> 00:14:35,290
Nafees' research
leads him to believe
279
00:14:35,290 --> 00:14:37,730
the reasons people join
a terror cell
280
00:14:37,730 --> 00:14:41,360
have little to do with how
religious their family is
281
00:14:41,370 --> 00:14:45,270
or how poor
their neighborhood is.
282
00:14:45,270 --> 00:14:46,800
There doesn't seem
to be a correlation
283
00:14:46,810 --> 00:14:49,740
between the ecology
of an environment
284
00:14:49,740 --> 00:14:52,440
and who radicalizes
and who doesn't radicalize.
285
00:14:52,440 --> 00:14:53,880
Instead, your best predictor
286
00:14:53,880 --> 00:14:56,110
of whether someone's
going to go join
287
00:14:56,110 --> 00:14:59,020
an Islamist group
is whether they have a friend
288
00:14:59,020 --> 00:15:01,080
who's already a member
of that group.
289
00:15:01,090 --> 00:15:03,920
Friendships basically create
an echo chamber
290
00:15:03,920 --> 00:15:08,760
that allows the radicalization
process to take shape.
291
00:15:08,760 --> 00:15:11,660
So, does knowing
how these networks form
292
00:15:11,660 --> 00:15:14,900
tell us how to break them apart?
293
00:15:14,900 --> 00:15:17,300
Paris, 2015.
294
00:15:19,740 --> 00:15:23,640
Brussels, 2016.
295
00:15:23,640 --> 00:15:25,580
After the attacks,
attention focused
296
00:15:25,580 --> 00:15:27,010
on an ISIS terror cell
297
00:15:27,010 --> 00:15:30,380
from the poor Brussels
suburb of Molenbeek
298
00:15:30,380 --> 00:15:34,380
and their radical leader
Abdelhamid Abaaoud.
299
00:15:37,090 --> 00:15:41,020
What's amazing about
the Paris-Brussels attacks
300
00:15:41,030 --> 00:15:43,490
is that a terror cell
carried out
301
00:15:43,500 --> 00:15:46,030
a major attack on European soil,
302
00:15:46,030 --> 00:15:48,630
then had the weight
of all of the police
303
00:15:48,630 --> 00:15:51,170
and intelligence agencies
of Europe
304
00:15:51,170 --> 00:15:52,900
brought down on it,
305
00:15:52,900 --> 00:15:54,970
and then was able
to successfully carry out
306
00:15:54,970 --> 00:15:58,680
a second attack within months.
307
00:15:58,680 --> 00:16:01,240
How were they able to do this?
308
00:16:01,250 --> 00:16:05,280
Nafees applies social network
analysis to the Molenbeek cell
309
00:16:05,280 --> 00:16:08,620
to see how the group functioned.
310
00:16:08,620 --> 00:16:11,590
Nafees measures
each man's importance
311
00:16:11,590 --> 00:16:13,720
to the group by the quantity
312
00:16:13,730 --> 00:16:16,360
and type of his connections
within it.
313
00:16:16,360 --> 00:16:19,730
How long had he known
the others, from where?
314
00:16:19,730 --> 00:16:21,800
Had they served
on the battlefield
315
00:16:21,800 --> 00:16:24,970
or done time together in prison?
316
00:16:24,970 --> 00:16:28,400
In order to carry off
an attack of this magnitude,
317
00:16:28,410 --> 00:16:30,840
you need deep friendships
with people.
318
00:16:30,840 --> 00:16:32,980
In the end, you're trusting
these people with your lives,
319
00:16:32,980 --> 00:16:34,680
and so if they're siblings,
320
00:16:34,680 --> 00:16:37,980
if they're childhood
friends, prison mates.
321
00:16:37,980 --> 00:16:41,180
And you really see
this in this network.
322
00:16:41,190 --> 00:16:43,650
Predictably, Abdelhamid Abaaoud
323
00:16:43,660 --> 00:16:46,420
has the most connections.
324
00:16:46,420 --> 00:16:48,930
But Abaaoud was killed
in a police shootout
325
00:16:48,930 --> 00:16:52,430
right after the Paris attacks.
326
00:16:52,430 --> 00:16:54,560
And instead of falling apart,
327
00:16:54,570 --> 00:16:57,830
the terror cell struck
again in Brussels.
328
00:16:57,840 --> 00:16:59,700
Nafees says
that network analysis
329
00:16:59,700 --> 00:17:02,510
doesn't just track
how many connections you have,
330
00:17:02,510 --> 00:17:06,180
it also tracks how you know
those connections.
331
00:17:06,180 --> 00:17:09,810
"Are you the only one
bridging one group to another?"
332
00:17:09,810 --> 00:17:13,380
"Do you only know people
who live close to you?"
333
00:17:13,390 --> 00:17:17,720
Using all of these metrics,
a seemingly minor player
334
00:17:17,720 --> 00:17:22,830
named Salah Abdeslam
rises to the surface.
335
00:17:22,830 --> 00:17:25,290
Salah was a guy
who never went to Syria,
336
00:17:25,300 --> 00:17:27,400
never recruited anyone himself,
337
00:17:27,400 --> 00:17:30,730
was supposed to
blow himself up in Paris,
338
00:17:30,740 --> 00:17:32,670
but chickened out
at the last minute.
339
00:17:32,670 --> 00:17:34,940
This was a guy who was
clearly less radicalized
340
00:17:34,940 --> 00:17:37,040
than Abdelhamid Abaaoud.
341
00:17:37,040 --> 00:17:40,240
Salah didn't know
as many people as Abaaoud,
342
00:17:40,250 --> 00:17:42,610
but he knew
more kinds of people.
343
00:17:42,610 --> 00:17:46,320
He was often the only bridge
between different groups.
344
00:17:46,320 --> 00:17:49,090
Salah abdeslam
was driving all over Europe,
345
00:17:49,090 --> 00:17:51,120
back and forth,
connecting people,
346
00:17:51,120 --> 00:17:52,620
moving money around.
347
00:17:52,620 --> 00:17:54,420
He was connecting players
348
00:17:54,430 --> 00:17:57,090
that wouldn't have been
connected to each other
349
00:17:57,100 --> 00:18:00,700
had he not been filling
that social network up.
350
00:18:00,700 --> 00:18:03,900
Facilitators
like Salah are often
351
00:18:03,900 --> 00:18:06,640
less radical, less visible.
352
00:18:06,640 --> 00:18:10,970
They stay off the radar
and help a terror group survive,
353
00:18:10,980 --> 00:18:14,080
even after the leader
is taken out.
354
00:18:14,080 --> 00:18:16,580
Generally, police,
intelligence agencies
355
00:18:16,580 --> 00:18:19,720
don't pay enough attention
to the facilitators.
356
00:18:19,720 --> 00:18:21,120
If you go after those people,
357
00:18:21,120 --> 00:18:24,220
you will do more in reducing
the efficacy of that network
358
00:18:24,220 --> 00:18:27,460
than if you go
after the central figures.
359
00:18:27,460 --> 00:18:30,390
"Focus on the facilitators."
360
00:18:30,400 --> 00:18:34,200
This is a good rule of thumb,
until it's not.
361
00:18:34,200 --> 00:18:36,570
As law enforcement
has grown more successful
362
00:18:36,570 --> 00:18:38,530
tracking terror cells,
363
00:18:38,540 --> 00:18:42,610
more attacks are being
carried out by solo actors.
364
00:18:42,610 --> 00:18:47,440
And these lone wolves
could be anywhere.
365
00:18:51,010 --> 00:18:53,310
The lone wolf...
366
00:18:53,310 --> 00:18:56,780
the killer
who's part of no cell,
367
00:18:56,780 --> 00:19:01,050
who arouses no suspicion
until he acts.
368
00:19:01,050 --> 00:19:05,450
In September 2014,
ISIS issued a global call
369
00:19:05,450 --> 00:19:08,290
for its followers
to attack the West.
370
00:19:08,290 --> 00:19:09,960
And since then,
371
00:19:09,960 --> 00:19:11,630
the pace of lone wolf attacks
372
00:19:11,630 --> 00:19:13,830
has been accelerating.
373
00:19:13,830 --> 00:19:17,230
The Orlando nightclub shooting,
the deadliest attack
374
00:19:17,230 --> 00:19:20,200
by a lone gunman
in American history,
375
00:19:20,200 --> 00:19:23,840
was part of that
horrifying trend.
376
00:19:23,840 --> 00:19:27,570
Can science trace
the origin of lone wolves
377
00:19:27,580 --> 00:19:31,850
and help us stop them?
378
00:19:31,850 --> 00:19:33,280
Social psychologist
379
00:19:33,280 --> 00:19:36,480
Sophia Moskalenko
researches radicalism
380
00:19:36,490 --> 00:19:39,520
at a University
of Maryland think tank.
381
00:19:39,520 --> 00:19:42,220
She grew up immersed
in the subject,
382
00:19:42,220 --> 00:19:44,060
but not by choice.
383
00:19:44,060 --> 00:19:47,230
I was born
in the Republic of Ukraine,
384
00:19:47,230 --> 00:19:49,000
and when I was growing up,
385
00:19:49,000 --> 00:19:53,270
I was part of
the Soviet Union's effort
386
00:19:53,270 --> 00:19:56,470
to radicalize all
of its citizens.
387
00:19:56,470 --> 00:19:59,970
I often questioned that
and got into a lot of trouble
388
00:19:59,980 --> 00:20:02,640
at school for questioning it.
389
00:20:02,650 --> 00:20:06,150
Radicalism is still
the focus of her life.
390
00:20:06,150 --> 00:20:10,220
Now she studies the path
people take to terrorism.
391
00:20:10,220 --> 00:20:12,990
More frequently,
people become radicalized
392
00:20:12,990 --> 00:20:16,360
through groups or through people
that they already know
393
00:20:16,360 --> 00:20:19,190
who belong to terrorist
or radical groups.
394
00:20:19,200 --> 00:20:21,330
On rare occasions, however,
395
00:20:21,330 --> 00:20:23,760
people radicalize on their own.
396
00:20:23,770 --> 00:20:25,930
Typically, these lone wolves,
397
00:20:25,940 --> 00:20:28,600
like Ted Kaczynski,
the Unabomber,
398
00:20:28,600 --> 00:20:32,710
or Omar Mateen,
the Orlando mass shooter,
399
00:20:32,710 --> 00:20:34,980
are disturbed misfits.
400
00:20:34,980 --> 00:20:37,480
And people like that can turn
to terrorism
401
00:20:37,480 --> 00:20:40,410
to escape their demons.
402
00:20:40,420 --> 00:20:43,320
But in 2004,
UK investigators uncovered
403
00:20:43,320 --> 00:20:45,720
a plot that challenged
everyone's notion
404
00:20:45,720 --> 00:20:48,220
of what makes a terrorist.
405
00:20:48,220 --> 00:20:50,560
They were tailing a terror cell.
406
00:20:50,560 --> 00:20:52,990
All the members were known
to law enforcement,
407
00:20:53,000 --> 00:20:56,060
except one man
who was only discovered
408
00:20:56,070 --> 00:20:59,600
when another member picked him
up from the airport.
409
00:20:59,600 --> 00:21:02,400
They saw someone
they didn't yet know
410
00:21:02,410 --> 00:21:06,670
who, in fact, turned out
to be Momin Khawaja.
411
00:21:06,680 --> 00:21:08,680
24-year-old Momin Khawaja
412
00:21:08,680 --> 00:21:11,210
lived in Ottawa, Canada.
413
00:21:11,210 --> 00:21:14,920
He was a computer programmer
with no criminal record.
414
00:21:14,920 --> 00:21:18,520
He had become radicalized
in total isolation,
415
00:21:18,520 --> 00:21:20,650
just like a lone wolf.
416
00:21:20,660 --> 00:21:23,290
Police arrested Khawaja
and the others
417
00:21:23,290 --> 00:21:25,560
before they had time to act.
418
00:21:25,560 --> 00:21:29,360
The cell was plotting
to make and detonate
419
00:21:29,370 --> 00:21:32,930
a number of fertilizer bombs
in and around London,
420
00:21:32,940 --> 00:21:37,840
and Khawaja's role in
the plot was to design
421
00:21:37,840 --> 00:21:41,140
and make the detonators.
422
00:21:41,140 --> 00:21:43,780
Captured terrorists
rarely take you back
423
00:21:43,780 --> 00:21:46,210
to the turning points
of their youth,
424
00:21:46,220 --> 00:21:48,780
but Khawaja was different.
425
00:21:48,780 --> 00:21:53,590
In blogs and e-mails,
he chronicled his journey...
426
00:21:53,590 --> 00:21:56,890
growing up in the suburbs
of Ottawa, Canada,
427
00:21:56,890 --> 00:21:59,990
in a house much like this.
428
00:22:00,000 --> 00:22:03,500
Khawaja was unique in that
429
00:22:03,500 --> 00:22:05,930
he liked to write down
his thoughts,
430
00:22:05,940 --> 00:22:09,640
his feelings, his intentions.
431
00:22:09,640 --> 00:22:12,870
For Sophia,
finding these written records
432
00:22:12,880 --> 00:22:16,380
was like stumbling upon
a terrorist's private journal.
433
00:22:18,780 --> 00:22:21,110
I once was a normal kid, too.
434
00:22:21,120 --> 00:22:24,520
I played basketball,
went swimming, bike riding,
435
00:22:24,520 --> 00:22:29,020
and did all the naughty,
little things kids do.
436
00:22:29,020 --> 00:22:30,560
For us, as researchers,
437
00:22:30,560 --> 00:22:34,460
this was a very rare opportunity
to look into the mind
438
00:22:34,460 --> 00:22:40,400
of a terrorist as radicalization
was unfolding in real time.
439
00:22:40,400 --> 00:22:43,240
How did he turn to terror?
440
00:22:43,240 --> 00:22:46,770
He and his family
had no contact with terrorists.
441
00:22:46,780 --> 00:22:50,680
He lived a typical teenage life
of friends and schoolwork.
442
00:22:50,680 --> 00:22:53,450
In his writing,
he expressed one motive
443
00:22:53,450 --> 00:22:58,020
for his radicalization...
empathy.
444
00:22:58,020 --> 00:22:59,450
He identified deeply
445
00:22:59,460 --> 00:23:02,520
with the suffering
of fellow Muslims.
446
00:23:02,530 --> 00:23:05,390
Khawaja
was emotionally affected.
447
00:23:05,390 --> 00:23:08,430
He watched fundamentalist
Islamic videos
448
00:23:08,430 --> 00:23:10,360
that depicted injustices
449
00:23:10,370 --> 00:23:13,070
perpetrated by the West
on Muslims
450
00:23:13,070 --> 00:23:17,940
or revenge that Muslims
take on Westerners.
451
00:23:17,940 --> 00:23:21,610
Then, right after
I got out of college,
452
00:23:21,610 --> 00:23:24,310
the invasion of
Afghanistan happened.
453
00:23:24,310 --> 00:23:27,980
I felt that something
was wrong... terribly wrong.
454
00:23:27,980 --> 00:23:30,250
He was driven by ideology
455
00:23:30,250 --> 00:23:33,620
and felt compelled
to pick a side.
456
00:23:33,620 --> 00:23:37,220
He picked the side of terror.
457
00:23:37,230 --> 00:23:40,460
He was
a very sensitive individual
458
00:23:40,460 --> 00:23:43,630
who couldn't just stand idly
459
00:23:43,630 --> 00:23:46,970
while someone else
was suffering.
460
00:23:46,970 --> 00:23:49,870
Finally, fully radicalized,
461
00:23:49,870 --> 00:23:51,770
Khawaja boarded a plane
462
00:23:51,770 --> 00:23:54,980
to a terror training
camp in Pakistan.
463
00:23:54,980 --> 00:23:58,980
That's where he first met
the other members of his cell.
464
00:23:58,980 --> 00:24:01,680
Khawaja combined many qualities
465
00:24:01,680 --> 00:24:05,220
that can be prized
in any society.
466
00:24:05,220 --> 00:24:08,390
He was a self-starter,
he was very smart,
467
00:24:08,390 --> 00:24:10,660
and under different
circumstances,
468
00:24:10,660 --> 00:24:13,190
if he became a doctor,
which, you know,
469
00:24:13,200 --> 00:24:16,400
he contemplated at one point,
he would have utilized
470
00:24:16,400 --> 00:24:18,970
all of those talents
and propensities
471
00:24:18,970 --> 00:24:20,870
in a very different way.
472
00:24:20,870 --> 00:24:23,300
But he was instead
put on this path
473
00:24:23,310 --> 00:24:25,510
that led him to terrorism.
474
00:24:30,480 --> 00:24:34,210
The goals of men
like Momin Khawaja
475
00:24:34,220 --> 00:24:38,490
are a despicable
perversion of empathy.
476
00:24:38,490 --> 00:24:41,050
Psychologists
say most young terrorists
477
00:24:41,060 --> 00:24:44,660
share the same traits
as young people everywhere.
478
00:24:44,660 --> 00:24:48,760
They desperately want
to belong to something.
479
00:24:48,760 --> 00:24:50,160
But how can they decide
480
00:24:50,170 --> 00:24:53,600
to kill themselves
for their cause?
481
00:24:55,770 --> 00:24:57,940
It may be easier than you think.
482
00:25:00,940 --> 00:25:03,040
Suicide attacks,
483
00:25:03,050 --> 00:25:06,780
the most insidious
aspect of terrorism...
484
00:25:06,780 --> 00:25:09,380
killers who walk calmly
among their victims
485
00:25:09,380 --> 00:25:11,790
before they strike.
486
00:25:11,790 --> 00:25:15,820
The phenomenon has haunted us
for over a century.
487
00:25:15,820 --> 00:25:18,190
The first we know of was
a Russian revolutionary
488
00:25:18,190 --> 00:25:20,930
who assassinated a czar.
489
00:25:20,930 --> 00:25:22,300
Since the 1980s,
490
00:25:22,300 --> 00:25:25,300
the number of suicide attacks
has risen dramatically,
491
00:25:25,300 --> 00:25:30,000
taking the lives
of almost 50,000 people.
492
00:25:30,010 --> 00:25:32,740
How can this happen so often?
493
00:25:32,740 --> 00:25:36,080
What can make someone
want to end their life
494
00:25:36,080 --> 00:25:38,280
in an act of mass murder?
495
00:25:43,750 --> 00:25:46,750
It's a question
psychologist Arie Kruglanski
496
00:25:46,760 --> 00:25:49,060
has long grappled with.
497
00:25:49,060 --> 00:25:51,220
The early assumption on the part
498
00:25:51,230 --> 00:25:53,960
of social scientists
was that it reflects
499
00:25:53,960 --> 00:25:57,330
a kind of psychopathology...
that is, people are basically
500
00:25:57,330 --> 00:25:59,730
mentally disturbed and abnormal.
501
00:25:59,740 --> 00:26:03,740
That was in the '70s
and early '80s.
502
00:26:03,740 --> 00:26:05,840
But research has never shown
503
00:26:05,840 --> 00:26:08,040
that terrorists have
any more mental problems
504
00:26:08,040 --> 00:26:10,780
than the general population.
505
00:26:10,780 --> 00:26:12,750
So how do they decide
to kill themselves
506
00:26:12,750 --> 00:26:15,020
in order to murder many?
507
00:26:15,020 --> 00:26:19,120
Surely, most people
would not make this choice.
508
00:26:19,120 --> 00:26:20,290
Or could they?
509
00:26:20,290 --> 00:26:23,660
Let's go, Big Red!
510
00:26:23,660 --> 00:26:26,460
Could these
cheerleaders become martyrs?
511
00:26:26,460 --> 00:26:27,660
Let's go!
512
00:26:27,660 --> 00:26:29,900
It's an absurd question,
513
00:26:29,900 --> 00:26:34,400
but Arie believes anyone can,
given the right push.
514
00:26:34,400 --> 00:26:36,170
In the particular experiment,
515
00:26:36,170 --> 00:26:40,970
we try to emulate the process
of group identification...
516
00:26:40,980 --> 00:26:44,780
what are you ready to commit
on behalf of the group?
517
00:26:44,780 --> 00:26:47,950
Arie divides
his subjects into two groups,
518
00:26:47,950 --> 00:26:51,120
then he asks each group
to read a story
519
00:26:51,120 --> 00:26:54,350
and to circle
the pronouns as they go.
520
00:26:54,360 --> 00:26:55,890
But Arie has given each group
521
00:26:55,890 --> 00:26:59,260
slightly different versions
of the story.
522
00:26:59,260 --> 00:27:00,660
One group has a story
523
00:27:00,660 --> 00:27:03,560
which only has
singular pronouns...
524
00:27:03,570 --> 00:27:05,730
"I," "me," and "my."
525
00:27:05,730 --> 00:27:07,330
"I go to the city often."
526
00:27:07,340 --> 00:27:08,970
My anticipation fills me
527
00:27:08,970 --> 00:27:11,770
"as I see the skyscrapers
come into view."
528
00:27:11,770 --> 00:27:14,370
The other group
has the same story,
529
00:27:14,380 --> 00:27:18,780
but with plural pronouns...
"we," "us," and "ours."
530
00:27:18,780 --> 00:27:20,780
"We go to the city often."
531
00:27:20,780 --> 00:27:22,720
Our anticipation fills us
532
00:27:22,720 --> 00:27:25,790
"as we see the skyscrapers
come into view."
533
00:27:25,790 --> 00:27:28,190
Arie's priming his subjects,
534
00:27:28,190 --> 00:27:31,690
activating their subconscious
to focus on belonging
535
00:27:31,690 --> 00:27:34,490
or not belonging to a group.
536
00:27:34,500 --> 00:27:38,130
It has been demonstrated
that once you prime these
537
00:27:38,130 --> 00:27:39,930
plural pronouns,
538
00:27:39,940 --> 00:27:43,500
a person gets into a mindset
of group identification.
539
00:27:44,840 --> 00:27:46,470
After this priming,
540
00:27:46,480 --> 00:27:49,880
it's time to test exactly
how much the "we"
541
00:27:49,880 --> 00:27:52,010
and the "I"
groups would sacrifice
542
00:27:52,010 --> 00:27:55,080
for members of their own group.
543
00:27:55,080 --> 00:27:57,780
He asks the cheerleaders
to participate
544
00:27:57,790 --> 00:27:59,950
in a classic
psychological scenario
545
00:27:59,960 --> 00:28:02,690
called the trolley problem.
546
00:28:02,690 --> 00:28:06,090
It portrays individuals
who are in danger
547
00:28:06,090 --> 00:28:08,330
of being run over by a trolley,
548
00:28:08,330 --> 00:28:10,130
and the dilemma is for a person
549
00:28:10,130 --> 00:28:13,100
who could save them
by sacrificing their life
550
00:28:13,100 --> 00:28:16,500
by throwing themselves
in front of the trolley.
551
00:28:16,510 --> 00:28:20,070
First, Arie tests
the subjects who were primed
552
00:28:20,080 --> 00:28:23,280
to think of themselves
as individuals.
553
00:28:23,280 --> 00:28:25,680
Let's go, Big Red!
554
00:28:30,120 --> 00:28:32,320
Only 30% say they would give
555
00:28:32,320 --> 00:28:34,850
their lives for their friends.
556
00:28:34,860 --> 00:28:37,620
Then Arie tests the "we" group.
557
00:28:37,630 --> 00:28:39,930
Let's go, Big Red!
558
00:28:41,800 --> 00:28:44,800
The difference is profound.
559
00:28:44,800 --> 00:28:49,370
Over 90% of people primed
with the "we," "ours,"
560
00:28:49,370 --> 00:28:53,970
"us" pronouns were willing
to self-sacrifice for the group.
561
00:28:57,310 --> 00:28:59,950
Arie's experiment
shows that group dynamics
562
00:28:59,950 --> 00:29:04,050
have incredible influence
on individual behavior.
563
00:29:04,050 --> 00:29:06,090
Even a subtle feeling
of belonging
564
00:29:06,090 --> 00:29:10,120
can dramatically change
what you're willing to do.
565
00:29:10,130 --> 00:29:13,690
And that doesn't mean that
everybody's equally susceptible
566
00:29:13,700 --> 00:29:16,930
to the influence
of violent ideologies.
567
00:29:16,930 --> 00:29:18,600
So there are
individual differences,
568
00:29:18,600 --> 00:29:19,830
but by and large,
569
00:29:19,840 --> 00:29:22,370
it's not a psychopathological
phenomenon.
570
00:29:22,370 --> 00:29:23,800
These people are not crazy,
571
00:29:23,810 --> 00:29:27,470
so it's a question of group
pressure, group influence.
572
00:29:27,480 --> 00:29:29,410
Arie says that group pressure
573
00:29:29,410 --> 00:29:32,450
and the human desire to belong
is the lever
574
00:29:32,450 --> 00:29:37,250
that allows terrorists
to give their lives for a cause.
575
00:29:37,250 --> 00:29:41,620
Under certain circumstances,
even the most normal person
576
00:29:41,620 --> 00:29:44,820
can become a violent extremist.
577
00:29:44,830 --> 00:29:46,760
If subtle cues can push someone
578
00:29:46,760 --> 00:29:50,100
towards violent self-sacrifice,
579
00:29:50,100 --> 00:29:54,800
the right push in the other
direction might stop it.
580
00:29:54,800 --> 00:29:56,940
What kind of push?
581
00:29:56,940 --> 00:30:00,540
The first step to changing
an extremist's mind
582
00:30:00,540 --> 00:30:03,010
might be to agree with him.
583
00:30:06,130 --> 00:30:10,330
We've looked inside
the mind of a terrorist.
584
00:30:10,340 --> 00:30:13,570
What we really want to know
is whether there is a way
585
00:30:13,570 --> 00:30:18,240
to get into that mind
and change it.
586
00:30:18,240 --> 00:30:19,810
You've heard the expression
587
00:30:19,810 --> 00:30:23,350
"winning the hearts
and minds of the enemy."
588
00:30:23,350 --> 00:30:27,480
With terrorists,
that seems like a tall order.
589
00:30:27,490 --> 00:30:30,090
But there might be a way,
590
00:30:30,090 --> 00:30:32,720
and it starts by making
a slight adjustment
591
00:30:32,730 --> 00:30:35,690
to another old phrase...
592
00:30:35,690 --> 00:30:39,360
"if you can't beat them,
agree with them."
593
00:30:42,130 --> 00:30:45,400
Psychology professor
Eran Halperin studies
594
00:30:45,400 --> 00:30:47,570
the science of changing minds.
595
00:30:47,570 --> 00:30:51,480
It's an uphill battle
in his country, Israel,
596
00:30:51,480 --> 00:30:53,280
home to the intractable struggle
597
00:30:53,280 --> 00:30:56,750
between Israelis
and Palestinians.
598
00:30:56,750 --> 00:30:59,050
One of the reasons most
traditional peace interventions
599
00:30:59,050 --> 00:31:02,420
do not work is that we have
two groups of people
600
00:31:02,420 --> 00:31:06,690
with opposing views trying
to reason with each other.
601
00:31:06,690 --> 00:31:10,090
Israelis and Palestinians
argue like humans everywhere...
602
00:31:10,100 --> 00:31:14,870
one side expresses an opinion,
the other counters it.
603
00:31:14,870 --> 00:31:17,940
When someone tells you something
that you disagree with,
604
00:31:17,940 --> 00:31:19,600
the most automatic reaction
605
00:31:19,610 --> 00:31:23,610
is to try to confront him
with a counter-message.
606
00:31:23,610 --> 00:31:25,580
But we've all been in arguments
607
00:31:25,580 --> 00:31:29,250
where reason gets you nowhere,
for deeply held beliefs,
608
00:31:29,250 --> 00:31:32,680
say, for gun control
or abortion.
609
00:31:32,690 --> 00:31:35,290
Science has shown
that counter-messages
610
00:31:35,290 --> 00:31:38,520
can actually be
counterproductive.
611
00:31:38,520 --> 00:31:40,020
Because, basically,
their beliefs,
612
00:31:40,030 --> 00:31:42,260
their attitudes are part
of their identity,
613
00:31:42,260 --> 00:31:43,960
and then anything
that contradicts
614
00:31:43,960 --> 00:31:45,900
what they believe
in sounds to them,
615
00:31:45,900 --> 00:31:48,030
or looks to them, like a threat.
616
00:31:48,030 --> 00:31:51,740
In psychology, we call
this state of mind "frozen."
617
00:31:51,740 --> 00:31:54,200
This kind of freezing
can happen around
618
00:31:54,210 --> 00:31:56,440
any tightly held belief,
619
00:31:56,440 --> 00:31:59,440
and it happens with terrorists.
620
00:31:59,450 --> 00:32:01,680
Their beliefs
are sacred to them,
621
00:32:01,680 --> 00:32:03,180
and they become frozen,
622
00:32:03,180 --> 00:32:07,020
resisting any argument,
no matter how rational.
623
00:32:08,650 --> 00:32:12,160
As a young man,
Eran looked around his homeland.
624
00:32:12,160 --> 00:32:15,360
He saw a conflict
seemingly without end
625
00:32:15,360 --> 00:32:18,400
and a sea of frozen minds.
626
00:32:18,400 --> 00:32:21,470
I was very seriously
injured in the Israeli army,
627
00:32:21,470 --> 00:32:24,700
and I decided
that this is my mission.
628
00:32:24,700 --> 00:32:28,040
We cannot just accept
it as the reality.
629
00:32:28,040 --> 00:32:30,170
This cannot be
the only situation
630
00:32:30,180 --> 00:32:32,310
in which we can live in.
631
00:32:32,310 --> 00:32:34,610
But how do you change minds
632
00:32:34,610 --> 00:32:37,350
without having an argument?
633
00:32:37,350 --> 00:32:40,320
Eran and his students
at Tel Aviv University
634
00:32:40,320 --> 00:32:42,390
had an idea.
635
00:32:42,390 --> 00:32:45,360
We are going to tell
people with extreme views,
636
00:32:45,360 --> 00:32:47,930
"you know, what?
You're right."
637
00:32:47,930 --> 00:32:51,530
Eran calls it
"paradoxical thinking."
638
00:32:51,530 --> 00:32:53,700
It's like mental jujitsu...
639
00:32:53,700 --> 00:32:58,070
you use people's own
opinions against them.
640
00:32:58,070 --> 00:33:01,840
Let's go back to a rivalry
we already know.
641
00:33:01,840 --> 00:33:03,310
Say you have a friend
642
00:33:03,310 --> 00:33:05,940
who is a rabid fan
of the Rattlers.
643
00:33:05,950 --> 00:33:08,580
You could tell him
how awesome the Eagles are,
644
00:33:08,580 --> 00:33:10,610
but this will just
freeze his love
645
00:33:10,620 --> 00:33:12,820
of the Rattlers in place.
646
00:33:12,820 --> 00:33:16,150
However, if you use the
paradoxical thinking technique,
647
00:33:16,160 --> 00:33:19,520
you tell him how awesome
the Rattlers are,
648
00:33:19,530 --> 00:33:22,330
you tell him
it's the best team ever,
649
00:33:22,330 --> 00:33:24,830
better than family,
better than love,
650
00:33:24,830 --> 00:33:26,430
better than anything.
651
00:33:26,430 --> 00:33:30,230
Overwhelmed,
your friend might reconsider
652
00:33:30,240 --> 00:33:33,840
his love of his team.
653
00:33:33,840 --> 00:33:36,240
Eran wanted to test
his theory on Israel's
654
00:33:36,240 --> 00:33:38,440
most pressing issue...
655
00:33:38,440 --> 00:33:42,150
the Israeli-Palestinian
conflict.
656
00:33:42,150 --> 00:33:44,680
So, he made some ads about it.
657
00:33:48,090 --> 00:33:51,560
It's meant to shock.
658
00:33:51,560 --> 00:33:53,360
The text reads...
659
00:34:01,270 --> 00:34:04,300
We took the core ideas
that Israelis believe in,
660
00:34:04,300 --> 00:34:07,440
and we practically
took them to the extreme.
661
00:34:07,440 --> 00:34:09,270
Israelis, by and large,
662
00:34:09,280 --> 00:34:12,580
believe their side
of the conflict is just.
663
00:34:12,580 --> 00:34:13,940
But the ads concluded
664
00:34:13,950 --> 00:34:17,250
that not only was
the conflict just,
665
00:34:17,250 --> 00:34:21,150
but Israelis needed
the conflict.
666
00:34:21,150 --> 00:34:23,050
For the sake of these ideas,
667
00:34:23,060 --> 00:34:24,660
to preserve these ideas,
668
00:34:24,660 --> 00:34:26,590
we have to preserve
the conflict.
669
00:34:26,590 --> 00:34:30,090
And this is absurd
in the eyes of most Israelis.
670
00:34:30,100 --> 00:34:32,560
Eran started
the experiment small,
671
00:34:32,560 --> 00:34:35,630
with a focus group
of conservative Israelis.
672
00:34:35,630 --> 00:34:38,370
While they were watching
these videos for the first time,
673
00:34:38,370 --> 00:34:41,040
Israelis got really,
really angry.
674
00:34:41,040 --> 00:34:42,410
But as the time went by,
675
00:34:42,410 --> 00:34:44,710
and as they saw
these video clips again
676
00:34:44,710 --> 00:34:46,210
and again and again,
677
00:34:46,210 --> 00:34:51,180
they started what we call
"a process of unfreezing."
678
00:34:51,180 --> 00:34:52,620
When bombarded by views
679
00:34:52,620 --> 00:34:54,650
even more extreme
than their own,
680
00:34:54,650 --> 00:34:58,760
the test subjects started to
question their own positions.
681
00:34:58,760 --> 00:35:00,120
Even those who had said
682
00:35:00,130 --> 00:35:02,960
they would never compromise
with Palestinians
683
00:35:02,960 --> 00:35:06,900
suddenly signaled
a willingness to talk.
684
00:35:06,900 --> 00:35:10,100
It seemed to work
with Eran's small sample.
685
00:35:10,100 --> 00:35:13,740
How would it work
in a real Israeli town?
686
00:35:13,740 --> 00:35:15,540
It is called Giv'at Shmuel,
687
00:35:15,540 --> 00:35:18,810
a city mainly dominated
by Israeli
688
00:35:18,810 --> 00:35:21,040
rightist and centrist people.
689
00:35:21,050 --> 00:35:23,880
And we tried to implement
this intervention
690
00:35:23,880 --> 00:35:27,650
on this entire city
of Giv'at Shmuel.
691
00:35:27,650 --> 00:35:29,890
Eran and the other researchers
692
00:35:29,890 --> 00:35:31,790
handed out fliers
on the streets,
693
00:35:31,790 --> 00:35:33,060
put up billboards,
694
00:35:33,060 --> 00:35:35,290
and targeted the video clips
to people
695
00:35:35,290 --> 00:35:38,430
in the neighborhood
who were online.
696
00:35:38,430 --> 00:35:40,360
What they discovered was
697
00:35:40,370 --> 00:35:44,370
a change beyond
their expectations.
698
00:35:44,370 --> 00:35:47,270
Again, in some cases
in the beginning,
699
00:35:47,270 --> 00:35:48,940
people got very angry,
700
00:35:48,940 --> 00:35:50,710
but then they discussed
these issues,
701
00:35:50,710 --> 00:35:52,040
when they talked about them,
702
00:35:52,040 --> 00:35:54,850
when they really exposed
themselves to these ideas,
703
00:35:54,850 --> 00:35:58,620
suddenly they started
reconsidering their positions.
704
00:35:58,620 --> 00:36:01,220
In fact,
the intervention worked best
705
00:36:01,220 --> 00:36:04,860
with people who had
the most hard-line views.
706
00:36:04,860 --> 00:36:07,760
One year later, Eran found
the shift in opinions
707
00:36:07,760 --> 00:36:11,190
in the test group
had held steady.
708
00:36:11,200 --> 00:36:14,060
We hope that by
exposing more and more people
709
00:36:14,070 --> 00:36:16,370
to this paradoxical
thinking intervention,
710
00:36:16,370 --> 00:36:20,040
we can help them consider
more seriously positive
711
00:36:20,040 --> 00:36:24,210
or peaceful solutions
to the conflict.
712
00:36:24,210 --> 00:36:26,310
But one scientist has another
713
00:36:26,310 --> 00:36:30,780
far more radical proposition
to reduce terrorism...
714
00:36:30,780 --> 00:36:33,320
complete disengagement.
715
00:36:40,590 --> 00:36:42,960
Our struggle against
terrorism feels like
716
00:36:42,960 --> 00:36:46,060
a war with no foreseeable end.
717
00:36:46,060 --> 00:36:49,400
We take out Osama bin Laden,
718
00:36:49,400 --> 00:36:52,270
and we find ourselves
fighting Isis.
719
00:36:52,270 --> 00:36:56,540
And a military victory
over Isis in Iraq or Syria
720
00:36:56,540 --> 00:37:00,510
won't end their attacks
on civilians around the world.
721
00:37:00,510 --> 00:37:02,810
For every terrorist
we eliminate,
722
00:37:02,820 --> 00:37:06,250
many more seem
to take their place.
723
00:37:06,250 --> 00:37:08,490
What can we do?
724
00:37:08,490 --> 00:37:12,690
Is there any way to stop
terrorism once and for all?
725
00:37:15,690 --> 00:37:18,160
Evolutionary
anthropologist Peter Turchin
726
00:37:18,160 --> 00:37:21,900
has combed through the library
of our collective history
727
00:37:21,900 --> 00:37:25,140
to study the rise
and fall of nations.
728
00:37:25,140 --> 00:37:26,640
He rose to prominence
729
00:37:26,640 --> 00:37:30,270
by making one
stunning prediction.
730
00:37:30,280 --> 00:37:31,810
My book "war and peace and war"
731
00:37:31,810 --> 00:37:34,040
which I published in 2005,
732
00:37:34,050 --> 00:37:36,180
two years
after the occupation of Iraq
733
00:37:36,180 --> 00:37:40,050
by the U.S. and allies, I wrote
the following prediction...
734
00:37:40,050 --> 00:37:41,950
"the western intrusion"
735
00:37:41,950 --> 00:37:44,590
will eventually generate
a counter-response,
736
00:37:44,590 --> 00:37:49,060
"possibly in the form
of a new theocratic caliphate."
737
00:37:51,700 --> 00:37:53,830
The U.S. went to Iraq in part
738
00:37:53,830 --> 00:37:56,200
to promote nation building.
739
00:37:56,200 --> 00:37:57,600
According to Peter,
740
00:37:57,600 --> 00:38:01,510
that effort was
a stunning success.
741
00:38:01,510 --> 00:38:05,180
The name of that success
is "Isis,"
742
00:38:05,180 --> 00:38:09,510
an Islamic caliphate declared
in 2013,
743
00:38:09,520 --> 00:38:12,150
eight years after
Peter's prediction.
744
00:38:14,850 --> 00:38:19,090
What had Peter seen
that others had missed?
745
00:38:19,090 --> 00:38:22,290
Peter is not your
typical anthropologist.
746
00:38:22,290 --> 00:38:25,130
He takes a dim view
of most so-called
747
00:38:25,130 --> 00:38:27,060
"lessons" of history.
748
00:38:27,070 --> 00:38:29,500
One German historian counted
749
00:38:29,500 --> 00:38:32,100
how many explanations
people have proposed
750
00:38:32,100 --> 00:38:34,340
for the fall
of the Roman Empire,
751
00:38:34,340 --> 00:38:37,570
and he counted over 220.
752
00:38:37,580 --> 00:38:39,540
Peter wanted to find a way
753
00:38:39,550 --> 00:38:43,210
to put historical hypotheses
to a scientific test.
754
00:38:43,220 --> 00:38:45,920
He developed a new
mathematical approach
755
00:38:45,920 --> 00:38:49,520
to history called cliodynamics.
756
00:38:49,520 --> 00:38:52,090
I see it as a slayer
of theories.
757
00:38:52,090 --> 00:38:54,690
By the time you're done,
I want to have whole cemeteries
758
00:38:54,690 --> 00:38:57,690
of dead theories out there.
759
00:38:57,700 --> 00:38:59,200
In the early 2000s,
760
00:38:59,200 --> 00:39:01,260
he started to build
mathematical models
761
00:39:01,270 --> 00:39:03,400
to do just that.
762
00:39:03,400 --> 00:39:05,100
One theory he wanted to explore
763
00:39:05,100 --> 00:39:08,770
was what causes the rise
of strong nation-states.
764
00:39:08,770 --> 00:39:11,880
He had hypothesized
that strong nations form
765
00:39:11,880 --> 00:39:15,550
when two dramatically
different cultures go to war.
766
00:39:15,550 --> 00:39:18,050
He thought of these wars
as a global version
767
00:39:18,050 --> 00:39:22,050
of Darwin's survival
of the fittest.
768
00:39:22,050 --> 00:39:24,250
It's a little like
a backgammon tournament
769
00:39:24,260 --> 00:39:28,160
where players stand in
for nation-states.
770
00:39:28,160 --> 00:39:30,990
The conditions
of intense warfare create
771
00:39:31,000 --> 00:39:35,100
a constant struggle for survival
amongst armed groups.
772
00:39:38,340 --> 00:39:42,040
The group that wins is the one
which is the most cohesive,
773
00:39:42,040 --> 00:39:43,410
the most functional,
774
00:39:43,410 --> 00:39:45,980
and oftentimes,
the nastiest one.
775
00:39:52,150 --> 00:39:55,720
Peter set a team
to work collecting data points,
776
00:39:55,720 --> 00:39:58,960
over 100,000 in all...
777
00:39:58,960 --> 00:40:00,690
birth and death rates,
778
00:40:00,690 --> 00:40:04,030
whether they had iron weapons,
agriculture,
779
00:40:04,030 --> 00:40:07,700
anything that might measure
a nation's strength or weakness.
780
00:40:10,270 --> 00:40:13,400
He observed that the more
the groups fought,
781
00:40:13,410 --> 00:40:16,140
the stronger the winning
group became.
782
00:40:16,140 --> 00:40:19,040
Eventually, he developed
a mathematical formula
783
00:40:19,040 --> 00:40:24,410
that predicted the rise
and fall of the Roman Empire,
784
00:40:24,420 --> 00:40:26,920
the rise of the Ottoman Empire,
785
00:40:26,920 --> 00:40:28,420
and the historical growth
786
00:40:28,420 --> 00:40:31,190
of Islamic caliphates
in the Middle East.
787
00:40:31,190 --> 00:40:32,960
It seemed to work in the past.
788
00:40:32,960 --> 00:40:35,030
How would it work in the future?
789
00:40:35,030 --> 00:40:37,430
That's when Peter
turned his attention
790
00:40:37,430 --> 00:40:40,300
to the U.S. invasion of Iraq.
791
00:40:42,470 --> 00:40:47,440
So, after the U.S.
overthrew Saddam Hussein,
792
00:40:47,440 --> 00:40:51,470
this created a large
swath of territory,
793
00:40:51,480 --> 00:40:54,540
which was essentially
stateless with multiple
794
00:40:54,550 --> 00:40:58,520
armed groups battling
against each other.
795
00:40:58,520 --> 00:41:00,620
Eventually, the strongest,
796
00:41:00,620 --> 00:41:04,750
most ruthless group survived...
797
00:41:04,760 --> 00:41:07,190
Isis.
798
00:41:07,190 --> 00:41:10,360
If data can predict
the rise of Isis,
799
00:41:10,360 --> 00:41:12,500
can it tell us
how the Islamic State
800
00:41:12,500 --> 00:41:14,000
could be defeated?
801
00:41:14,000 --> 00:41:16,430
So, how should we deal
with the Islamic State?
802
00:41:16,440 --> 00:41:19,970
We have three options...
one of them is stay the course,
803
00:41:19,970 --> 00:41:22,940
which essentially means
continue using air power.
804
00:41:25,010 --> 00:41:28,550
Peter believes this
will only perpetuate warfare,
805
00:41:28,550 --> 00:41:29,810
the evolutionary pressure
806
00:41:29,820 --> 00:41:32,880
that gave rise to Isis
in the first place.
807
00:41:32,880 --> 00:41:35,290
The second option
is to escalate.
808
00:41:35,290 --> 00:41:39,620
This means we must be
more ruthless than our enemy,
809
00:41:39,620 --> 00:41:43,760
but given that this would cause
widespread civilian deaths,
810
00:41:43,760 --> 00:41:46,730
Peter believes
it's not an option.
811
00:41:46,730 --> 00:41:50,200
The opposite extreme
is to do nothing.
812
00:41:50,200 --> 00:41:53,540
In the face of a brutal
organization like Isis,
813
00:41:53,540 --> 00:41:57,010
doing nothing seems shocking.
814
00:41:57,010 --> 00:41:59,040
But Peter is convinced that
815
00:41:59,040 --> 00:42:01,640
if we completely turn
our backs on the region,
816
00:42:01,650 --> 00:42:05,420
the war that created
Isis will diminish.
817
00:42:05,420 --> 00:42:08,320
They'll be left like
a backgammon champion
818
00:42:08,320 --> 00:42:10,720
with no one to play against.
819
00:42:10,720 --> 00:42:13,690
Peter knows this course
will be painful.
820
00:42:13,690 --> 00:42:15,890
Isis and the horrors
they perpetrate
821
00:42:15,890 --> 00:42:18,260
will not go away in an instant.
822
00:42:18,260 --> 00:42:21,400
We'll have to sit by
and let it happen.
823
00:42:21,400 --> 00:42:24,070
But Peter points to his data.
824
00:42:24,070 --> 00:42:28,470
This is the best option
in terms of saving the most lives.
825
00:42:28,470 --> 00:42:32,440
Essentially, it means
closing the board
826
00:42:32,440 --> 00:42:33,680
and going home.
827
00:42:36,050 --> 00:42:39,080
In this era of terrorism,
828
00:42:39,080 --> 00:42:42,150
we may not like our options.
829
00:42:42,150 --> 00:42:46,420
But disagreements and debate
are what make us free.
830
00:42:46,420 --> 00:42:50,890
Today, the tools of science
offer us new approaches.
831
00:42:50,900 --> 00:42:55,100
The war against terror
is a war of ideas.
832
00:42:55,100 --> 00:42:59,640
As terrorists seek
to impose their rigid ideas,
833
00:42:59,640 --> 00:43:04,610
our greatest weapon
is our openness to new ideas.