1
00:00:06,006 --> 00:00:09,551
♪ ♪
2
00:00:09,551 --> 00:00:15,891
♪ ♪
3
00:00:15,891 --> 00:00:18,518
[Producer] In the 50 years
since the experiment,
4
00:00:18,518 --> 00:00:22,856
no one had done research
as extensive as yours.
5
00:00:22,856 --> 00:00:24,233
[Thibault Le Texier] Yes.
6
00:00:26,777 --> 00:00:28,904
It started as a side project.
7
00:00:28,904 --> 00:00:31,448
I used to do
found footage films.
8
00:00:31,448 --> 00:00:34,576
♪ ♪
9
00:00:34,576 --> 00:00:38,247
And I discovered
the Stanford Prison Experiment.
10
00:00:38,247 --> 00:00:41,792
I'd heard about it,
but never really dug into it.
11
00:00:41,792 --> 00:00:45,295
I decided to fund
a trip to Stanford
12
00:00:45,295 --> 00:00:47,047
to go through the archive,
13
00:00:47,047 --> 00:00:52,052
because I wanted to be able
to film from the archive.
14
00:00:52,052 --> 00:00:55,055
I thought it was fascinating
to investigate this.
15
00:00:55,055 --> 00:00:58,058
The narrative is very striking.
16
00:00:58,058 --> 00:01:01,144
It's really a story
about good and evil,
17
00:01:01,144 --> 00:01:03,897
about these innocent
young students
18
00:01:03,897 --> 00:01:05,941
behaving like tough guards
19
00:01:05,941 --> 00:01:09,736
just because they were given
power in the situation.
20
00:01:09,736 --> 00:01:11,822
It plays on the inner fears
of people
21
00:01:11,822 --> 00:01:15,450
that there's a monster
inside of all of us.
22
00:01:15,450 --> 00:01:17,828
Maybe all men are
potential rapists.
23
00:01:17,828 --> 00:01:21,164
Maybe we are all
potential Nazis.
24
00:01:21,164 --> 00:01:23,584
[Man] The monster
isn't out there.
25
00:01:23,584 --> 00:01:25,085
It's in you.
26
00:01:25,085 --> 00:01:26,336
[Le Texier]
The Stanford Prison Experiment
27
00:01:26,336 --> 00:01:28,255
rapidly became famous.
28
00:01:28,255 --> 00:01:29,339
[Jon Stewart] The famed...
[Amy Goodman] Stanford...
29
00:01:29,339 --> 00:01:30,799
-Prison...
-Experiment.
30
00:01:30,799 --> 00:01:32,467
[Joe Rogan] Are you aware of
the Stanford Prison Experiment?
31
00:01:32,467 --> 00:01:35,137
[Melissa Harris-Perry]
Dr. Z, this is a high nerd
moment for me.
32
00:01:35,137 --> 00:01:37,556
I know all of your work
very well.
33
00:01:37,556 --> 00:01:41,685
[Philip Zimbardo] My legacy is
to make psychology appetizing
34
00:01:41,685 --> 00:01:43,103
to the general public.
35
00:01:43,103 --> 00:01:44,271
[Le Texier] You have a rock band
36
00:01:44,271 --> 00:01:46,690
called the Stanford
Prison Experiment.
37
00:01:46,690 --> 00:01:48,608
You have series
talking about it.
38
00:01:48,608 --> 00:01:49,609
[Actor] In your cell!
39
00:01:49,609 --> 00:01:51,194
[Le Texier] And you have
feature films,
40
00:01:51,194 --> 00:01:53,530
a German one, an American one.
41
00:01:53,530 --> 00:01:55,532
[Zimbardo] It's important
to get it right, but for me,
42
00:01:55,532 --> 00:01:57,701
it's equally important
to make it interesting.
43
00:01:57,701 --> 00:01:59,036
[Le Texier]
It was in many books,
44
00:01:59,036 --> 00:02:02,956
books about group violence,
World War II, obesity.
45
00:02:02,956 --> 00:02:05,959
It was told in colleges,
everywhere in the US.
46
00:02:05,959 --> 00:02:07,127
[Student] I'm Dr. Zimbardo.
47
00:02:07,127 --> 00:02:09,379
We're recreating
the Stanford Prison Experiment.
48
00:02:09,379 --> 00:02:10,547
Let's go, prisoner.
49
00:02:10,547 --> 00:02:11,965
[Student] I have six
participants involved
50
00:02:11,965 --> 00:02:14,801
in something I call
the Stanford Prison Experiment.
51
00:02:14,801 --> 00:02:15,927
[Eminem]
♪ Hi! My name is... ♪
52
00:02:15,927 --> 00:02:17,304
[Man] Zimbardo.
53
00:02:17,304 --> 00:02:19,473
[Le Texier] It has become
more of a pop culture item
54
00:02:19,473 --> 00:02:21,016
than a scientific item.
55
00:02:21,016 --> 00:02:23,143
[Zimbardo]
I want to reach the world.
56
00:02:23,143 --> 00:02:24,686
[Host] His Holiness,
the Dalai Lama
57
00:02:24,686 --> 00:02:26,021
and Professor Phil Zimbardo.
58
00:02:26,021 --> 00:02:27,564
[applause]
59
00:02:27,564 --> 00:02:30,192
[Le Texier] Zimbardo says that
the Stanford Prison Experiment
60
00:02:30,192 --> 00:02:34,321
explains how millions of people
could go along with mass murder.
61
00:02:34,321 --> 00:02:36,031
[Zimbardo] In Rwanda,
people killed
62
00:02:36,031 --> 00:02:38,575
800,000 of their neighbors.
63
00:02:38,575 --> 00:02:40,535
That's the power
of the situation.
64
00:02:40,535 --> 00:02:42,621
[Le Texier] He has
a very simple explanation
65
00:02:42,621 --> 00:02:44,831
for this very complex
world event.
66
00:02:44,831 --> 00:02:47,292
[Zimbardo] Maybe it was a bad
barrel that corrupted them.
67
00:02:47,292 --> 00:02:49,711
[Le Texier] You really feel like
you understand human nature
68
00:02:49,711 --> 00:02:53,131
after you've learned about
the Stanford Prison Experiment.
69
00:02:53,131 --> 00:02:57,594
Zimbardo became world-famous
from this and other topics.
70
00:02:57,594 --> 00:02:58,595
[Newscaster] Philip Zimbardo.
71
00:02:58,595 --> 00:02:59,596
[Newscaster] Dr. Zimbardo.
72
00:02:59,596 --> 00:03:01,098
-Philip Zimbardo.
-Phil Zimbardo.
73
00:03:01,098 --> 00:03:02,849
-Phil Zimbardo.
-Dr. Zimbardo.
74
00:03:02,849 --> 00:03:04,101
[multiple voices] Philip.
Philip. Philip. Philip.
75
00:03:04,101 --> 00:03:05,852
Philip. Philip. Philip. Philip.
Philip. Zimbardo.
76
00:03:05,852 --> 00:03:08,188
[Zimbardo] Yes. Philip Zimbardo.
77
00:03:08,188 --> 00:03:09,815
[Le Texier] But after going
through the archive,
78
00:03:09,815 --> 00:03:14,319
I started to realize that the
official narrative was a lie.
79
00:03:14,319 --> 00:03:15,862
[Zimbardo]
I hope that the message gets
carried to the public.
80
00:03:15,862 --> 00:03:17,364
[Newscaster] It just did.
81
00:03:32,212 --> 00:03:33,839
[Dave Eshleman] Well, you know,
the popular narrative
82
00:03:33,839 --> 00:03:35,590
for this thing,
as you probably know,
83
00:03:35,590 --> 00:03:37,634
is that you take
these normal people
84
00:03:37,634 --> 00:03:39,302
and you put them
in this evil environment
85
00:03:39,302 --> 00:03:41,096
and suddenly they become evil.
86
00:03:41,096 --> 00:03:44,015
But, you know, I just, I don't
think you can draw conclusions
87
00:03:44,015 --> 00:03:46,518
from it because of the way
that it was done.
88
00:03:46,518 --> 00:03:50,480
And the effect that he had in
tipping the scale, so to speak.
89
00:03:50,480 --> 00:03:53,567
[Doug Korpi] Zimbardo had a
perspective he wanted to advance
90
00:03:53,567 --> 00:03:56,736
when he did the experiment, that
the guards did do evil things
91
00:03:56,736 --> 00:03:59,322
and the prisoners did
become more passive
92
00:03:59,322 --> 00:04:04,244
than they might otherwise,
but it's not the whole story.
93
00:04:04,244 --> 00:04:05,787
[John Mark] If you want
to have an experiment
94
00:04:05,787 --> 00:04:09,833
where everyone's the same,
you should use fruit flies.
95
00:04:09,833 --> 00:04:11,585
[Eshleman] We all behave
because we have reasons
96
00:04:11,585 --> 00:04:13,086
to behave the way that we do.
97
00:04:13,086 --> 00:04:14,337
It's very good theater,
98
00:04:14,337 --> 00:04:17,507
but I don't think it qualifies
as good science.
99
00:04:19,509 --> 00:04:23,138
♪ ♪
100
00:04:23,138 --> 00:04:25,140
[Le Texier] At Stanford,
in the archive,
101
00:04:25,140 --> 00:04:28,059
you have pretty much everything
that was recorded
102
00:04:28,059 --> 00:04:29,394
during the experiment.
103
00:04:29,394 --> 00:04:32,147
You had the forms that
the prisoners and the guards
104
00:04:32,147 --> 00:04:36,109
had to fill out, reports that the
guards had to write every day.
105
00:04:36,109 --> 00:04:38,528
A lot of press.
106
00:04:38,528 --> 00:04:41,323
There were about
six hours of film
107
00:04:41,323 --> 00:04:44,618
and 15 hours of audio material.
108
00:04:44,618 --> 00:04:47,454
For the next two weeks, I sat
every morning at the library
109
00:04:47,454 --> 00:04:49,581
and went through these boxes.
110
00:04:49,581 --> 00:04:52,751
I also went through
all the articles and books
111
00:04:52,751 --> 00:04:55,796
written by Zimbardo.
112
00:04:55,796 --> 00:05:01,009
I discovered in the archive
elements that I found shocking.
113
00:05:01,009 --> 00:05:02,135
[Producer]
Can you walk me through
114
00:05:02,135 --> 00:05:04,930
what happened on day one?
115
00:05:04,930 --> 00:05:09,684
[Le Texier] It depends
on what you call day one.
116
00:05:09,684 --> 00:05:11,019
I found in the archive
117
00:05:11,019 --> 00:05:14,773
that there was a whole
training day for the guards.
118
00:05:14,773 --> 00:05:18,401
And Zimbardo talked about this
training day in his articles
119
00:05:18,401 --> 00:05:19,653
and in his books,
120
00:05:19,653 --> 00:05:22,364
but he's saying that
it's just to restrain them
121
00:05:22,364 --> 00:05:24,449
from using violence,
for instance.
122
00:05:24,449 --> 00:05:26,535
[Lester Holt] Were they given
guidelines at all?
123
00:05:26,535 --> 00:05:28,286
[Zimbardo] They were given
very simple guidelines.
124
00:05:28,286 --> 00:05:30,497
If the prisoners escape,
the study is over,
125
00:05:30,497 --> 00:05:33,458
maintain law and order,
but no physical abuse.
126
00:05:33,458 --> 00:05:36,044
Essentially they were given no
training in how to be a guard.
127
00:05:36,044 --> 00:05:37,546
[Le Texier]
Actually, in the archives,
128
00:05:37,546 --> 00:05:41,007
you can see that it was
a real training session.
129
00:05:41,007 --> 00:05:42,551
[Zimbardo] We got these clubs.
130
00:05:42,551 --> 00:05:43,802
They'll have no privacy.
131
00:05:43,802 --> 00:05:46,429
We have total power
in the situation.
132
00:05:46,429 --> 00:05:48,723
It's totally controlled by us.
133
00:05:48,723 --> 00:05:51,476
[repeated]
Totally controlled by us.
134
00:05:53,520 --> 00:05:55,063
[Le Texier] And the discovery
of Kent Cotter
135
00:05:55,063 --> 00:05:57,566
was another red flag.
136
00:05:57,566 --> 00:05:59,442
[Kent Cotter] I'm the guy
that you never heard about
137
00:05:59,442 --> 00:06:01,278
that you should be
hearing about,
138
00:06:01,278 --> 00:06:04,030
because I am the guy that quit.
139
00:06:04,030 --> 00:06:06,658
[Producer] Why is it important
to know about you?
140
00:06:06,658 --> 00:06:08,535
[Cotter] Well, not everybody
participated.
141
00:06:08,535 --> 00:06:10,745
I would think there would be...
142
00:06:10,745 --> 00:06:14,499
that should have come up
at some point.
143
00:06:14,499 --> 00:06:19,754
♪ ♪
144
00:06:19,754 --> 00:06:21,923
I was born in the Midwest
145
00:06:21,923 --> 00:06:25,176
and came to California
as soon as I could,
146
00:06:25,176 --> 00:06:27,137
because California
was the place to be.
147
00:06:27,137 --> 00:06:29,264
Everybody's doing drugs
and smoking pot.
148
00:06:29,264 --> 00:06:32,225
And so I came out
to move to Palo Alto.
149
00:06:32,225 --> 00:06:34,728
I did odd jobs, handyman stuff.
150
00:06:34,728 --> 00:06:38,023
And I was doing
experiments at Stanford.
151
00:06:38,023 --> 00:06:39,441
In fact, I don't really
remember any of them.
152
00:06:39,441 --> 00:06:40,984
I just remember
making the money,
153
00:06:40,984 --> 00:06:44,738
and that was a good thing, other
than the, the prison experiment.
154
00:06:47,407 --> 00:06:51,703
I signed up for this experiment
and walked into this room
155
00:06:51,703 --> 00:06:55,040
and found out that I'd
been chosen as a guard.
156
00:06:55,040 --> 00:06:56,791
They had all the guards
come to this meeting
157
00:06:56,791 --> 00:06:59,210
to explain what they were
doing in the experiment.
158
00:07:14,851 --> 00:07:15,977
[Cotter]
My feeling about Zimbardo
159
00:07:15,977 --> 00:07:18,521
was distrust and suspicion.
160
00:07:18,521 --> 00:07:19,939
There was something
about the way
161
00:07:19,939 --> 00:07:21,441
that he conducted himself
in the meeting.
162
00:07:32,369 --> 00:07:33,620
[Cotter] To me, it seemed
like the whole thing,
163
00:07:33,620 --> 00:07:35,538
we were being pushed
in a direction.
164
00:07:35,538 --> 00:07:38,833
That that's why we were there,
to do what they expected.
165
00:07:38,833 --> 00:07:40,126
It sort of escalated, you know,
166
00:07:40,126 --> 00:07:42,045
it was like guys are
really getting into it.
167
00:07:42,045 --> 00:07:44,130
You could do this and, you know,
you can control them like that,
168
00:07:44,130 --> 00:07:45,590
and it was hard for me
to listen to it,
169
00:07:45,590 --> 00:07:50,512
but also I felt more and more
isolated from that group.
170
00:07:50,512 --> 00:07:52,722
I just decided I wasn't
gonna participate in it.
171
00:07:52,722 --> 00:07:55,850
So, I started thinking right
then, how do I bail out of this?
172
00:07:55,850 --> 00:07:58,103
Right after the meeting,
I went into his office
173
00:07:58,103 --> 00:08:01,648
and I told him I just couldn't
see myself doing this.
174
00:08:01,648 --> 00:08:04,943
I had a real problem with
what the guards were saying.
175
00:08:04,943 --> 00:08:07,779
His initial reaction to me was,
"Well, why don't you stay, then?
176
00:08:07,779 --> 00:08:09,906
Why don't you stay
and redirect this?"
177
00:08:09,906 --> 00:08:14,244
And I said, "I don't think
it can be redirected."
178
00:08:14,244 --> 00:08:18,581
I felt like this was set up
for the guards to abuse.
179
00:08:18,581 --> 00:08:21,793
So how could it go
any other way?
180
00:08:21,793 --> 00:08:23,795
[Le Texier] Zimbardo wanted
people to believe
181
00:08:23,795 --> 00:08:26,840
that the participants were
spontaneously responding
182
00:08:26,840 --> 00:08:30,051
to the situation,
whereas the case of Kent Cotter
183
00:08:30,051 --> 00:08:33,346
showed that even before
starting the experiment,
184
00:08:33,346 --> 00:08:35,682
the participant could
really easily guess
185
00:08:35,682 --> 00:08:38,518
what Zimbardo wanted to prove.
186
00:08:38,518 --> 00:08:41,896
What's also a red flag is
that Zimbardo always claimed
187
00:08:41,896 --> 00:08:44,774
that the guards made up
their own rules.
188
00:08:44,774 --> 00:08:46,776
[Zimbardo] First thing the
guards said is, "We need rules."
189
00:08:46,776 --> 00:08:48,194
And so they set about
190
00:08:48,194 --> 00:08:50,196
to make up the systematic
number of rules.
191
00:08:50,196 --> 00:08:51,197
[Interviewer]
The guards made the rules?
192
00:08:51,197 --> 00:08:52,574
[Zimbardo] Oh, oh, absolutely.
193
00:08:52,574 --> 00:08:54,617
[Interviewer] Not you?
[Zimbardo] Oh, no, not at all.
194
00:08:54,617 --> 00:08:57,954
[Le Texier] But in the archives,
I discovered a lot of evidence
195
00:08:57,954 --> 00:09:01,291
showing that the guards
were given the rules
196
00:09:01,291 --> 00:09:02,709
that they were to enforce.
197
00:09:02,709 --> 00:09:06,796
They knew what punishments
they could impose on prisoners.
198
00:09:10,008 --> 00:09:13,636
And in their own reports, they
wrote at the end of each day,
199
00:09:13,636 --> 00:09:16,181
schedule followed
pretty closely.
200
00:09:16,181 --> 00:09:17,515
Punishment.
201
00:09:19,476 --> 00:09:21,144
They were following a script.
202
00:09:21,144 --> 00:09:24,522
They were not reacting
spontaneously to the situation.
203
00:09:24,522 --> 00:09:25,690
[Zimbardo]
Oh, this is all spontaneous.
204
00:09:25,690 --> 00:09:27,609
The only thing we told
the guards were,
205
00:09:27,609 --> 00:09:29,235
"We want to maintain
a prison here for a week,
206
00:09:29,235 --> 00:09:31,821
do whatever you have
to maintain law and order."
207
00:09:34,240 --> 00:09:36,076
[Eshleman]
After all of these instructions,
208
00:09:36,076 --> 00:09:38,078
I felt that we were all
on the same page
209
00:09:38,078 --> 00:09:40,163
with what we wanted to see
come out of this,
210
00:09:40,163 --> 00:09:44,667
and that was proof that prisons
are an evil environment.
211
00:09:44,667 --> 00:09:47,921
Given the times and given
the fact we were students
212
00:09:47,921 --> 00:09:51,758
and very anti-establishment,
we would have done anything
213
00:09:51,758 --> 00:09:55,720
to prove that this prison system
was an evil institution.
214
00:09:55,720 --> 00:09:58,765
We were happy to play that role.
215
00:10:01,476 --> 00:10:07,440
[guitar playing]
216
00:10:07,440 --> 00:10:09,484
[Man] Test one, two.
217
00:10:09,484 --> 00:10:11,027
[Eshleman in British accent]
Uh, okay.
218
00:10:11,027 --> 00:10:14,072
When you do your mic cable, if
you bring it down around here,
219
00:10:14,072 --> 00:10:15,365
come around to the left.
220
00:10:15,365 --> 00:10:17,408
Can you do that?
221
00:10:17,408 --> 00:10:19,327
My stage name is Nigel.
222
00:10:19,327 --> 00:10:22,038
This is Nigel and Clive
and the British Invasion.
223
00:10:22,038 --> 00:10:27,377
♪ ♪
224
00:10:27,377 --> 00:10:28,753
Come to my dressing room.
225
00:10:28,753 --> 00:10:31,297
And here I must leave you.
226
00:10:31,297 --> 00:10:37,846
♪ ♪
227
00:10:37,846 --> 00:10:39,722
Really, really nice there.
228
00:10:39,722 --> 00:10:41,099
The hair is growing out a bit,
229
00:10:41,099 --> 00:10:43,351
covers the bald spot,
doesn't it?
230
00:10:43,351 --> 00:10:46,855
Which ones do you prefer?
Do you like these here?
231
00:10:46,855 --> 00:10:48,398
Or these?
232
00:10:51,067 --> 00:10:53,611
These are a bit smaller,
aren't they?
233
00:10:53,611 --> 00:10:57,699
♪ ♪
234
00:10:57,699 --> 00:10:58,867
You're a star.
235
00:10:58,867 --> 00:11:00,577
Go out and knock 'em dead.
236
00:11:00,577 --> 00:11:02,829
Let's do some music.
237
00:11:05,582 --> 00:11:07,250
A one, two, three.
238
00:11:07,250 --> 00:11:10,086
♪ ♪
239
00:11:10,086 --> 00:11:12,422
In high school,
I got very much into theater.
240
00:11:12,422 --> 00:11:15,091
I was in every play,
every musical, the choir.
241
00:11:15,091 --> 00:11:17,844
And then I decided
I wanted to be an actor.
242
00:11:17,844 --> 00:11:22,265
My acting teacher always said,
"Do not break character,"
243
00:11:22,265 --> 00:11:24,017
and that is the goal.
244
00:11:24,017 --> 00:11:29,606
♪ ♪
245
00:11:29,606 --> 00:11:35,737
♪ ♪
246
00:11:35,737 --> 00:11:38,448
Are you wearing
your hearing aid?
247
00:11:38,448 --> 00:11:42,452
The way that I was portrayed
in the experiment by Zimbardo
248
00:11:42,452 --> 00:11:45,204
was that I step
into this evil environment
249
00:11:45,204 --> 00:11:47,165
and suddenly I become
this evil character,
250
00:11:47,165 --> 00:11:50,710
when in fact before I stepped
into the experiment,
251
00:11:50,710 --> 00:11:53,671
I stay outside
and I get in character.
252
00:11:53,671 --> 00:11:57,467
And when I'm ready, I enter,
and now I am my character.
253
00:12:05,433 --> 00:12:07,352
[Zimbardo] You see how quickly
the good boys
254
00:12:07,352 --> 00:12:08,811
become brutal guards.
255
00:12:13,107 --> 00:12:14,859
[Eshleman] It's like
it never occurred to them,
256
00:12:14,859 --> 00:12:17,195
like they've never been
to improv class.
257
00:12:23,076 --> 00:12:24,535
[Prisoner] One, two, three...
258
00:12:24,535 --> 00:12:26,955
[Eshleman] So, I decided
I would become Strother Martin
259
00:12:26,955 --> 00:12:28,748
from Cool Hand Luke.
260
00:12:28,748 --> 00:12:31,417
He said, "You know, man,
I can be a nice guy..."
261
00:12:31,417 --> 00:12:34,045
[Strother Martin] Or I can be
one real mean son of a bitch.
262
00:12:34,045 --> 00:12:36,881
[Eshleman] I would bang
my nightstick against my hand.
263
00:12:36,881 --> 00:12:40,260
I would say,
"I'm gonna hit you so hard,
264
00:12:40,260 --> 00:12:43,596
it's gonna kill
your whole family."
265
00:12:43,596 --> 00:12:44,973
And I kept a straight face.
266
00:12:47,058 --> 00:12:48,518
[Jerry Shue]
He was pretty convincing.
267
00:12:48,518 --> 00:12:49,435
[chuckles]
268
00:12:51,896 --> 00:12:54,941
[John Loftus] "What we have here
is a failure to communicate."
269
00:12:54,941 --> 00:12:56,567
Prisoners thought
it was hilarious.
270
00:13:01,155 --> 00:13:02,657
[Eshleman]
I never broke character.
271
00:13:02,657 --> 00:13:04,409
That would have destroyed
the whole effect
272
00:13:04,409 --> 00:13:05,827
that I was going for.
273
00:13:05,827 --> 00:13:07,203
I like to stir it up.
274
00:13:07,203 --> 00:13:09,831
You know, I was kind of a jerk
in that way, frankly.
275
00:13:09,831 --> 00:13:12,542
[Zimbardo] He was creative
in his evil.
276
00:13:12,542 --> 00:13:16,087
He would think of very
ingenious ways to degrade,
277
00:13:16,087 --> 00:13:17,296
to demean the prisoners.
278
00:13:18,715 --> 00:13:20,466
[Eshleman] A lot of
the behaviors that appeared
279
00:13:20,466 --> 00:13:23,302
in the experiment,
maybe given a couple of months,
280
00:13:23,302 --> 00:13:24,971
you would have seen
the same thing happen,
281
00:13:24,971 --> 00:13:27,181
I just made them happen
in a couple of days.
282
00:13:27,181 --> 00:13:31,269
I was very sympathetic to what
I thought their goals were,
283
00:13:31,269 --> 00:13:35,106
and that was to prove that
prison was a lousy place.
284
00:13:35,106 --> 00:13:36,899
Turn it around. Push-ups.
285
00:13:36,899 --> 00:13:39,027
So, I may have made
an impact on, you know,
286
00:13:39,027 --> 00:13:43,698
the understanding of
human nature with my acting job.
287
00:13:43,698 --> 00:13:45,533
Thank you, 416.
[bang]
288
00:13:45,533 --> 00:13:48,619
-Over there.
-Oh, that's perfect.
289
00:13:48,619 --> 00:13:51,664
[Korpi] Not all the guards
became sadists; only some did.
290
00:13:51,664 --> 00:13:54,792
Phil makes it into like
everybody became a sadist.
291
00:13:54,792 --> 00:13:56,085
[Zimbardo] He became a sadist.
292
00:13:56,085 --> 00:13:57,587
Guards behaved sadistically.
293
00:13:57,587 --> 00:13:59,672
What it brang out was the worst.
294
00:13:59,672 --> 00:14:01,758
Cruel, sadistic, dominating.
295
00:14:01,758 --> 00:14:03,885
Going way beyond the rules
that they had set.
296
00:14:03,885 --> 00:14:05,261
[Korpi] You had
different feelings
297
00:14:05,261 --> 00:14:06,637
toward different guards.
298
00:14:06,637 --> 00:14:08,931
Some of you were worse [bleep]
than the others.
299
00:14:08,931 --> 00:14:12,393
You're all [bleep],
but you're a really bad [bleep],
300
00:14:12,393 --> 00:14:13,728
you're a medium [bleep].
301
00:14:13,728 --> 00:14:16,856
Some were really, really
deeply into their [bleep].
302
00:14:16,856 --> 00:14:19,400
[Zimbardo] These good guards
were totally into the role
303
00:14:19,400 --> 00:14:22,236
of being sadistic,
controlling, dominant.
304
00:14:22,236 --> 00:14:23,571
[Korpi]
They're different people.
305
00:14:23,571 --> 00:14:26,783
But Phil doesn't think
they're different.
306
00:14:26,783 --> 00:14:32,205
♪ ♪
307
00:14:32,205 --> 00:14:37,126
[Mark] Professor Zimbardo truly
believes that there is an evil
308
00:14:37,126 --> 00:14:39,921
that lurks in every human.
309
00:14:39,921 --> 00:14:44,592
He was using this experiment
to make his point.
310
00:14:44,592 --> 00:14:46,886
He said it's part
of human nature.
311
00:14:46,886 --> 00:14:50,515
It wasn't even true
of the nine guards.
312
00:14:52,600 --> 00:14:54,477
I didn't like being
a guard at all.
313
00:14:54,477 --> 00:14:59,690
I had a lot of compassion
for the prisoners.
314
00:14:59,690 --> 00:15:00,817
[ding]
315
00:15:00,817 --> 00:15:03,569
When I had been at Stanford
in France,
316
00:15:03,569 --> 00:15:06,405
I was smoking pot every day.
317
00:15:06,405 --> 00:15:09,534
We smoked openly on the campus.
318
00:15:09,534 --> 00:15:14,080
My first quarter, I took
Buddhism, Confucianism,
319
00:15:14,080 --> 00:15:19,168
and Daoism, Jewish mysticism,
and Chinese art,
320
00:15:19,168 --> 00:15:23,631
but I also just absorbed
what I could absorb.
321
00:15:23,631 --> 00:15:28,386
I'd had very close encounters
with getting arrested for hash,
322
00:15:28,386 --> 00:15:31,848
so when I saw
this prison experiment,
323
00:15:31,848 --> 00:15:34,058
it just really felt like
it'd be a way
324
00:15:34,058 --> 00:15:37,854
to have a cathartic experience
of what it would have been like
325
00:15:37,854 --> 00:15:40,273
had I been incarcerated.
326
00:15:40,273 --> 00:15:43,568
When I was a guard,
I thought pretty much every day
327
00:15:43,568 --> 00:15:47,989
about giving some joints
to the prisoners.
328
00:15:47,989 --> 00:15:50,449
I wanted to be one of them.
329
00:15:50,449 --> 00:15:52,869
[Guard] Three.
[Prisoners] Three.
330
00:15:52,869 --> 00:15:54,036
[Guard] Four.
[Prisoners] Four.
331
00:15:54,036 --> 00:15:55,121
[Guard] Too slow!
332
00:15:55,121 --> 00:15:56,122
[Guard] Five.
[Prisoners] Five.
333
00:15:56,122 --> 00:15:57,874
[Mark] One day, the warden,
334
00:15:57,874 --> 00:16:00,960
who was a graduate student
of Professor Zimbardo,
335
00:16:00,960 --> 00:16:05,006
took me aside and he said
that they had noticed
336
00:16:05,006 --> 00:16:08,551
that I wasn't acting tough.
337
00:16:23,900 --> 00:16:25,902
[Mark] They thought I was
kind of laying back
338
00:16:25,902 --> 00:16:28,696
and kind of letting others
take the lead,
339
00:16:28,696 --> 00:16:32,200
and I was pretty neutral.
340
00:16:32,200 --> 00:16:33,659
All of which was true.
341
00:16:39,582 --> 00:16:42,043
[Mark] He kind of gave me
a pep talk about how I needed
342
00:16:42,043 --> 00:16:46,547
to help out my fellow guards and
how to help out the experiment.
343
00:16:55,556 --> 00:16:58,726
[Mark] They did try and
interfere with the way
344
00:16:58,726 --> 00:17:00,186
the experiment was going
345
00:17:00,186 --> 00:17:03,773
and they tried to mold it
to their expectations.
346
00:17:17,078 --> 00:17:19,163
[Mark] They could say
whatever they wanted to say,
347
00:17:19,163 --> 00:17:22,458
but I wasn't gonna
change anything.
348
00:17:22,458 --> 00:17:24,168
[Le Texier]
In a real scientific experiment,
349
00:17:24,168 --> 00:17:28,381
you would not intervene to
produce the results you want.
350
00:17:28,381 --> 00:17:30,841
And you can see
the experimenters
351
00:17:30,841 --> 00:17:33,594
really putting their hands
into the material
352
00:17:33,594 --> 00:17:35,471
that they're supposed
just to watch.
353
00:17:35,471 --> 00:17:37,974
This is bad
scientific methodology.
354
00:17:37,974 --> 00:17:40,101
Unfortunately,
when you produce science
355
00:17:40,101 --> 00:17:43,688
just in order to make
something striking,
356
00:17:43,688 --> 00:17:47,358
that's when you're
on a slippery slope.
357
00:17:47,358 --> 00:17:51,404
With his team, Zimbardo was
not behaving as a scientist.
358
00:17:51,404 --> 00:17:53,030
He was behaving as someone
359
00:17:53,030 --> 00:17:56,492
who really wanted
to shock the world.
360
00:17:59,370 --> 00:18:03,040
What's more is that the ethical
dimension of the experiment
361
00:18:03,040 --> 00:18:05,293
wasn't questioned at all.
362
00:18:05,293 --> 00:18:08,421
You have the feeling that
the prisoners could get out
363
00:18:08,421 --> 00:18:11,048
at any moment,
which was not true.
364
00:18:11,048 --> 00:18:14,552
Actually, the participants
had signed a document saying
365
00:18:14,552 --> 00:18:17,179
that they could only get out
366
00:18:17,179 --> 00:18:21,809
with the authorization
of Zimbardo.
367
00:18:21,809 --> 00:18:26,105
One prisoner, Doug Korpi, is
trying to rebel and get out,
368
00:18:26,105 --> 00:18:28,983
and Zimbardo tells him
he cannot leave.
369
00:18:28,983 --> 00:18:30,776
[Korpi] Zimbardo told me
that I couldn't
370
00:18:30,776 --> 00:18:32,820
get out of the experiment
with a tummy ache.
371
00:18:32,820 --> 00:18:36,073
All I'm thinking is,
"[bleep], this is for real.
372
00:18:36,073 --> 00:18:38,159
These people don't
want to let me out."
373
00:18:38,159 --> 00:18:39,368
And I'm pissed off.
374
00:18:48,044 --> 00:18:49,920
[Korpi] They threw me
in the hole.
375
00:18:49,920 --> 00:18:52,715
[door slams]
376
00:18:52,715 --> 00:18:55,259
I was in this closet...
377
00:18:55,259 --> 00:18:56,510
[Guard] Don't let him in.
378
00:18:56,510 --> 00:18:58,971
[Korpi] ...on my side,
lying down
379
00:18:58,971 --> 00:19:00,723
in the dark.
380
00:19:00,723 --> 00:19:03,851
And I remember being really
like, "Oh, [bleep].
381
00:19:03,851 --> 00:19:06,771
I can't get out
of this experiment."
382
00:19:06,771 --> 00:19:09,690
Somebody had control
over my life.
383
00:19:09,690 --> 00:19:11,442
I'm in this emotional state.
384
00:19:11,442 --> 00:19:12,443
[bleep]
385
00:19:12,443 --> 00:19:15,196
And then I got desperate.
386
00:19:15,196 --> 00:19:16,947
I had to be creative.
387
00:19:20,409 --> 00:19:25,748
And that's when I decided
I had to act like I was crazy.
388
00:19:25,748 --> 00:19:27,792
I thought, well, there's
three things you need to do.
389
00:19:27,792 --> 00:19:29,919
You need to be agitated
in your body,
390
00:19:29,919 --> 00:19:33,881
you needed to be screaming, and
you needed to generate tears.
391
00:19:33,881 --> 00:19:36,592
I want out! I want out!
I want out!
392
00:19:36,592 --> 00:19:39,345
All I was thinking was,
keep this up, keep this up.
393
00:19:39,345 --> 00:19:41,514
Try not to give in
too quickly to saying,
394
00:19:41,514 --> 00:19:43,599
"Oh, I'm faking it to get out."
395
00:19:43,599 --> 00:19:46,102
You don't just sit and be upset.
396
00:19:46,102 --> 00:19:47,978
You, you have to act upset.
397
00:19:47,978 --> 00:19:50,439
I was acting agitated.
398
00:19:50,439 --> 00:19:53,275
And it's a lot of work.
399
00:19:53,275 --> 00:19:56,570
Somehow traumatizing
to, to rev yourself up
400
00:19:56,570 --> 00:19:58,531
into that mental state.
401
00:19:58,531 --> 00:20:01,200
It was yucky.
402
00:20:01,200 --> 00:20:03,911
[Zimbardo] In 36 hours,
the first prisoner
403
00:20:03,911 --> 00:20:05,496
had an emotional breakdown.
404
00:20:05,496 --> 00:20:08,416
[Eshleman] If you listen to the
tape of the mental breakdown...
405
00:20:11,001 --> 00:20:14,422
[Eshleman] It does sound a
little suspect, I'll be honest.
406
00:20:14,422 --> 00:20:17,883
[Prisoner Actor] I'm burning up
inside, don't you know that?
407
00:20:17,883 --> 00:20:19,635
This door isn't locked.
408
00:20:19,635 --> 00:20:22,304
[laughter]
409
00:20:22,304 --> 00:20:24,849
So, if I touch this,
it doesn't lock?
410
00:20:24,849 --> 00:20:27,518
[Guard Actor] No, I don't think
it, I don't think it does. No.
411
00:20:27,518 --> 00:20:29,228
[Prisoner Actor]
So, I should hit this one?
412
00:20:29,228 --> 00:20:30,771
[Producer] Starting on, he says,
413
00:20:30,771 --> 00:20:32,857
"At some point,
I was lying down."
414
00:20:32,857 --> 00:20:35,359
Just at some point,
you're lying down.
415
00:20:35,359 --> 00:20:36,360
[Prisoner Actor] You want me
to just start lying down?
416
00:20:36,360 --> 00:20:37,736
[Producer] Yeah.
417
00:20:37,736 --> 00:20:39,155
[Guard Actor] In the action,
you just woke up, but--
418
00:20:39,155 --> 00:20:41,490
[Prisoner Actor]
Okay. Sounds great. Yeah.
419
00:20:41,490 --> 00:20:48,038
♪ ♪
420
00:20:48,038 --> 00:20:52,626
♪ ♪
421
00:20:52,626 --> 00:20:59,633
♪ ♪
422
00:20:59,633 --> 00:21:03,053
[Le Texier] Zimbardo explained
that he had a lot of material,
423
00:21:03,053 --> 00:21:05,973
but actually,
he recorded only six hours
424
00:21:05,973 --> 00:21:10,936
out of over 150 hours of the experiment.
425
00:21:10,936 --> 00:21:13,189
He recorded mostly
the spectacular,
426
00:21:13,189 --> 00:21:16,692
the most violent parts
of the experiment.
427
00:21:16,692 --> 00:21:20,446
This is also the way Zimbardo
wanted it to happen.
428
00:21:20,446 --> 00:21:21,822
[Guard Actor] Come on out.
429
00:21:21,822 --> 00:21:22,865
[Stephen Scott-Bottoms]
One of the things that most
430
00:21:22,865 --> 00:21:24,784
interested me
about this experiment
431
00:21:24,784 --> 00:21:27,536
is that what you have is people
who are being asked to play
432
00:21:27,536 --> 00:21:29,163
the roles of guards
and prisoners
433
00:21:29,163 --> 00:21:31,207
in a simulated setting.
434
00:21:31,207 --> 00:21:34,126
[Producer] This is cell two
and this is cell three.
435
00:21:34,126 --> 00:21:36,670
This is the cell room...
436
00:21:36,670 --> 00:21:38,506
[Scott-Bottoms] And this
basically, immediately
437
00:21:38,506 --> 00:21:42,009
is comparable to any number
of improvisation exercises
438
00:21:42,009 --> 00:21:44,428
that happen in acting workshops.
439
00:21:44,428 --> 00:21:45,888
[Producer]
And see what we want to happen.
440
00:21:45,888 --> 00:21:47,348
[Scott-Bottoms]
Which is a bit like saying
441
00:21:47,348 --> 00:21:49,683
is there a difference between
experiment and experience,
442
00:21:49,683 --> 00:21:52,144
which are almost the same word.
443
00:21:52,144 --> 00:21:53,687
[Guard Actor]
Guess we'll never know!
444
00:21:53,687 --> 00:21:56,232
[Scott-Bottoms] In French, they
are the same word, expérience.
445
00:21:56,232 --> 00:21:58,192
And what really interests me
446
00:21:58,192 --> 00:22:00,402
about these
behavioral experiments
447
00:22:00,402 --> 00:22:04,073
is that the way they get
written about by the scientists
448
00:22:04,073 --> 00:22:08,327
is always from this completely
objective, distant perspective.
449
00:22:08,327 --> 00:22:10,788
The subject behaved
in this manner,
450
00:22:10,788 --> 00:22:13,207
as if they're rats in a maze.
451
00:22:13,207 --> 00:22:15,793
Nobody's actually asking
the rats what they think
452
00:22:15,793 --> 00:22:19,129
is happening, because that's
not how scientists do it.
453
00:22:19,129 --> 00:22:23,384
♪ ♪
454
00:22:32,852 --> 00:22:37,439
[Korpi] This is all B-roll.
455
00:22:37,439 --> 00:22:39,900
[Producer] How do you think
the experiment has affected Doug
456
00:22:39,900 --> 00:22:42,361
over the years?
457
00:22:42,361 --> 00:22:45,823
[Theresa Hanna] He gets some
gratification out of being able
458
00:22:45,823 --> 00:22:49,952
to tell his side of the story,
because his side of the story
459
00:22:49,952 --> 00:22:52,705
is quite different from
Zimbardo's side of the story.
460
00:22:52,705 --> 00:22:54,123
And if Zimbardo had his way,
461
00:22:54,123 --> 00:22:57,167
no one would say anything
but what Zimbardo proposes.
462
00:22:57,167 --> 00:22:58,377
It gives you some power
463
00:22:58,377 --> 00:23:02,214
over what was a terribly
disempowered situation.
464
00:23:02,214 --> 00:23:05,634
The fact that one of the
prisoners had a mental breakdown
465
00:23:05,634 --> 00:23:08,679
because of the "treatment"
of the guards,
466
00:23:08,679 --> 00:23:12,474
it gave the whole experiment
a huge pop, you know?
467
00:23:12,474 --> 00:23:17,354
It's like, "Oh, look,
even after five hours
468
00:23:17,354 --> 00:23:21,358
of very mild oppression,
469
00:23:21,358 --> 00:23:24,486
we can cause mental breakdowns
in people."
470
00:23:24,486 --> 00:23:26,322
Well...
471
00:23:26,322 --> 00:23:27,406
[Korpi] Why didn't he understand
472
00:23:27,406 --> 00:23:29,867
I was just trying to get
out of a bad job?
473
00:23:29,867 --> 00:23:31,035
[Hanna]
Because he didn't want to.
474
00:23:31,035 --> 00:23:32,786
That didn't fit his narrative.
475
00:23:32,786 --> 00:23:35,623
His girlfriend at the time said,
"This is out of control."
476
00:23:35,623 --> 00:23:38,876
And he goes,
"Oh. Oh. Yeah, okay.
477
00:23:38,876 --> 00:23:41,337
You mean I can't have them
kill each other?"
478
00:23:41,337 --> 00:23:42,463
No, you can't.
479
00:23:42,463 --> 00:23:44,381
So that's when he finally--
480
00:23:44,381 --> 00:23:49,386
But I think up till that moment,
he was, he was in heaven.
481
00:23:49,386 --> 00:23:51,305
[Korpi] Yeah. He really was.
482
00:23:55,559 --> 00:23:57,061
[Producer] What is
your understanding
483
00:23:57,061 --> 00:23:59,730
as to why the experiment
ended early?
484
00:23:59,730 --> 00:24:02,066
[Eshleman] What I'd been told
was that Christina Maslach,
485
00:24:02,066 --> 00:24:05,653
after witnessing my supposed
magical transformation
486
00:24:05,653 --> 00:24:08,864
from normal kid
to sadistic bastard,
487
00:24:08,864 --> 00:24:11,116
that she went to Philip
and demanded
488
00:24:11,116 --> 00:24:13,452
that he stop torturing
these boys
489
00:24:13,452 --> 00:24:14,912
and call an end
to the experiment.
490
00:24:14,912 --> 00:24:18,082
[Shue] Probably there was
good reason to shut it down,
491
00:24:18,082 --> 00:24:21,919
not to mention that his
girlfriend chastised him
492
00:24:21,919 --> 00:24:23,337
for doing this.
493
00:24:23,337 --> 00:24:24,797
That made a good headline.
494
00:24:24,797 --> 00:24:26,674
[Christina Maslach] I just began
to feel sick to my stomach.
495
00:24:26,674 --> 00:24:30,427
I had this just chilling,
sickening feeling
496
00:24:30,427 --> 00:24:32,554
of watching this,
and I just turned away.
497
00:24:32,554 --> 00:24:35,224
[Guard Actor]
Down, up, down, up.
498
00:24:35,224 --> 00:24:36,725
[Zimbardo] She came down,
saw that madhouse and said,
499
00:24:36,725 --> 00:24:39,561
"You know what? It's terrible
what you're doing to those boys,
500
00:24:39,561 --> 00:24:41,313
and you are responsible."
501
00:24:41,313 --> 00:24:43,023
And I ended the study
the next day.
502
00:24:43,023 --> 00:24:45,067
The good news is
I married her the next year.
503
00:24:45,067 --> 00:24:47,986
[laughter]
504
00:24:47,986 --> 00:24:49,988
[Shue] The woman with fresh eyes
505
00:24:49,988 --> 00:24:54,201
saw the inhumanity
and woke him up.
506
00:24:54,201 --> 00:24:55,369
[chuckles] God.
507
00:24:55,369 --> 00:24:57,162
Wonder if he scripted that, too.
508
00:24:57,162 --> 00:24:59,790
[Le Texier] It wasn't until
15 years after the experiment
509
00:24:59,790 --> 00:25:01,959
that Zimbardo created
this narrative,
510
00:25:01,959 --> 00:25:05,421
saying that Christina made
the experiment stop.
511
00:25:05,421 --> 00:25:08,757
He wanted drama
and a Hollywood ending.
512
00:25:08,757 --> 00:25:11,760
[Clay Ramsay] At first,
Zimbardo had implied
513
00:25:11,760 --> 00:25:17,850
that this had gone overboard,
so we reined it in.
514
00:25:17,850 --> 00:25:22,271
It was clear that this was not
a structure built to last.
515
00:25:22,271 --> 00:25:26,066
It was built to stay hanging
there for long enough
516
00:25:26,066 --> 00:25:29,570
to get certain kinds
of images of behavior.
517
00:25:29,570 --> 00:25:31,071
That was its purpose.
518
00:25:31,071 --> 00:25:35,784
He now had a lot of material,
a lot of video, a lot of audio,
519
00:25:35,784 --> 00:25:38,996
and he might just have gone,
"This is good."
520
00:25:38,996 --> 00:25:42,458
[tape rewinding]
521
00:25:42,458 --> 00:25:44,126
[beep]
522
00:25:44,126 --> 00:25:46,628
[Scott-Bottoms] The very first
version of the story
523
00:25:46,628 --> 00:25:50,591
that he tried to tell
was the version that says,
524
00:25:50,591 --> 00:25:54,887
"Prisoners are badly affected
by prison conditions.
525
00:25:54,887 --> 00:25:57,598
Look what happened to our
prisoners inside a week."
526
00:25:57,598 --> 00:25:59,808
And there's a truth to that,
but it didn't sell,
527
00:25:59,808 --> 00:26:01,810
because a lot of people
turned around and said,
528
00:26:01,810 --> 00:26:03,896
"Why did you do that
to those kids?"
529
00:26:03,896 --> 00:26:09,610
And so, almost immediately
you get the Attica Uprising,
530
00:26:09,610 --> 00:26:12,237
the bloodiest prison riot
in American history,
531
00:26:12,237 --> 00:26:14,907
which happened the month
after the Stanford Experiment.
532
00:26:14,907 --> 00:26:17,701
[Newscaster] Nine hostages
and 28 convicts were killed,
533
00:26:17,701 --> 00:26:20,287
perhaps 100 convicts injured,
534
00:26:20,287 --> 00:26:23,415
as the authorities moved in
with force to bring an end
535
00:26:23,415 --> 00:26:25,793
to a revolt which started
last Thursday.
536
00:26:25,793 --> 00:26:28,045
[Scott-Bottoms] I don't think we
would be talking about this now
537
00:26:28,045 --> 00:26:30,839
if not for Attica,
which made the prisons
538
00:26:30,839 --> 00:26:33,383
this enormous thing
in the popular culture.
539
00:26:33,383 --> 00:26:34,760
[crowd chanting]
540
00:26:34,760 --> 00:26:37,096
We want to know why the guards
and the state troopers
541
00:26:37,096 --> 00:26:39,473
have gone in there
with shotguns and tear gas
542
00:26:39,473 --> 00:26:41,141
and killed that many prisoners.
543
00:26:41,141 --> 00:26:42,434
What could be going on there?
544
00:26:42,434 --> 00:26:43,811
[Protester] What about
the thousands of prisoners
545
00:26:43,811 --> 00:26:44,978
who die all over the country?
546
00:26:44,978 --> 00:26:46,146
[Scott-Bottoms]
Zimbardo realized that
547
00:26:46,146 --> 00:26:48,232
it's a much better way
to spin the story
548
00:26:48,232 --> 00:26:49,566
if you look at
what the guards did,
549
00:26:49,566 --> 00:26:51,276
so let's tell the story
that way.
550
00:26:51,276 --> 00:26:52,528
[Zimbardo]
Guards have total power.
551
00:26:52,528 --> 00:26:53,904
Getting away with murder.
Evil of prison.
552
00:26:53,904 --> 00:26:56,281
Prison. Prison. Prisons.
553
00:26:56,281 --> 00:26:57,783
[Scott-Bottoms] This is what
Zimbardo is brilliant at
554
00:26:57,783 --> 00:27:00,828
is finding an angle in order
to get people to talk about
555
00:27:00,828 --> 00:27:03,455
the thing underneath it
that matters.
556
00:27:05,082 --> 00:27:09,044
[applause]
557
00:27:09,044 --> 00:27:10,963
[Zimbardo] We wanted to ask
the question what happens
558
00:27:10,963 --> 00:27:12,881
if you take people,
all of whom are good,
559
00:27:12,881 --> 00:27:14,675
and put them in a bad place?
560
00:27:14,675 --> 00:27:17,427
Does the goodness of people
dominate the bad place,
561
00:27:17,427 --> 00:27:21,140
or does the bad situation come
to corrupt the good people?
562
00:27:21,140 --> 00:27:23,517
And the sad answer
was the latter.
563
00:27:23,517 --> 00:27:25,435
[Karl Van Orsdol] I did not
believe that we were part
564
00:27:25,435 --> 00:27:26,979
of the experimental subjects.
565
00:27:26,979 --> 00:27:30,691
I believed we were helping
implement the experiment.
566
00:27:30,691 --> 00:27:32,568
[Chuck Burton] You know,
they never said anything about
567
00:27:32,568 --> 00:27:34,111
that they're studying us, too.
568
00:27:34,111 --> 00:27:36,613
That doesn't excuse
lots of my behavior,
569
00:27:36,613 --> 00:27:39,074
but it's a backdrop
for some of my behavior.
570
00:27:39,074 --> 00:27:41,451
[Eshleman]
If it was made clear to us
571
00:27:41,451 --> 00:27:45,998
that we're there to be studied,
not to help study prisoners,
572
00:27:45,998 --> 00:27:49,001
we would have acted
completely differently.
573
00:27:49,001 --> 00:27:50,294
[Le Texier] In the archive,
574
00:27:50,294 --> 00:27:53,505
several guards say they've been
shocked to discover
575
00:27:53,505 --> 00:27:57,050
that they were a subject
in the experiment.
576
00:27:57,050 --> 00:27:59,887
Zimbardo, he pretended
that they were not observed.
577
00:27:59,887 --> 00:28:01,013
[Guard] It's been a pleasure.
578
00:28:01,013 --> 00:28:02,848
[Le Texier]
And the guards, they thought
579
00:28:02,848 --> 00:28:05,017
they were a part
of the experimenters,
580
00:28:05,017 --> 00:28:09,062
they were not part of
the subjects, which is a lie.
581
00:28:09,062 --> 00:28:11,940
They thought they were
like theater actors,
582
00:28:11,940 --> 00:28:14,735
and the experiment was not
about their behavior,
583
00:28:14,735 --> 00:28:17,571
it was about
the prisoners' reactions.
584
00:28:17,571 --> 00:28:21,033
But they were part
of the experiment, too.
585
00:28:23,243 --> 00:28:27,414
He had this agenda, let's make
something very striking,
586
00:28:27,414 --> 00:28:30,208
very graphic, very extreme.
587
00:28:30,208 --> 00:28:32,836
Let's go to the media
with this argument.
588
00:28:32,836 --> 00:28:35,797
The Stanford Prison Experiment
was like a media event.
589
00:28:35,797 --> 00:28:39,301
It was not planned really
as a scientific experiment.
590
00:28:39,301 --> 00:28:41,762
It was more like
producing material
591
00:28:41,762 --> 00:28:44,890
that could be used
by the journalist.
592
00:28:47,976 --> 00:28:49,645
The media was
a central character
593
00:28:49,645 --> 00:28:53,857
in the aftermath
of the Stanford Experiment.
594
00:28:56,777 --> 00:28:58,862
Right after the end
of the experiment
595
00:28:58,862 --> 00:29:00,781
Zimbardo was invited on KRON-TV,
596
00:29:00,781 --> 00:29:03,367
which was a local station
in San Francisco.
597
00:29:03,367 --> 00:29:07,829
And there he met a TV producer,
Larry Goldstein.
598
00:29:07,829 --> 00:29:12,459
He had the idea of doing a piece
on the experiment.
599
00:29:12,459 --> 00:29:16,046
Zimbardo is telling him that
there have been ups and downs,
600
00:29:16,046 --> 00:29:18,715
but what Larry Goldstein
would like to show
601
00:29:18,715 --> 00:29:20,842
is the situation building up,
602
00:29:20,842 --> 00:29:23,220
the violence
growing and growing,
603
00:29:23,220 --> 00:29:26,848
and so he's pushing Zimbardo
to change his narrative
604
00:29:26,848 --> 00:29:29,351
and to have a narrative
which is more linear.
605
00:29:41,613 --> 00:29:43,782
[Zimbardo] On day two,
the prisoners rebelled.
606
00:29:46,535 --> 00:29:49,204
[Zimbardo] By day three,
the guards were exerting control
607
00:29:49,204 --> 00:29:51,164
over every aspect
of the prisoners' lives.
608
00:29:56,378 --> 00:29:58,046
[Zimbardo]
Only four days had passed,
609
00:29:58,046 --> 00:30:01,883
and the guards' sense of moral
value had been totally altered.
610
00:30:01,883 --> 00:30:03,343
Things had gotten so bad
611
00:30:03,343 --> 00:30:05,470
that we no longer had
a group of prisoners.
612
00:30:05,470 --> 00:30:09,057
What we had were individuals
struggling for survival.
613
00:30:09,057 --> 00:30:10,851
[Le Texier]
He's streamlining the story.
614
00:30:10,851 --> 00:30:12,728
And that's how, in November,
615
00:30:12,728 --> 00:30:16,440
they screened the piece on
the Stanford Prison Experiment.
616
00:30:16,440 --> 00:30:21,903
It gave a national publicity
to the experiment.
617
00:30:21,903 --> 00:30:26,491
[Ramsay] Some of this video
is just really compelling.
618
00:30:26,491 --> 00:30:32,789
It has the feel of pornography
without really going there,
619
00:30:32,789 --> 00:30:36,126
and it has a moral justification
620
00:30:36,126 --> 00:30:38,420
for why you're watching
this thing.
621
00:30:38,420 --> 00:30:40,797
And that kind of combination
622
00:30:40,797 --> 00:30:46,178
becomes one of the ancestors
of reality TV.
623
00:30:46,178 --> 00:30:48,346
[Korpi] This was unscripted,
free flowing.
624
00:30:48,346 --> 00:30:49,556
It was creative.
625
00:30:49,556 --> 00:30:50,974
It was a great idea.
626
00:30:50,974 --> 00:30:52,768
And he knew it would get press.
627
00:30:52,768 --> 00:30:54,227
[Eshleman]
This particular experiment
628
00:30:54,227 --> 00:30:56,313
really made a name for him,
629
00:30:56,313 --> 00:30:58,690
and I think that
he basks in the limelight.
630
00:30:58,690 --> 00:31:01,026
[applause]
631
00:31:01,026 --> 00:31:03,945
He really built
his career on this.
632
00:31:03,945 --> 00:31:05,655
[Korpi] Zimbardo,
for years and years,
633
00:31:05,655 --> 00:31:07,699
told this story to the media.
634
00:31:07,699 --> 00:31:10,327
[John Davidson] Please welcome
Dr. Philip Zimbardo.
635
00:31:10,327 --> 00:31:13,830
[applause]
636
00:31:13,830 --> 00:31:18,210
[Zimbardo] The guards then began
to escalate their use of power.
637
00:31:18,210 --> 00:31:21,505
Brutal, sadistic. Some of them
creatively sadistic.
638
00:31:21,505 --> 00:31:23,965
They were often here
with no clothes on.
639
00:31:23,965 --> 00:31:25,675
You can't see someone's eyes.
640
00:31:25,675 --> 00:31:27,886
This prison is about to erupt.
641
00:31:27,886 --> 00:31:29,721
[Korpi] He's a genius
at figuring out
642
00:31:29,721 --> 00:31:32,974
what will attract
media attention.
643
00:31:32,974 --> 00:31:34,476
[Zimbardo]
In this demonstration,
644
00:31:34,476 --> 00:31:37,312
we want to see whether ordinary
people blindly obey someone
645
00:31:37,312 --> 00:31:38,730
who seems like an authority
646
00:31:38,730 --> 00:31:41,817
when he asks them to do
something potentially dangerous.
647
00:31:41,817 --> 00:31:44,027
[taser clicking, yelling]
648
00:31:44,027 --> 00:31:45,403
What just happened?
649
00:31:45,403 --> 00:31:47,447
[Announcer] What makes
good people do bad things?
650
00:31:47,447 --> 00:31:48,532
[applause]
651
00:31:48,532 --> 00:31:49,908
[Dr. Phil]
You've been at this a while.
652
00:31:49,908 --> 00:31:53,078
[Zimbardo] I've been creating
evil for a long time.
653
00:31:58,083 --> 00:31:59,126
[Holt] Welcome back, everyone.
Right now at MSNBC,
654
00:31:59,126 --> 00:32:01,002
charges of abuse
of Iraqi prisoners.
655
00:32:01,002 --> 00:32:03,171
[Newscaster] These powerful
images lie at the heart
656
00:32:03,171 --> 00:32:05,132
of the prisoner abuse scandal.
657
00:32:05,132 --> 00:32:08,176
Naked Iraqis forced to create
a human pyramid
658
00:32:08,176 --> 00:32:10,929
while US soldiers
look on grinning.
659
00:32:10,929 --> 00:32:12,514
[Goodman] The pictures
were leaked to the press
660
00:32:12,514 --> 00:32:16,101
and first revealed to the world
in May of 2004.
661
00:32:16,101 --> 00:32:17,394
[Holt] Many of us are just
trying to figure out
662
00:32:17,394 --> 00:32:19,521
what happened to make
apparently normal soldiers
663
00:32:19,521 --> 00:32:21,565
to do the things
depicted in the pictures.
664
00:32:21,565 --> 00:32:23,108
[Anderson Cooper] You might
wonder what kind of person
665
00:32:23,108 --> 00:32:25,193
could participate
in those situations.
666
00:32:25,193 --> 00:32:28,864
[Ramsay] When the news came out
about the Abu Ghraib prison,
667
00:32:28,864 --> 00:32:32,284
Zimbardo implied that Abu Ghraib
668
00:32:32,284 --> 00:32:35,495
was the replication
of his experiment.
669
00:32:35,495 --> 00:32:37,914
[Zimbardo] I had seen
those images 35 years ago.
670
00:32:37,914 --> 00:32:40,458
My good guards did that
in our mock prison
671
00:32:40,458 --> 00:32:42,294
the same way these
Army reservists did it
672
00:32:42,294 --> 00:32:45,213
in Tier 1A Abu Ghraib.
673
00:32:45,213 --> 00:32:47,674
[Shue] That might be
a little bit of a stretch.
674
00:32:47,674 --> 00:32:51,386
[Loftus] If we were killing
other students,
675
00:32:51,386 --> 00:32:54,973
that might be a valid judgment.
676
00:32:54,973 --> 00:32:59,394
[Stephen Reicher] Zimbardo acted
as witness for the defense.
677
00:32:59,394 --> 00:33:02,355
[Goodman] You testified
in the court martial
678
00:33:02,355 --> 00:33:03,940
in the Abu Ghraib scandal.
679
00:33:03,940 --> 00:33:05,901
[Zimbardo] Yes, I did.
I was part of the defense team.
680
00:33:05,901 --> 00:33:08,069
Coming to the defense
of Chip Frederick,
681
00:33:08,069 --> 00:33:10,322
what that gave me
the opportunity to say,
682
00:33:10,322 --> 00:33:11,823
"Were these bad apples
683
00:33:11,823 --> 00:33:15,035
or were these good American
soldiers put in a bad barrel?"
684
00:33:15,035 --> 00:33:18,246
Before Chip Frederick went down
in that dungeon in Abu Ghraib,
685
00:33:18,246 --> 00:33:22,083
he was as normal, healthy
as the good guards in my study.
686
00:33:22,083 --> 00:33:26,004
And within several weeks,
he was as bad as the worst ones.
687
00:33:29,090 --> 00:33:31,301
[Korpi] At Abu Ghraib,
politically he's saying,
688
00:33:31,301 --> 00:33:35,096
"Look, it's not the individual,
it's the [bleep] institution."
689
00:33:37,807 --> 00:33:40,852
[Eshleman] Does it excuse any
bad behavior you may have done?
690
00:33:40,852 --> 00:33:42,896
You know, beyond my control,
691
00:33:42,896 --> 00:33:45,398
you know, I was,
I was following orders.
692
00:33:45,398 --> 00:33:47,317
[Reicher]
Now, we have this narrative,
693
00:33:47,317 --> 00:33:50,737
the scenes of sort
of homoerotic humiliation
694
00:33:50,737 --> 00:33:54,783
that went on in the SPE are
so much like those pictures
695
00:33:54,783 --> 00:33:57,285
that we saw of the prisoners
in Abu Ghraib.
696
00:34:01,539 --> 00:34:04,960
[Le Texier] Zimbardo started
talking about sexual abuse
697
00:34:04,960 --> 00:34:06,836
in the Stanford
Prison Experiment
698
00:34:06,836 --> 00:34:09,047
right after
the Abu Ghraib scandal.
699
00:34:09,047 --> 00:34:11,549
[Zimbardo] Our guards were
forcing the prisoners to engage
700
00:34:11,549 --> 00:34:15,220
in simulated sodomy,
exactly as in, in this prison.
701
00:34:15,220 --> 00:34:18,348
[Le Texier] He's pretending
that guards made some prisoners
702
00:34:18,348 --> 00:34:22,018
play sexual games in the
Stanford Prison Experiment.
703
00:34:31,236 --> 00:34:33,446
[Le Texier] Zimbardo
exaggerated these games,
704
00:34:33,446 --> 00:34:35,073
because it was very useful
705
00:34:35,073 --> 00:34:37,284
to match the Stanford
Prison Experiment
706
00:34:37,284 --> 00:34:38,910
with the Abu Ghraib scandal.
707
00:34:38,910 --> 00:34:41,079
But we know that's
not what happened.
708
00:34:41,079 --> 00:34:44,916
[Ramsay] Abu Ghraib revived
interest in the experiment,
709
00:34:44,916 --> 00:34:48,003
and Zimbardo saw the opportunity
for a book.
710
00:34:48,003 --> 00:34:51,256
[Newscaster] Stanford University
psychologist Philip Zimbardo.
711
00:34:51,256 --> 00:34:53,508
[Goodman] It's called
The Lucifer Effect.
712
00:34:53,508 --> 00:34:55,093
[Stephen Colbert] The book is
The Lucifer Effect.
713
00:34:55,093 --> 00:34:56,970
[Stewart] Now, when you say
the slow descent of man
714
00:34:56,970 --> 00:34:59,347
from good to evil,
it took a week, did it not?
715
00:34:59,347 --> 00:35:02,058
[Zimbardo]
No. It actually took 36 hours.
716
00:35:02,058 --> 00:35:03,935
[laughter]
717
00:35:03,935 --> 00:35:08,982
[Ramsay] Without Dr. Zimbardo
continually pressing his case,
718
00:35:08,982 --> 00:35:14,029
the Stanford Prison Experiment
would have lapsed into the haze
719
00:35:14,029 --> 00:35:18,867
where most previous social
psychology experiments exist.
720
00:35:18,867 --> 00:35:23,204
It takes continuous pumping
to keep the bicycle going.
721
00:35:23,204 --> 00:35:24,748
[applause]
722
00:35:29,753 --> 00:35:32,714
[Le Texier] When I started doing
research on the experiment,
723
00:35:32,714 --> 00:35:36,634
I never thought
about debunking it.
724
00:35:36,634 --> 00:35:40,138
Zimbardo was a reputed
Stanford professor who did it,
725
00:35:40,138 --> 00:35:43,642
and nobody debunked
the experiment in 50 years,
726
00:35:43,642 --> 00:35:46,519
so I thought, "Okay, it's not
me, this French guy from nowhere
727
00:35:46,519 --> 00:35:49,147
who is going to debunk this."
728
00:35:49,147 --> 00:35:50,357
When I went through the archive,
729
00:35:50,357 --> 00:35:52,776
I had no idea that
the official narrative
730
00:35:52,776 --> 00:35:55,695
was not what really happened.
731
00:35:55,695 --> 00:35:58,865
But when you do science, you
have a commitment to the truth,
732
00:35:58,865 --> 00:36:03,161
and you cannot let something
false circulate like this.
733
00:36:03,161 --> 00:36:05,163
Maybe I'm a bit old-fashioned,
734
00:36:05,163 --> 00:36:09,209
but for me, it's a duty
to make things right.
735
00:36:09,209 --> 00:36:10,960
I decided to write a book
736
00:36:10,960 --> 00:36:14,672
and to get in touch
with the participants.
737
00:36:14,672 --> 00:36:16,299
[Eshleman]
I got a call from Thibault.
738
00:36:16,299 --> 00:36:17,300
[Shue] Thibault.
739
00:36:17,300 --> 00:36:18,301
[Korpi] Thibault.
740
00:36:18,301 --> 00:36:19,469
[Mark] Thibault.
741
00:36:19,469 --> 00:36:21,221
[Korpi] Oh. Thibault.
742
00:36:21,221 --> 00:36:22,764
Thibault, Thibault,
Thibault, Thibault.
743
00:36:22,764 --> 00:36:24,140
[Producer] Thibault.
[Korpi] Thibault.
744
00:36:24,140 --> 00:36:25,975
[Mark] He's very bold.
745
00:36:25,975 --> 00:36:31,815
He wrote a book that was
very damning in picking apart
746
00:36:31,815 --> 00:36:34,484
the experiment
and the methodology.
747
00:36:34,484 --> 00:36:38,238
And I think as a result
of his book,
748
00:36:38,238 --> 00:36:42,200
other researchers are looking
at the experiment
749
00:36:42,200 --> 00:36:46,538
with a more skeptical eye.
750
00:36:46,538 --> 00:36:48,123
[Producer] I noticed, John,
that sometimes
751
00:36:48,123 --> 00:36:50,291
you, you close your eyes
when you speak.
752
00:36:50,291 --> 00:36:54,087
Is that something
that you do often?
753
00:36:54,087 --> 00:36:58,258
[Mark] If it's normal
conversation, no.
754
00:36:58,258 --> 00:36:59,676
If it's thoughtful, yeah.
755
00:37:05,432 --> 00:37:07,350
[Reicher]
I'm a social psychologist
756
00:37:07,350 --> 00:37:09,018
at the University
of St. Andrews,
757
00:37:09,018 --> 00:37:12,313
and I have been studying
human social behavior,
758
00:37:12,313 --> 00:37:15,108
in particular group
and collective processes,
759
00:37:15,108 --> 00:37:19,487
for more years than I would
like to remember.
760
00:37:19,487 --> 00:37:23,867
Around 1999, the BBC came to us
and it said,
761
00:37:23,867 --> 00:37:26,453
"We now have the technology,
which we didn't have before,
762
00:37:26,453 --> 00:37:28,997
to observe behavior
systematically
763
00:37:28,997 --> 00:37:32,667
and want to use that to revisit
the Stanford Prison Study."
764
00:37:32,667 --> 00:37:34,544
So, we took the same
basic setup.
765
00:37:34,544 --> 00:37:37,422
[Man] 15 men randomly divided
into two groups.
766
00:37:37,422 --> 00:37:40,508
One group of men given power
over the other.
767
00:37:42,427 --> 00:37:43,595
[Reicher]
We thought that the guards
768
00:37:43,595 --> 00:37:45,388
would impose their authority.
769
00:37:45,388 --> 00:37:47,682
And we were rather surprised to
find that didn't happen at all.
770
00:37:47,682 --> 00:37:49,142
They hated being guards.
771
00:37:49,142 --> 00:37:52,479
[Guard] I don't want to be
a guard, to be honest, really.
772
00:37:52,479 --> 00:37:54,981
I'd rather be a prisoner,
really, in all honesty.
773
00:37:54,981 --> 00:37:58,109
[Reicher] We began to realize
that actually leadership
774
00:37:58,109 --> 00:38:00,403
was absolutely critical,
775
00:38:00,403 --> 00:38:03,698
because the more you look
at Zimbardo's study,
776
00:38:03,698 --> 00:38:04,866
you realize that the guards
777
00:38:04,866 --> 00:38:07,494
didn't just become guards
willy-nilly.
778
00:38:07,494 --> 00:38:11,247
He acted as leader
to tell them what to do.
779
00:38:11,247 --> 00:38:12,665
But without leadership,
780
00:38:12,665 --> 00:38:15,418
you don't get the types
of toxic behavior
781
00:38:15,418 --> 00:38:20,006
we saw in Zimbardo's SPE,
and we don't see in our study.
782
00:38:23,301 --> 00:38:27,972
♪ And how'd we let ourselves
get duped like this? ♪
783
00:38:27,972 --> 00:38:30,517
♪ No, no ♪
784
00:38:30,517 --> 00:38:33,770
♪ No, no ♪
785
00:38:33,770 --> 00:38:39,150
♪ Childish leaders
who cannot lead ♪
786
00:38:39,150 --> 00:38:44,364
♪ Wealthy plutocrats
consumed by greed ♪
787
00:38:44,364 --> 00:38:45,740
♪ Hate groups growing ♪
788
00:38:45,740 --> 00:38:46,950
When I went to college,
789
00:38:46,950 --> 00:38:49,369
I wanted to stay in
the arts and theater,
790
00:38:49,369 --> 00:38:51,246
and I ended up in a place down
in Southern California
791
00:38:51,246 --> 00:38:55,625
called Chapman University, had
a very good music department.
792
00:38:55,625 --> 00:38:58,461
♪ President ♪
793
00:38:58,461 --> 00:39:01,256
I love my music,
and I create things.
794
00:39:01,256 --> 00:39:04,467
I write plays, I direct them.
795
00:39:06,386 --> 00:39:10,723
I'm a very nonviolent guy.
796
00:39:10,723 --> 00:39:12,809
I can't even be mean to my cats.
797
00:39:12,809 --> 00:39:17,188
And I think that comes from,
a lot from my father,
798
00:39:17,188 --> 00:39:20,316
who grew up almost Amish.
799
00:39:20,316 --> 00:39:23,111
He's extremely gentle
800
00:39:23,111 --> 00:39:26,698
and would never even show anger,
801
00:39:26,698 --> 00:39:30,118
unless we really
pushed him to the brink.
802
00:39:30,118 --> 00:39:31,786
And I think that's
who I've become
803
00:39:31,786 --> 00:39:33,997
is kind of more like my father.
804
00:39:33,997 --> 00:39:36,833
I'm mellow.
805
00:39:36,833 --> 00:39:39,002
People really need
to learn the difference
806
00:39:39,002 --> 00:39:42,839
between a role somebody's
playing and who they are.
807
00:39:46,634 --> 00:39:50,305
Because of the experiment, I
still get hate mail from people.
808
00:39:50,305 --> 00:39:52,807
People that say,
"You sick [bleep].
809
00:39:52,807 --> 00:39:55,685
You sadistic bastard.
I hope you rot in hell."
810
00:39:55,685 --> 00:39:57,896
And you know,
somebody going as far
811
00:39:57,896 --> 00:40:02,609
as to go on to like my business
page and talk about how,
812
00:40:02,609 --> 00:40:05,695
you know, they went to get
a loan from me in my business,
813
00:40:05,695 --> 00:40:07,780
and I ended up, you know,
torturing them
814
00:40:07,780 --> 00:40:09,407
or something like that.
815
00:40:09,407 --> 00:40:11,576
You know what I mean? Come on.
This is my business.
816
00:40:11,576 --> 00:40:13,202
[Reporter] Dave, you've probably
had one of the most
817
00:40:13,202 --> 00:40:14,329
notorious reputations.
818
00:40:14,329 --> 00:40:15,788
They called you John Wayne
819
00:40:15,788 --> 00:40:18,249
because they said you were
brutal and you were sadistic.
820
00:40:18,249 --> 00:40:19,459
[Eshleman] Mm-hmm.
821
00:40:19,459 --> 00:40:21,544
It's been fascinating to see
how the interest
822
00:40:21,544 --> 00:40:24,339
never seems to go away.
823
00:40:24,339 --> 00:40:26,049
I mean, here we are,
it's been 50 years,
824
00:40:26,049 --> 00:40:28,217
and, you know, I'm still talking
about it.
825
00:40:28,217 --> 00:40:30,219
I get asked to come and give
a lot of talks,
826
00:40:30,219 --> 00:40:31,763
do a lot of interviews.
827
00:40:31,763 --> 00:40:36,100
What most people are surprised
at is the level of humiliation
828
00:40:36,100 --> 00:40:38,186
that we tried to put
the prisoners through.
829
00:40:38,186 --> 00:40:43,232
I started inventing
my own daily humiliations.
830
00:40:43,232 --> 00:40:46,319
When you give somebody
the absolute power
831
00:40:46,319 --> 00:40:48,571
to control what
other people are doing,
832
00:40:48,571 --> 00:40:50,531
then they're gonna
use that power.
833
00:40:50,531 --> 00:40:52,200
I go talk to
high school classes.
834
00:40:52,200 --> 00:40:53,993
The teacher will announce,
"Oh, you know,
835
00:40:53,993 --> 00:40:55,662
we've been talking about
the prison experiment,
836
00:40:55,662 --> 00:40:57,956
and I have one of the guards,
837
00:40:57,956 --> 00:40:59,499
you know, coming in
to talk to us.
838
00:40:59,499 --> 00:41:01,876
The guy that they would call
John Wayne."
839
00:41:01,876 --> 00:41:04,545
And the students go,
"He's coming here?
840
00:41:04,545 --> 00:41:05,797
Oh, my god. What do we do?"
841
00:41:05,797 --> 00:41:09,717
You know, it's like,
"Is he gonna torture us?"
842
00:41:09,717 --> 00:41:11,302
So, I think they're always
surprised to find
843
00:41:11,302 --> 00:41:13,638
that, you know,
I'm a pretty easygoing guy.
844
00:41:13,638 --> 00:41:17,517
It bothers me that they don't
take the time to look beyond,
845
00:41:17,517 --> 00:41:20,186
you know, the very surface.
846
00:41:20,186 --> 00:41:21,688
And that's one of the problems,
of course,
847
00:41:21,688 --> 00:41:24,982
I think with the way the
experiment has been perceived
848
00:41:24,982 --> 00:41:29,195
and has been disseminated
in popular culture by Zimbardo.
849
00:41:29,195 --> 00:41:30,530
[Phil Donahue]
Dr. Philip Zimbardo
850
00:41:30,530 --> 00:41:32,740
is professor of psychology
at Stanford University,
851
00:41:32,740 --> 00:41:35,743
and several years ago engaged
in a fascinating experiment
852
00:41:35,743 --> 00:41:39,330
wherein he chose
middle-class students,
853
00:41:39,330 --> 00:41:41,416
put them all in a group.
How many?
854
00:41:41,416 --> 00:41:43,501
[Zimbardo] Well, there were
a dozen guards and 10 prisoners.
855
00:41:43,501 --> 00:41:44,961
[Donahue] Okay.
856
00:41:44,961 --> 00:41:48,089
[Korpi] 50 years later, the
experiment should be irrelevant,
857
00:41:48,089 --> 00:41:49,966
but I'm still pissed at Phil.
858
00:41:49,966 --> 00:41:51,467
[Zimbardo] What happened
that's not in the film
859
00:41:51,467 --> 00:41:54,762
is that within 36 hours,
the first prisoner had
860
00:41:54,762 --> 00:41:56,806
what could be called
an emotional breakdown.
861
00:41:56,806 --> 00:41:59,600
[Korpi] It wasn't the experiment
that pissed me off.
862
00:41:59,600 --> 00:42:01,185
It was a stupid experiment.
863
00:42:01,185 --> 00:42:04,981
There was a bunch of
obvious manipulation of us.
864
00:42:04,981 --> 00:42:08,109
But then he kept calling
after that
865
00:42:08,109 --> 00:42:10,111
to get me on all these TV shows.
866
00:42:10,111 --> 00:42:12,905
They don't take your bed
and your clothes in prison!
867
00:42:12,905 --> 00:42:14,490
It was so many things.
868
00:42:14,490 --> 00:42:15,742
[Donahue] You were a prisoner?
869
00:42:15,742 --> 00:42:16,784
Okay.
870
00:42:16,784 --> 00:42:18,494
That was you just yelling?
871
00:42:18,494 --> 00:42:21,164
[Korpi] I mean, year after year,
he's retraumatizing you.
872
00:42:21,164 --> 00:42:24,292
I didn't like that I had to work
so hard to get him to back off.
873
00:42:24,292 --> 00:42:26,002
[Woman] I was wondering how
the prisoners' attitudes
874
00:42:26,002 --> 00:42:28,921
towards each other changed
during this experiment.
875
00:42:28,921 --> 00:42:30,214
[Korpi] I was only
in the experiment
876
00:42:30,214 --> 00:42:31,883
a day and a half or two days.
877
00:42:31,883 --> 00:42:33,134
I got out quick.
878
00:42:33,134 --> 00:42:34,343
I don't want to talk
to that guy.
879
00:42:34,343 --> 00:42:36,804
I don't want to deal with him,
and I had to.
880
00:42:36,804 --> 00:42:38,556
He would come at me
this way and that way.
881
00:42:38,556 --> 00:42:40,767
And he was articulate
and he's well-published
882
00:42:40,767 --> 00:42:42,435
and he's gonna give me work.
883
00:42:42,435 --> 00:42:45,605
[Zimbardo] I think that we are
vulnerable to the conman.
884
00:42:45,605 --> 00:42:47,190
We are vulnerable to influence.
885
00:42:47,190 --> 00:42:49,984
[Korpi] The control of
the narrative has been Phil's.
886
00:42:49,984 --> 00:42:51,068
He can't help that.
887
00:42:51,068 --> 00:42:53,404
He's just a narcissistic
[bleep].
888
00:42:53,404 --> 00:42:55,364
That's Phil.
889
00:42:55,364 --> 00:42:58,076
[Mark] Professor Zimbardo
was kind of a clown.
890
00:42:58,076 --> 00:43:01,579
[Reicher] I think the
traditional narrative of the SPE
891
00:43:01,579 --> 00:43:02,914
needs to be challenged.
892
00:43:02,914 --> 00:43:04,874
[Korpi] Anybody that tried
to reduce human behavior
893
00:43:04,874 --> 00:43:06,834
to such simple notions is a fraud.
894
00:43:06,834 --> 00:43:07,835
[Reicher] Fraud.
[Eshleman] Fraud.
895
00:43:07,835 --> 00:43:09,045
[Shue] Fraud.
896
00:43:09,045 --> 00:43:10,296
[Loftus] Kind of fraudulent.
897
00:43:10,296 --> 00:43:11,631
[Scott-Bottoms]
And that's what Zimbardo did.
898
00:43:16,594 --> 00:43:19,639
[Zimbardo] So, I'm looking
at the camera?
899
00:43:19,639 --> 00:43:21,849
In the last year,
the Stanford Prison study
900
00:43:21,849 --> 00:43:24,685
has received lots of criticism.
901
00:43:24,685 --> 00:43:27,271
"The Stanford Prison Experiment
is a lie."
902
00:43:27,271 --> 00:43:29,232
"Zimbardo is a fraud."
903
00:43:29,232 --> 00:43:31,818
And none of their
criticisms hold up.
904
00:43:31,818 --> 00:43:33,569
Zero.
905
00:43:33,569 --> 00:43:37,240
-♪ See no evil ♪
-♪ See no evil ♪
906
00:43:37,240 --> 00:43:41,077
-♪ Hear no evil ♪
-♪ Hear no evil ♪
907
00:43:41,077 --> 00:43:44,831
-♪ Speak no evil ♪
-♪ Speak no evil ♪
908
00:43:44,831 --> 00:43:45,832
[Burton] I am Chuck.
909
00:43:45,832 --> 00:43:47,083
[Korpi] I'm Doug.
[Burton] Doug.
910
00:43:47,083 --> 00:43:48,584
[Eshleman] Gentlemen,
good morning.
911
00:43:48,584 --> 00:43:50,378
[Shue] Morning.
[Eshleman] I'm Dave.
912
00:43:50,378 --> 00:43:51,462
[Burton] There's the man.
913
00:43:51,462 --> 00:43:52,463
[Eshleman] Do you?
[Korpi] And you are?
914
00:43:52,463 --> 00:43:55,383
[Shue] I'm Jerry Shue. 5486.
915
00:43:55,383 --> 00:43:56,425
[Burton] Okay.
[Ramsay] Hi, Dave.
916
00:43:56,425 --> 00:43:57,468
[Eshleman] Dave.
How are you, Clay?
917
00:43:57,468 --> 00:43:58,469
[Ramsay] Hi. I'm Clay.
918
00:43:58,469 --> 00:44:00,930
[Van Orsdol] Clay. Karl.
919
00:44:00,930 --> 00:44:03,266
♪ Not today ♪
920
00:44:03,266 --> 00:44:04,559
[Korpi] Isn't this incredible?