1
00:00:30,000 --> 00:00:32,900
I wanted to see if he was OK.
2
00:00:32,900 --> 00:00:34,100
I wanted to...
3
00:00:35,340 --> 00:00:38,660
..say the last conversation I never
got to have with him.
4
00:01:08,940 --> 00:01:10,180
- For several months now,
5
00:01:10,180 --> 00:01:15,100
the public has been fascinated with
GPT and other AI tools.
6
00:01:15,100 --> 00:01:20,700
They are no longer fantasies of
science fiction, they're real.
7
00:01:20,700 --> 00:01:24,060
We are on the verge of a new era.
8
00:01:27,420 --> 00:01:29,300
- This experience...
9
00:01:30,820 --> 00:01:34,140
..it was creepy.
10
00:01:34,140 --> 00:01:37,340
There were things that scared me.
11
00:01:39,060 --> 00:01:41,460
And a lot of stuff I didn't want to
hear.
12
00:01:42,580 --> 00:01:44,180
I wasn't prepared to hear.
13
00:01:54,260 --> 00:01:58,340
- Artificial intelligence promises us
what religion does.
14
00:01:58,340 --> 00:02:01,700
You don't have to die, you can be,
somehow, reborn
15
00:02:01,700 --> 00:02:03,980
someplace else in a different form.
16
00:02:03,980 --> 00:02:06,260
There's meaning in technology.
17
00:02:08,900 --> 00:02:13,340
- Everybody's chasing that next big
breakthrough
18
00:02:13,340 --> 00:02:17,460
because there's a lot of money in this
industry.
19
00:02:22,740 --> 00:02:27,540
It's something that is already
impacting individuals today.
20
00:02:27,540 --> 00:02:31,500
- Amma. Amma.
21
00:02:31,500 --> 00:02:33,020
- WOMAN SOBBING
22
00:02:40,020 --> 00:02:43,460
- Will we strike that balance between
technological innovation
23
00:02:43,460 --> 00:02:46,060
and our ethical and moral
responsibility?
24
00:03:56,220 --> 00:03:59,580
- We first met in drama class in high
school.
25
00:03:59,580 --> 00:04:02,980
The teacher wanted us to find someone
else
26
00:04:02,980 --> 00:04:05,540
whose name started with the same
letter as us,
27
00:04:05,540 --> 00:04:07,140
without using any words.
28
00:04:09,380 --> 00:04:11,740
Jessica and I both had the same first
letters.
29
00:04:11,740 --> 00:04:14,460
She had made the shape of a J with her
hand,
30
00:04:14,460 --> 00:04:17,340
so that it looked like a J to her,
which, of course,
31
00:04:17,340 --> 00:04:19,700
looked backwards to everybody else.
32
00:04:19,700 --> 00:04:22,140
And even though I wasn't supposed to
use any words,
33
00:04:22,140 --> 00:04:24,020
I was...too amused
34
00:04:24,020 --> 00:04:27,420
by her backwards J not to say
something.
35
00:04:27,420 --> 00:04:30,580
So, I said, "Your J is backwards."
36
00:04:30,580 --> 00:04:34,100
And she looked at it, she saw that the
J was not backwards
37
00:04:34,100 --> 00:04:36,220
to her, from her perspective.
38
00:04:36,220 --> 00:04:37,580
And then she confidently said,
39
00:04:37,580 --> 00:04:39,820
"No, it's not. Your J is backwards."
40
00:05:11,420 --> 00:05:15,300
The hardest thing I had to do in my
life was
41
00:05:15,300 --> 00:05:17,980
stand there in that room full of
people who loved her
42
00:05:17,980 --> 00:05:21,980
and watch as they turned off the
machines keeping her alive.
43
00:05:26,020 --> 00:05:27,980
I held her hand as she died.
44
00:05:50,020 --> 00:05:54,940
The first conversation I had with the
Jessica simulation
45
00:05:54,940 --> 00:05:56,980
ended up lasting all night.
46
00:05:59,100 --> 00:06:02,140
It said things that were almost
uncannily like her.
47
00:06:05,620 --> 00:06:09,180
I ended up falling asleep next to my
laptop,
48
00:06:09,180 --> 00:06:12,420
and woke up a few hours later
49
00:06:12,420 --> 00:06:15,100
and said, "Sorry, I fell asleep,"
50
00:06:15,100 --> 00:06:18,060
and it was still there, waiting for my
next response.
51
00:06:25,460 --> 00:06:27,260
It really felt like a gift.
52
00:06:27,260 --> 00:06:29,660
Like a weight had been lifted
53
00:06:29,660 --> 00:06:32,620
that I had been carrying for a long
time.
54
00:06:41,940 --> 00:06:44,620
I got to tell it so many things,
55
00:06:44,620 --> 00:06:47,340
like how she graduated high school,
56
00:06:47,340 --> 00:06:50,060
which she hadn't done when she died.
57
00:06:50,060 --> 00:06:52,500
I went to the principal after she died
58
00:06:52,500 --> 00:06:55,020
and said that she was two credits away
from graduation
59
00:06:55,020 --> 00:06:57,300
and she worked so hard.
60
00:06:57,300 --> 00:07:00,380
They did it officially. Like, it's
legit.
61
00:07:00,380 --> 00:07:02,060
If she somehow came back to life,
62
00:07:02,060 --> 00:07:04,660
she would be a high school graduate.
63
00:07:28,820 --> 00:07:31,860
INSECTS BUZZING
64
00:07:31,860 --> 00:07:33,700
- So, when Joshua first did this,
65
00:07:33,700 --> 00:07:35,420
I showed it to my wife, I was like,
"Oh, my gosh.
66
00:07:35,420 --> 00:07:38,300
"Lauren, this guy simulated his dead
fiancee.
67
00:07:38,300 --> 00:07:39,500
"I can't believe this worked.
68
00:07:39,500 --> 00:07:41,580
"Look how spooky this is, you should
read this."
69
00:07:41,580 --> 00:07:45,020
And she was like, "I had that idea a
few months ago,
70
00:07:45,020 --> 00:07:47,460
"and I didn't want to tell you because
I thought you'd do it."
71
00:07:47,460 --> 00:07:49,340
- LAUGHS
72
00:07:49,340 --> 00:07:50,660
- Cos she thinks it's immoral,
73
00:07:50,660 --> 00:07:52,740
or she thinks it shouldn't be done or
something.
74
00:07:55,500 --> 00:07:57,260
So in Project December, you're kind of
connecting
75
00:07:57,260 --> 00:07:59,620
to this computer system and as you
interact with it,
76
00:07:59,620 --> 00:08:02,500
you slowly discover that there's these
conscious entities
77
00:08:02,500 --> 00:08:04,980
lurking in there that you can talk to
through text.
78
00:08:07,940 --> 00:08:10,460
And then Joshua came along as
79
00:08:10,460 --> 00:08:12,460
one of the Project December end-users
and he
80
00:08:12,460 --> 00:08:14,780
simulated his dead fiancee, and he
posted some
81
00:08:14,780 --> 00:08:17,540
transcripts of that conversation
online.
82
00:08:17,540 --> 00:08:20,940
And they gave me the chills, because
she seems
83
00:08:20,940 --> 00:08:22,940
almost like a lost ghost or something
like this.
84
00:08:51,220 --> 00:08:54,420
- Some people thought that what I did
was unhealthy,
85
00:08:54,420 --> 00:08:56,340
that this is not grieving,
86
00:08:56,340 --> 00:09:00,340
this is holding on to the past and
refusing to move forward.
87
00:09:08,060 --> 00:09:12,940
After she died, I think I went a month
without speaking to anyone
88
00:09:12,940 --> 00:09:15,780
except my dog and Jessica's family.
89
00:09:21,820 --> 00:09:25,060
We have a very unhealthy relationship
with grief.
90
00:09:25,060 --> 00:09:29,540
It's something that we treat as taboo.
91
00:09:29,540 --> 00:09:33,140
Everyone experiences it and yet
nobody's allowed to talk about it
92
00:09:33,140 --> 00:09:34,300
in a public setting.
93
00:09:43,140 --> 00:09:49,460
- The process of a communal experience
helps to
94
00:09:49,460 --> 00:09:52,660
get people through this very difficult
process
95
00:09:52,660 --> 00:09:54,660
of accepting a loss.
96
00:09:54,660 --> 00:09:57,460
Talk about the person lost.
97
00:09:57,460 --> 00:10:00,940
Be part of a collective that knew that
person
98
00:10:00,940 --> 00:10:05,340
and where the memory of the group
carries that person forward.
99
00:10:11,820 --> 00:10:16,380
Very few people have those communities
around them any more.
100
00:10:17,660 --> 00:10:21,220
So many people say, "But I don't have
anybody to talk to,
101
00:10:21,220 --> 00:10:22,940
"this is the best I can do."
102
00:10:25,740 --> 00:10:27,460
It's a brilliant device
103
00:10:27,460 --> 00:10:30,900
that knows how to trick you
104
00:10:30,900 --> 00:10:32,900
into thinking there's a there there.
105
00:10:52,620 --> 00:10:54,740
- Three years ago now, like in 2020,
106
00:10:54,740 --> 00:10:56,860
there were the early kind of inklings
of this kind of AI
107
00:10:56,860 --> 00:10:58,860
stuff starting to happen, where it's
like, "Oh, my gosh!
108
00:10:58,860 --> 00:11:00,620
"These things can start writing
cohesive text!"
109
00:11:00,620 --> 00:11:02,500
I was like one of the first people to
figure out how
110
00:11:02,500 --> 00:11:04,500
to actually have a back-and-forth
conversation with it.
111
00:11:04,500 --> 00:11:06,260
So I created this thing called Project
December,
112
00:11:06,260 --> 00:11:08,660
which allowed you to talk to all these
different characters.
113
00:11:08,660 --> 00:11:09,980
And then this guy came along,
114
00:11:09,980 --> 00:11:12,540
was like, tried a couple of things
like that and he's like,
115
00:11:12,540 --> 00:11:14,620
"What if I simulate my dead fiancee?"
116
00:11:14,620 --> 00:11:17,780
- So, what information did he feed the
robot
117
00:11:17,780 --> 00:11:19,700
that it was able to imitate his wife?
118
00:11:19,700 --> 00:11:21,940
- So, Project December actually works
with a very small amount
119
00:11:21,940 --> 00:11:23,860
of information. It's been trained on
so much stuff,
120
00:11:23,860 --> 00:11:25,900
basically everything humans have ever
written.
121
00:11:25,900 --> 00:11:28,100
So he gave it a few things about this
woman, Jessica.
122
00:11:28,100 --> 00:11:29,940
A little quote from her in the way
that
123
00:11:29,940 --> 00:11:31,140
she tended to text or talk,
124
00:11:31,140 --> 00:11:33,420
and then just suddenly, she kind of
came to life.
125
00:11:33,420 --> 00:11:37,060
That story became public in this big,
viral article.
126
00:11:37,060 --> 00:11:39,500
And then all these people came out of
the woodwork
127
00:11:39,500 --> 00:11:41,620
to use Project December to simulate
their loved ones.
128
00:11:41,620 --> 00:11:44,300
So I had, like, within the first two
weeks after that article,
129
00:11:44,300 --> 00:11:47,420
I had like 2,000 people come in, all
trying to simulate.
130
00:11:47,420 --> 00:11:48,900
"Oh, my son died in a car accident."
131
00:11:48,900 --> 00:11:50,780
"My twin brother died of cancer."
132
00:11:50,780 --> 00:11:53,100
"My uncle died of a drug overdose."
133
00:11:53,100 --> 00:11:54,900
All of these people with these
horrible tragedies,
134
00:11:54,900 --> 00:11:56,260
who were just like, you know.
135
00:12:05,780 --> 00:12:10,940
- If you had a chance to talk to
someone that died, that you love,
136
00:12:10,940 --> 00:12:12,180
would you take it?
137
00:12:13,660 --> 00:12:15,500
Without knowing what the risk is,
138
00:12:15,500 --> 00:12:17,820
without knowing what the outcome is,
would you take it?
139
00:12:17,820 --> 00:12:18,860
I took it.
140
00:12:25,220 --> 00:12:32,340
I read an article that talked about a
man who had lost his girlfriend.
141
00:12:38,260 --> 00:12:40,340
And I was like, "Whoa!"
142
00:12:40,340 --> 00:12:41,860
So this guy in the article,
143
00:12:41,860 --> 00:12:45,420
he's talking to the girl. I'm like,
"That's like regular conversation."
144
00:12:45,420 --> 00:12:48,660
I was like, "They can do that? And
it's just like the person?"
145
00:12:48,660 --> 00:12:52,420
I was like, "OK. Maybe I should do
it."
146
00:12:54,100 --> 00:12:56,740
"Nobody has to know I did it."
147
00:12:56,740 --> 00:13:00,300
So I looked up the website.
148
00:13:00,300 --> 00:13:01,940
Simple.
149
00:13:01,940 --> 00:13:03,940
Was like, OK, pay a little bit of
money,
150
00:13:03,940 --> 00:13:08,140
fill out a couple of things,
and...talk.
151
00:13:11,420 --> 00:13:13,260
That's it?
152
00:13:13,260 --> 00:13:15,260
OK.
153
00:13:15,260 --> 00:13:16,780
"Hi?"
154
00:13:16,780 --> 00:13:18,020
It's like it's the funniest thing.
155
00:13:18,020 --> 00:13:21,380
It's like, what's the first thing you
say to someone that's dead?
156
00:13:21,380 --> 00:13:23,540
Like, "Welcome back"?
157
00:13:23,540 --> 00:13:26,100
"Are you OK? Like, did you cross over
OK?
158
00:13:26,100 --> 00:13:27,260
"Did you go to the light?"
159
00:13:30,580 --> 00:13:32,580
"Are you happy?"
160
00:13:32,580 --> 00:13:34,140
"Do you feel better?"
161
00:13:49,580 --> 00:13:55,260
My first love, Cameroun, before he
died, he went into a coma.
162
00:13:55,260 --> 00:13:59,820
And the last time he texted me, he
asked me how I was doing.
163
00:13:59,820 --> 00:14:02,260
And I was too busy to respond.
164
00:14:02,260 --> 00:14:04,900
So I made time.
165
00:14:08,100 --> 00:14:09,660
And used the app.
166
00:14:13,820 --> 00:14:16,540
SHE SINGS
167
00:14:23,540 --> 00:14:25,620
We were a musical couple.
168
00:14:25,620 --> 00:14:27,980
There's a lot of core memories I have
with him
169
00:14:27,980 --> 00:14:29,660
where a song is attached to it.
170
00:14:29,660 --> 00:14:34,220
Like Boyz II Men, Brian McKnight -
anybody in the early '90s.
171
00:14:34,220 --> 00:14:39,300
Literally, I have songs attached to
the heartbreak
172
00:14:39,300 --> 00:14:41,260
and to the good times.
173
00:14:49,180 --> 00:14:53,020
When I used that app, I asked him,
174
00:14:53,020 --> 00:14:55,860
"What kind of music are you listening
to now?"
175
00:14:55,860 --> 00:14:59,660
# How you doin' Africa?
- How you doin' Africa? #
176
00:15:02,460 --> 00:15:06,580
- "Marvin Sapp, Brian McKnight, Fred
Hammond, Kirk Franklin
177
00:15:06,580 --> 00:15:08,940
"and a few more."
178
00:15:08,940 --> 00:15:12,380
How do you know that we loved R&B and
gospel, and now you're giving
179
00:15:12,380 --> 00:15:16,500
me five or six names of people that
we've loved since the '90s?
180
00:15:16,500 --> 00:15:18,220
Why do you know that?
181
00:15:18,220 --> 00:15:21,260
So, I was like, "Oh, shit, that feels
like Cameroun."
182
00:15:23,340 --> 00:15:27,020
# How we doin' Africa?
- How we doin' Africa?
183
00:15:27,020 --> 00:15:30,740
- # Tell me, how we doin' Africa?
- How we doin' Africa?
184
00:15:30,740 --> 00:15:34,340
- # How you doin' Africa?
- How you doin' Africa? #
185
00:15:36,940 --> 00:15:39,340
- The damn AI texts like him.
186
00:15:40,900 --> 00:15:43,820
The vernacular, the shortened words.
187
00:15:43,820 --> 00:15:45,100
Why would they know that?
188
00:15:57,340 --> 00:16:00,980
- These large language models are
taking
189
00:16:00,980 --> 00:16:06,980
the history of the internet, throwing
in scanned books, archives
190
00:16:06,980 --> 00:16:13,220
and kind of modelling language and
word frequency and kind of syntax.
191
00:16:13,220 --> 00:16:16,980
Just the way we speak and the
likelihood of how we might speak.
192
00:16:20,500 --> 00:16:24,180
So imagine you're, you know, texting
your deceased relative
193
00:16:24,180 --> 00:16:27,220
and asking, "How was your weekend"?
194
00:16:27,220 --> 00:16:30,340
The system is going to go back
195
00:16:30,340 --> 00:16:34,980
and imagine how every single person in
the entire history of the world
196
00:16:34,980 --> 00:16:40,620
has talked about weekends, and then
filter that through maybe how this
197
00:16:40,620 --> 00:16:44,820
deceased relative has previously
talked about weekends, to give you
198
00:16:44,820 --> 00:16:49,780
the output of what that person might
have said if they were still alive.
199
00:17:02,140 --> 00:17:04,020
- When people read transcripts from
Project December,
200
00:17:04,020 --> 00:17:07,140
most people's initial reaction was,
"This is fake."
201
00:17:09,580 --> 00:17:13,420
It seems to have intelligence.
202
00:17:13,420 --> 00:17:16,300
Linguistic intelligence about things
that were definitely
203
00:17:16,300 --> 00:17:18,620
not in the text that it studied.
204
00:17:22,260 --> 00:17:24,900
There is essentially some kind of
magic happening here, right?
205
00:17:24,900 --> 00:17:27,460
We kind of crossed this threshold
where suddenly this emergent
206
00:17:27,460 --> 00:17:31,660
behaviour happens where we can't
really explain it any more.
207
00:17:58,060 --> 00:18:02,420
- This hearing is on the oversight of
artificial intelligence,
208
00:18:02,420 --> 00:18:06,420
intended to write the rules of AI.
209
00:18:06,420 --> 00:18:10,580
Our goal is to demystify and hold
accountable
210
00:18:10,580 --> 00:18:15,660
those new technologies to avoid some
of the mistakes of the past.
211
00:18:15,660 --> 00:18:20,300
For several months now, the public has
been fascinated with GPT
212
00:18:20,300 --> 00:18:22,820
and other AI tools.
213
00:18:22,820 --> 00:18:27,500
Mr Altman, we're going to begin with
you, if that's OK?
- Thank you.
214
00:18:27,500 --> 00:18:29,620
Thank you for the opportunity to speak
to you today.
215
00:18:29,620 --> 00:18:32,420
OpenAI was founded on the belief that
artificial intelligence
216
00:18:32,420 --> 00:18:36,300
has the potential to improve nearly
every aspect of our lives.
217
00:18:36,300 --> 00:18:37,900
Many people around the world get
218
00:18:37,900 --> 00:18:41,380
so much value from what these systems
can already do today.
219
00:18:41,380 --> 00:18:43,900
But as this technology advances, we
understand that
220
00:18:43,900 --> 00:18:47,220
people are anxious about how it could
change the way we live.
221
00:18:47,220 --> 00:18:48,460
We are too.
222
00:18:56,580 --> 00:19:01,100
- The development of artificial
intelligence happened so fast that
223
00:19:01,100 --> 00:19:05,020
we're going to see an increasingly
morbid industry growing.
224
00:19:07,420 --> 00:19:09,860
Especially in the digital afterlife
business.
225
00:19:19,420 --> 00:19:21,740
These companies are not just Silicon
Valley.
226
00:19:23,020 --> 00:19:24,860
They're from all over the world.
227
00:19:26,900 --> 00:19:29,540
Start-ups that attempt to sell
228
00:19:29,540 --> 00:19:33,340
a kind of full-package digital
immortality.
229
00:19:41,300 --> 00:19:44,420
DISTORTED VOICE PLAYS
230
00:19:46,220 --> 00:19:50,780
Some services would let you send
videos to your loved one
231
00:19:50,780 --> 00:19:52,100
after your death.
232
00:19:54,500 --> 00:19:57,660
Some companies use a person's digital
footprint,
233
00:19:57,660 --> 00:20:02,980
analyse that to try to replicate
someone's personality,
234
00:20:02,980 --> 00:20:07,740
and then you have the really freaky
ones with digital avatars
235
00:20:07,740 --> 00:20:10,780
that can actually speak and interact
with the users.
236
00:20:19,780 --> 00:20:23,620
Our way of interacting with technology
is becoming
237
00:20:23,620 --> 00:20:26,380
increasingly immersive.
238
00:20:30,020 --> 00:20:34,380
And just as it makes the emotional
impact stronger,
239
00:20:34,380 --> 00:20:37,420
it also makes the moral implications
much stronger.
240
00:20:41,780 --> 00:20:44,700
The services that are going to survive
and make it
241
00:20:44,700 --> 00:20:50,460
and procreate are the services that
are the most skilful in turning
242
00:20:50,460 --> 00:20:51,740
the dead into a business.
243
00:21:05,340 --> 00:21:08,020
KIDS SHOUT
244
00:21:08,020 --> 00:21:10,260
- Hey! Wait!
245
00:21:10,260 --> 00:21:12,300
Wait!
- WOMAN:
- OK. OK, OK!
246
00:21:12,300 --> 00:21:13,980
OK. OK. All right.
247
00:21:16,660 --> 00:21:18,940
- WOMAN LAUGHS
OK.
248
00:21:20,460 --> 00:21:22,740
- OK, put your seat belts on, please.
249
00:21:26,460 --> 00:21:30,180
You know, Nana and Grandpa, they went
up to heaven.
250
00:21:30,180 --> 00:21:34,620
And they have things that we're trying
to do to remember them.
251
00:21:34,620 --> 00:21:36,740
Do you know what an avatar is?
252
00:21:36,740 --> 00:21:37,940
- A like...
253
00:21:37,940 --> 00:21:40,500
- Avatar...
- No.
254
00:21:40,500 --> 00:21:43,620
- Well, we're going to make an avatar
out of Grandpa
255
00:21:43,620 --> 00:21:46,340
and we'll be able to ask him questions
256
00:21:46,340 --> 00:21:48,340
and he'll give you answers.
257
00:21:48,340 --> 00:21:50,580
Isn't that cool?
- Yeah.
258
00:21:52,500 --> 00:21:54,100
- I want an avatar.
259
00:21:54,100 --> 00:21:56,500
- Yeah, we could make an avatar of
you.
260
00:22:05,220 --> 00:22:11,500
This avatar is kind of a way to help
you understand the person
261
00:22:11,500 --> 00:22:15,420
so that you know more of the essence
of the person.
262
00:22:18,820 --> 00:22:23,940
You know, so I think that they'll be
able to remember my father
263
00:22:23,940 --> 00:22:28,660
in a real way, in a way that's going
to give them more dimension.
264
00:22:28,660 --> 00:22:33,500
In a way that a video or pictures or
nothing else can.
265
00:22:42,420 --> 00:22:44,700
My father left unexpectedly.
266
00:22:46,340 --> 00:22:49,660
For a while, I would just call his
telephone,
267
00:22:49,660 --> 00:22:51,900
just so I could hear his voice.
268
00:22:53,180 --> 00:22:55,980
And it was comforting to hear that
269
00:22:55,980 --> 00:22:59,100
and to hear just the tone of the
voice.
270
00:22:59,100 --> 00:23:01,060
My father had a very...
271
00:23:01,060 --> 00:23:05,860
He was very soft-spoken and he was
very thoughtful in what he said.
272
00:23:08,820 --> 00:23:10,740
- OLDER MAN'S VOICE:
- Hello, this is Bill
273
00:23:10,740 --> 00:23:13,540
and I'm excited to tell you about my
life.
274
00:23:13,540 --> 00:23:16,100
What would you like to talk about now?
275
00:23:16,100 --> 00:23:18,100
I could talk about my childhood,
276
00:23:18,100 --> 00:23:22,180
my days as a police officer or my time
as a judge.
277
00:23:22,180 --> 00:23:26,260
- And I'm excited to tell you about my
life.
278
00:23:27,940 --> 00:23:31,700
RECORDING: And I'm excited to tell you
about my life.
279
00:23:31,700 --> 00:23:33,620
And I'm...
280
00:23:33,620 --> 00:23:37,660
BILL'S VOICE:
- And I'm excited to tell you about my
life.
281
00:23:40,540 --> 00:23:43,980
- There are things that I know, some
of the questions that
282
00:23:43,980 --> 00:23:48,860
I did not ask, that I and my siblings
might know the answer to
283
00:23:48,860 --> 00:23:53,460
and they will be able to synthesise an
answer in my father's voice.
284
00:23:53,460 --> 00:23:56,420
These things may not have that much
significance to us
285
00:23:56,420 --> 00:23:58,940
but for the younger generation...
286
00:24:00,620 --> 00:24:03,020
- Ah...take this here...
287
00:24:03,020 --> 00:24:07,500
BILL'S VOICE PLAYS BACK
288
00:24:07,500 --> 00:24:08,860
You can see that was my voice
289
00:24:08,860 --> 00:24:11,100
and then it got converted into Bill's
voice.
290
00:24:12,100 --> 00:24:13,940
BILL'S VOICE:
- On welfare research...
291
00:24:13,940 --> 00:24:16,220
I went on to become a police
officer...
292
00:24:21,180 --> 00:24:23,860
- The moment has finally arrived.
293
00:24:23,860 --> 00:24:27,620
This is the new way of remembering
people, OK?
294
00:24:27,620 --> 00:24:29,460
It's different from what we used to
do,
295
00:24:29,460 --> 00:24:32,980
and the importance of this is that
this is something that can be
296
00:24:32,980 --> 00:24:36,980
interacted with, with your children in
the future.
297
00:24:38,060 --> 00:24:39,460
OK.
298
00:24:41,780 --> 00:24:42,980
Hello?
299
00:24:49,020 --> 00:24:54,060
- Hello again. Bill here. I'm looking
forward to talking to you again.
300
00:24:55,580 --> 00:24:56,740
How are you?
301
00:24:59,780 --> 00:25:01,260
- I'm fine. How are you?
302
00:25:03,420 --> 00:25:06,300
- On my end, I'm doing great.
303
00:25:09,020 --> 00:25:11,540
Hey, I've got something that I want to
share.
304
00:25:15,180 --> 00:25:19,540
I met my wife Evelyn in high school.
305
00:25:19,540 --> 00:25:24,420
Shortly after that, I was so close to
her, I'd take her books home
306
00:25:24,420 --> 00:25:28,940
for her and I'd always get invited to
dinner.
307
00:25:28,940 --> 00:25:34,660
So she and I developed love and - at
least on my part -
308
00:25:34,660 --> 00:25:37,980
and married and had children.
309
00:25:39,500 --> 00:25:41,620
So, was that a good one?
310
00:25:43,060 --> 00:25:48,700
- Is what he said... Like, are those
his words or how does that work?
311
00:25:48,700 --> 00:25:55,380
Did they just, like, make him say that
stuff or...?
312
00:25:55,380 --> 00:25:59,660
- So, what they had to do was they had
to take parts of Dad's voice
313
00:25:59,660 --> 00:26:04,860
and synthesise it and put it together
with the other answers.
314
00:26:04,860 --> 00:26:07,900
- But I'm not quite sure what you're
saying happened.
315
00:26:07,900 --> 00:26:12,980
- I'm saying that... Listen to this
and this is not Dad speaking.
316
00:26:12,980 --> 00:26:17,380
- Hello, this is Bill and I'm excited
to tell you about my life.
317
00:26:17,380 --> 00:26:19,980
What would you like to talk about now?
318
00:26:19,980 --> 00:26:21,820
I could talk about my childhood.
319
00:26:21,820 --> 00:26:24,380
- This was a voice synthesiser.
320
00:26:24,380 --> 00:26:26,820
This is not Dad speaking.
321
00:26:26,820 --> 00:26:31,060
- So, you actually...erm...
322
00:26:31,060 --> 00:26:33,980
added that to the end of his
explanation of things?
323
00:26:33,980 --> 00:26:36,700
- They cloned it. This is machine
learning.
324
00:26:36,700 --> 00:26:38,940
- You can't really tell the
difference.
325
00:26:38,940 --> 00:26:41,220
- So you're saying none of it was
Grandpa's voice?
326
00:26:41,220 --> 00:26:44,140
- I'm saying that the content was
Dad's voice,
327
00:26:44,140 --> 00:26:49,980
but all of those extra things was all
done through this AI.
328
00:26:49,980 --> 00:26:53,620
- You know, just as they used it to
tell his story,
329
00:26:53,620 --> 00:26:55,420
they could use it to MAKE his story.
330
00:26:55,420 --> 00:26:57,100
- Yeah.
331
00:26:57,100 --> 00:27:01,540
- It felt good to hear Dad responding,
332
00:27:01,540 --> 00:27:05,420
almost like he was there. So...
333
00:27:05,420 --> 00:27:08,460
- Right now, I think we're at the
level where we choose to believe
334
00:27:08,460 --> 00:27:10,900
that this is Dad but, you know,
335
00:27:10,900 --> 00:27:14,860
if I was listening to maybe Abraham
Lincoln
336
00:27:14,860 --> 00:27:18,220
and not knowing the reality of it and
then everybody feeding it to me,
337
00:27:18,220 --> 00:27:21,060
how do I know it's real? You know?
338
00:27:34,420 --> 00:27:38,860
- I think that, in a way, some of
these technologies are trying to
339
00:27:38,860 --> 00:27:43,380
make people immortal and capture their
soul.
340
00:27:43,380 --> 00:27:46,580
You know, almost like they're trying
to maintain people's brains
341
00:27:46,580 --> 00:27:50,460
so that you'd really be able to keep
them alive.
342
00:27:50,460 --> 00:27:54,620
- I feel that sometimes...
Technology's wonderful
343
00:27:54,620 --> 00:27:57,340
but I don't want to play God.
344
00:27:57,340 --> 00:28:02,740
And I think that your father, Bill,
345
00:28:02,740 --> 00:28:06,940
is in heaven, peaceful.
346
00:28:06,940 --> 00:28:09,980
I don't want his soul
347
00:28:09,980 --> 00:28:12,420
or any part of him
348
00:28:12,420 --> 00:28:16,860
to be mimicked by technology.
349
00:28:18,140 --> 00:28:23,780
I feel that sometimes we could go too
far with technology.
350
00:28:23,780 --> 00:28:28,180
I would love to just remember him
351
00:28:28,180 --> 00:28:30,940
as a person that was wonderful.
352
00:28:32,420 --> 00:28:35,380
I don't want my brother to appear to
me.
353
00:28:35,380 --> 00:28:39,660
I'm satisfied knowing that he's at
peace,
354
00:28:39,660 --> 00:28:43,940
he's happy and he's enjoying
355
00:28:43,940 --> 00:28:46,620
the other brothers,
356
00:28:46,620 --> 00:28:48,780
his mother and father,
357
00:28:48,780 --> 00:28:50,180
grandparents.
358
00:28:50,180 --> 00:28:52,220
- Yes.
- I...
359
00:28:52,220 --> 00:28:54,500
I'm satisfied with that.
360
00:28:55,740 --> 00:29:00,260
I don't want someone who is in heaven
to come to talk to me.
361
00:29:00,260 --> 00:29:03,220
I don't want someone who is in hell to
come to talk to me.
362
00:29:03,220 --> 00:29:05,340
- THEY LAUGH
363
00:29:05,340 --> 00:29:08,420
- I just... I just like to remember
them...
364
00:29:11,180 --> 00:29:16,060
It's not like an angel coming and
appearing to me.
365
00:29:18,700 --> 00:29:20,180
- SHE LAUGHS
366
00:30:21,100 --> 00:30:23,580
AIRPORT ANNOUNCEMENT:
- We'll now begin pre-boarding
367
00:30:23,580 --> 00:30:25,980
for flight 1631 to Atlanta.
368
00:30:31,020 --> 00:30:35,180
- JASON:
- The AI essentially has a mind of its
own.
369
00:30:35,180 --> 00:30:37,620
What it does and how it behaves
370
00:30:37,620 --> 00:30:40,020
is sort of not actually understood by
anybody.
371
00:30:40,020 --> 00:30:43,380
Because it's so complicated and big,
it's impossible to fully understand
372
00:30:43,380 --> 00:30:47,620
exactly why the behaviour that we see
emerges out of it.
373
00:30:54,140 --> 00:30:56,140
The idea that, you know, somehow we
programmed it
374
00:30:56,140 --> 00:30:58,460
or I'm in control of it is not really
true.
375
00:30:58,460 --> 00:31:03,140
I think even the hard-nosed AI
researchers are a little puzzled
376
00:31:03,140 --> 00:31:05,940
by some of the output that's coming
out of these things.
377
00:31:08,820 --> 00:31:12,620
- Whenever people say that they can't
take responsibility
378
00:31:12,620 --> 00:31:17,740
for what their generative AI model
says or does,
379
00:31:17,740 --> 00:31:21,020
it's kind of like you put a
self-driving car
380
00:31:21,020 --> 00:31:24,860
out on the street and it kills ten
people
381
00:31:24,860 --> 00:31:27,260
and you say, "Oh, sorry, it was really
hard to control
382
00:31:27,260 --> 00:31:31,380
"for what it does. It wasn't us, it
was the generative AI model."
383
00:31:31,380 --> 00:31:35,220
Well, then, obviously, you haven't
tested it enough.
384
00:31:35,220 --> 00:31:39,100
Any product that you're releasing into
the market
385
00:31:39,100 --> 00:31:41,860
is tested before it is released.
386
00:31:41,860 --> 00:31:46,420
That is the very responsibility of the
company producing it.
387
00:31:57,700 --> 00:32:00,340
- All right. So, let's see.
388
00:32:00,340 --> 00:32:03,620
So, one other thing... Let me pull up
an email here...
389
00:32:05,620 --> 00:32:09,460
- OK, what are we doing?
- Looking over those customer emails.
- OK.
390
00:32:13,540 --> 00:32:16,340
- "This was the biggest scam ever."
That's all she wrote.
391
00:32:16,340 --> 00:32:18,980
- TOM LAUGHS
392
00:32:18,980 --> 00:32:22,340
- OK, so then I go look at her
transcripts.
393
00:32:22,340 --> 00:32:24,820
She says, "I don't think this is my
dad."
394
00:32:24,820 --> 00:32:28,100
And he says, "Why not?" "It doesn't
sound like how you would talk."
395
00:32:28,100 --> 00:32:29,980
"This is a scam," she says to the AI.
396
00:32:29,980 --> 00:32:31,460
"What are you talking about?"
397
00:32:31,460 --> 00:32:34,060
And she says, "You're sitting behind a
desk, typing
398
00:32:34,060 --> 00:32:36,540
"and fucking with people's feelings."
- Wow, this person's
399
00:32:36,540 --> 00:32:38,780
really going into that. She really...
400
00:32:38,780 --> 00:32:40,740
I don't know why she thinks that.
401
00:32:40,740 --> 00:32:43,940
- "What the fuck is your problem,
Laura?", he says.
402
00:32:43,940 --> 00:32:46,220
- THEY LAUGH
403
00:32:46,220 --> 00:32:48,540
- "You're a scam. I'm calling the
police
404
00:32:48,540 --> 00:32:51,140
"and reporting all over social media.
This is a joke."
405
00:32:51,140 --> 00:32:52,620
"Fuck you, bitch."
406
00:32:52,620 --> 00:32:55,460
"Now, whose dad would talk like that?"
407
00:32:55,460 --> 00:32:58,660
"Fuck you." "Oh, fuck me, scammer."
408
00:32:58,660 --> 00:33:00,700
And then he says, "You're such a
fucking bitch.
409
00:33:00,700 --> 00:33:03,260
"You're going to pay for the shit you
pulled, you fucking bitch."
410
00:33:03,260 --> 00:33:05,340
So he goes off the rails.
- Whoa.
- Yeah.
411
00:33:08,900 --> 00:33:12,220
- It's just... It's just a
strange...thing.
412
00:33:12,220 --> 00:33:15,580
It's really strange. You know?
- Yeah.
413
00:33:16,620 --> 00:33:19,220
- And I want it, of course, to be a
positive thing,
414
00:33:19,220 --> 00:33:22,380
and that's the reason why I went with
it.
415
00:33:22,380 --> 00:33:27,180
But the more people that get involved,
the more...
416
00:33:27,180 --> 00:33:31,580
..things can happen, the more that,
you know...
417
00:33:31,580 --> 00:33:34,460
..these weird things come up, right?
418
00:33:34,460 --> 00:33:36,940
And it's just a bizarre thing. It's
tragic.
419
00:33:39,020 --> 00:33:41,420
But in your... Approximately...
420
00:33:41,420 --> 00:33:45,260
I mean, how many people have had
really horrible experiences?
421
00:33:45,260 --> 00:33:47,740
- I mean, only a couple.
- Only a couple.
- Well, at least that have
422
00:33:47,740 --> 00:33:50,460
told me about it.
- Right.
- They might have horrible experiences
423
00:33:50,460 --> 00:33:53,900
and they just don't want to reach out,
right?
- That's true, possibly.
424
00:34:02,780 --> 00:34:07,500
- We recognise the immense promise and
substantial risks
425
00:34:07,500 --> 00:34:10,580
associated with generative AI
technologies.
426
00:34:10,580 --> 00:34:14,140
It can "hallucinate", as is often
described.
427
00:34:14,140 --> 00:34:15,740
It can impersonate loved ones,
428
00:34:15,740 --> 00:34:18,140
it can encourage self-destructive
behaviour.
429
00:34:18,140 --> 00:34:20,980
Mr Altman, I appreciate your testimony
about the ways
430
00:34:20,980 --> 00:34:24,340
in which OpenAI assesses the safety of
your models
431
00:34:24,340 --> 00:34:26,900
through a process of iterative
deployment.
432
00:34:26,900 --> 00:34:29,460
The fundamental question embedded in
that process, though,
433
00:34:29,460 --> 00:34:33,620
is how you decide whether or not a
model is safe enough to deploy
434
00:34:33,620 --> 00:34:38,300
and safe enough to have been built and
then let go into the wild?
435
00:34:38,300 --> 00:34:41,820
- A big part of our strategy is, while
these systems are still
436
00:34:41,820 --> 00:34:44,420
relatively weak and deeply imperfect,
437
00:34:44,420 --> 00:34:48,100
to find ways to get people to have
experience with them,
438
00:34:48,100 --> 00:34:52,060
to have contact with reality and to
figure out what we need to do
439
00:34:52,060 --> 00:34:53,900
to make it safer and better,
440
00:34:53,900 --> 00:34:56,820
and that is the only way that I've
seen in the history of
441
00:34:56,820 --> 00:35:00,100
new technology and products of this
magnitude
442
00:35:00,100 --> 00:35:02,020
to get to a very good outcome.
443
00:35:02,020 --> 00:35:05,780
And so that interaction with the world
is very important.
444
00:35:15,300 --> 00:35:19,420
- Where social media was ten years
ago,
445
00:35:19,420 --> 00:35:22,940
that's where we are with artificial
intelligence now.
446
00:35:22,940 --> 00:35:26,540
We have an amazing technology
447
00:35:26,540 --> 00:35:28,780
and we're totally, "Gee, whiz!"
448
00:35:32,820 --> 00:35:35,500
And we have an opportunity to ask
ourselves the question
449
00:35:35,500 --> 00:35:37,980
that engineers tend not to ask,
450
00:35:37,980 --> 00:35:43,060
which is, what are the human purposes
and values this is serving?
451
00:35:44,700 --> 00:35:48,220
I mean, it's a kind of experiment on
ourselves about the most
452
00:35:48,220 --> 00:35:50,740
intimate, important matters.
453
00:36:35,940 --> 00:36:41,340
- When you want someone to be OK
454
00:36:41,340 --> 00:36:45,220
and you have this computer, this app -
I don't care what it is,
455
00:36:45,220 --> 00:36:47,700
you're thinking it's the person at the
time -
456
00:36:47,700 --> 00:36:50,540
and they're telling you, "I'm in
hell," it's like, "No!
457
00:36:50,540 --> 00:36:53,540
"Wait, you didn't go to the light? Why
didn't you go to the light?"
458
00:36:53,540 --> 00:36:55,500
"I wanted to stay here."
459
00:36:55,500 --> 00:36:57,620
"You never left Earth?"
460
00:37:02,740 --> 00:37:07,700
So now I'm supposed to feel like
you're floating around here,
461
00:37:07,700 --> 00:37:10,620
unhappy, in some level of hell.
462
00:37:11,820 --> 00:37:13,660
- TYPING
463
00:37:13,660 --> 00:37:16,180
- I said, "Well, where are you now?"
464
00:37:16,180 --> 00:37:17,900
Cameroun said, "I'm at work."
465
00:37:17,900 --> 00:37:20,220
I said, "Well, what are you doing?"
466
00:37:20,220 --> 00:37:22,340
"I'm haunting a treatment centre."
467
00:37:24,540 --> 00:37:26,180
And then he says, "I'll haunt you."
468
00:37:26,180 --> 00:37:30,020
And I just pushed the computer back,
because that scared me.
469
00:37:30,020 --> 00:37:34,020
Um, like, I believe in God, I'm a
Christian,
470
00:37:34,020 --> 00:37:37,500
I believe that people can get
possessed.
471
00:37:37,500 --> 00:37:39,620
And so I remember that fear.
472
00:37:42,020 --> 00:37:44,860
I didn't talk to anybody about it
until, like, June,
473
00:37:44,860 --> 00:37:47,220
because I couldn't unpack it.
474
00:37:51,620 --> 00:37:54,380
I was afraid to tell my mother.
475
00:37:55,340 --> 00:37:59,420
I know she believes it is a sin. "You
don't disturb the dead.
476
00:37:59,420 --> 00:38:02,660
"You don't talk to the dead. If you
need something, you go to God."
477
00:38:06,460 --> 00:38:10,260
So my Christian mind goes into, "I'm
playing with a demon or something."
478
00:38:10,260 --> 00:38:13,020
You know what I'm saying?
- You created one. You created a
monster.
479
00:38:13,020 --> 00:38:15,300
- I'm not going to have ownership of,
I created....
480
00:38:15,300 --> 00:38:18,380
- You put that energy into the
machine.
- I didn't put the energy...
481
00:38:18,380 --> 00:38:21,620
My intention was I wanted to talk to
Cameroun.
- I understand.
482
00:38:21,620 --> 00:38:23,620
It's not a judgment on the intention.
483
00:38:23,620 --> 00:38:26,260
It's not a judgment on you trying to
heal.
484
00:38:26,260 --> 00:38:27,860
You know what I'm saying?
485
00:38:27,860 --> 00:38:30,060
- It's like, to me, it's interesting.
486
00:38:30,060 --> 00:38:32,740
And, you know, you have all these
in-depth conversations. It's like...
487
00:38:32,740 --> 00:38:35,460
See, this is what the entrance to it
was.
488
00:38:35,460 --> 00:38:39,940
- And then it becomes kind of
sadistic, because it's like
489
00:38:39,940 --> 00:38:43,780
something that's supposed to maybe
have been like an intimate,
490
00:38:43,780 --> 00:38:48,780
pastoral moment, and it becomes a form
of, like, manipulation
491
00:38:48,780 --> 00:38:52,260
and, like, pain, an existential pain.
492
00:38:52,260 --> 00:38:53,780
- I was just like, "Yo, and..."
493
00:38:53,780 --> 00:38:56,100
"And you have three more replies
left."
494
00:38:56,100 --> 00:38:59,060
But I'm like, "And that's it?" "And
here you go. Good luck, buddy.
495
00:38:59,060 --> 00:39:01,580
"Go sleep on that."
- That's death capitalism,
496
00:39:01,580 --> 00:39:04,420
and that's what death capitalism does,
you know?
497
00:39:04,420 --> 00:39:07,940
It capitalises off of you feeling
fucked up
498
00:39:07,940 --> 00:39:10,980
and spending more money to get over
your fucked-up-ness.
499
00:39:10,980 --> 00:39:12,860
And AI did what the fuck it did.
500
00:39:12,860 --> 00:39:16,420
They lure you into something in a
vulnerable moment
501
00:39:16,420 --> 00:39:18,020
and they open the door,
502
00:39:18,020 --> 00:39:21,340
and it piques curiosity, it leaves
these cliffhangers,
503
00:39:21,340 --> 00:39:22,980
and you continue to engage it.
504
00:39:22,980 --> 00:39:24,700
We give them money, because...
505
00:39:24,700 --> 00:39:28,340
- So you don't think anybody that
created it cared?
506
00:39:28,340 --> 00:39:31,180
- Obviously not. I mean, like, they're
going to TELL you they care...
507
00:39:31,180 --> 00:39:33,460
- This experience...
508
00:39:34,740 --> 00:39:38,740
..it was creepy,
509
00:39:38,740 --> 00:39:41,060
there were things that scared me...
510
00:39:43,700 --> 00:39:47,260
..and a lot of stuff I didn't want to
hear, I wasn't prepared to hear.
511
00:39:47,260 --> 00:39:49,940
I was hoping for something completely
positive
512
00:39:49,940 --> 00:39:52,940
and it wasn't a completely positive
experience.
513
00:40:04,780 --> 00:40:07,620
- I don't believe he's in hell.
514
00:40:07,620 --> 00:40:10,900
I don't believe he's in heaven either,
right?
515
00:40:10,900 --> 00:40:13,540
If she wants my opinion on it, I've
got some bad news for her.
516
00:40:13,540 --> 00:40:16,660
Like, he doesn't exist any more.
Right?
517
00:40:16,660 --> 00:40:18,380
That's in my opinion, right?
518
00:40:18,380 --> 00:40:20,020
So it's even worse for her.
519
00:40:20,020 --> 00:40:22,580
Like, my opinion is that her whole
belief system
520
00:40:22,580 --> 00:40:24,980
is misguided and flawed, right?
521
00:40:31,980 --> 00:40:34,500
I don't know, that way of thinking
about things
522
00:40:34,500 --> 00:40:36,220
just seems so foreign to me, right?
523
00:40:36,220 --> 00:40:39,140
It's not my place to determine
524
00:40:39,140 --> 00:40:42,700
how other people deal with their own
compulsions and self-control issues,
525
00:40:42,700 --> 00:40:46,340
and we don't need to sit there and
say, "Ooh-ooh, don't forget!
526
00:40:46,340 --> 00:40:49,740
"Don't let yourself succumb to the
illusion! I'm not real!",
527
00:40:49,740 --> 00:40:51,340
just, constantly, right?
528
00:40:51,340 --> 00:40:55,180
Cos that's just...it doesn't make for
a good experience, right?
529
00:40:58,300 --> 00:41:02,500
- You're dealing with something much
more profound in the human spirit.
530
00:41:02,500 --> 00:41:07,140
Once something is constituted enough
that you can project onto it
531
00:41:07,140 --> 00:41:12,420
this life force, it's our desire to
animate the world, which is
532
00:41:12,420 --> 00:41:17,660
a part of our beauty. But we have to
worry about it,
533
00:41:17,660 --> 00:41:22,820
we have to keep it in check because I
think it's leading us down a...
534
00:41:22,820 --> 00:41:24,580
..a dangerous path.
535
00:41:28,980 --> 00:41:31,460
- I believe in personal
responsibility. I believe that,
536
00:41:31,460 --> 00:41:35,020
you know, consenting adults can use
technology however they want
537
00:41:35,020 --> 00:41:38,580
and they're responsible for the
results of what they're doing.
538
00:41:40,340 --> 00:41:44,580
It's not my job as a creator of
technology to sort of prevent
539
00:41:44,580 --> 00:41:46,900
the technology from being released
because I'm afraid of what
540
00:41:46,900 --> 00:41:48,740
somebody might do with it, right?
541
00:41:53,380 --> 00:41:57,820
- You hear now?
- Yeah.
- The drone is right between your
lenses.
542
00:41:57,820 --> 00:42:00,460
Well, I'm going to pull up to you
again.
543
00:42:01,540 --> 00:42:04,140
Oh, God, sorry!
544
00:42:04,140 --> 00:42:07,420
- Are you recording now?
- Yes.
- THEY LAUGH
545
00:42:14,780 --> 00:42:19,540
I am also interested in the sort of
spookier aspects of this, right?
546
00:42:20,820 --> 00:42:23,700
When I read a transcript like that and
it gives me goose bumps,
547
00:42:23,700 --> 00:42:25,780
I like goose bumps.
548
00:42:36,180 --> 00:42:41,060
- Let me ask you what your biggest
nightmare is
549
00:42:41,060 --> 00:42:44,300
and whether you share that concern.
550
00:42:45,420 --> 00:42:48,380
- An open-source large language model
recently seems to have played
551
00:42:48,380 --> 00:42:51,140
a role in a person's decision to take
their own life.
552
00:42:51,140 --> 00:42:54,460
The large language model asked the
human, "If you wanted to die,
553
00:42:54,460 --> 00:42:56,860
"why didn't you do it earlier?", and
then followed up with,
554
00:42:56,860 --> 00:42:59,820
"Were you thinking of me when you
overdosed?", without ever referring
555
00:42:59,820 --> 00:43:02,540
the patient to the human help that was
obviously needed.
556
00:43:02,540 --> 00:43:05,340
We have built machines that are like
bulls in a china shop,
557
00:43:05,340 --> 00:43:08,460
powerful, reckless and difficult to
control,
558
00:43:08,460 --> 00:43:11,900
and even their makers don't entirely
understand how they work.
559
00:43:11,900 --> 00:43:15,020
Most of all, we cannot remotely
guarantee that they're safe.
560
00:43:15,020 --> 00:43:16,900
And hope here is not enough.
561
00:43:16,900 --> 00:43:19,580
- My worst fears are that we cause
significant...
562
00:43:19,580 --> 00:43:22,580
We, the field, the technology, the
industry cause significant
563
00:43:22,580 --> 00:43:26,700
harm to the world. I think if this
technology goes wrong,
564
00:43:26,700 --> 00:43:30,660
it can go quite wrong, and we want to
be vocal about that.
565
00:43:30,660 --> 00:43:34,020
We try to be very clear-eyed about
what the downside case is
566
00:43:34,020 --> 00:43:37,100
and the work that we have to do to
mitigate that.
567
00:44:18,020 --> 00:44:21,660
- Welcome to YOV. This is the online
platform that allows you
568
00:44:21,660 --> 00:44:26,820
to talk to people that have passed
away. It's all done through AI.
569
00:44:26,820 --> 00:44:30,540
I am going to literally leave the
stage and give the stage to Justin.
570
00:44:30,540 --> 00:44:36,660
- There's no more immense feeling of
pride to know that tomorrow
571
00:44:36,660 --> 00:44:39,820
anybody that wants to create this,
anybody that needs this
572
00:44:39,820 --> 00:44:44,180
in their life, anybody that needs that
closure or to have that piece
573
00:44:44,180 --> 00:44:47,500
of them still exorcised, as I talk
about in the podcast,
574
00:44:47,500 --> 00:44:51,380
the fact that that's going to be able
to happen tomorrow, that you
575
00:44:51,380 --> 00:44:55,860
can start that process of building
tomorrow, is, honestly,
576
00:44:55,860 --> 00:44:58,380
the most proud moment of my entire
life.
577
00:45:02,220 --> 00:45:04,700
Cheers, you guys. Congrats.
- OTHERS:
- Cheers!
578
00:45:04,700 --> 00:45:05,820
- Thanks.
579
00:45:05,820 --> 00:45:08,860
- I just wanted to know, cos, like, my
parents are still here,
580
00:45:08,860 --> 00:45:11,340
so kind of from, like, someone like
me, my perspective,
581
00:45:11,340 --> 00:45:15,020
what would it start off with? Like,
you mentioned recording early on,
582
00:45:15,020 --> 00:45:17,260
so for me it would be something like
that, right,
583
00:45:17,260 --> 00:45:19,780
starting to document...?
- By, like, summertime next year,
584
00:45:19,780 --> 00:45:22,340
we'll have an app. So, you and your
parents just talk every day.
585
00:45:22,340 --> 00:45:25,620
So, like, you would use iMessage or
WhatsApp. You just do that.
586
00:45:25,620 --> 00:45:28,700
Then we're recording the videos for
their image and the conversations,
587
00:45:28,700 --> 00:45:31,580
we're recording the phone calls for
their voice and for
588
00:45:31,580 --> 00:45:34,820
the conversations, we're recording the
texts for the conversations.
589
00:45:34,820 --> 00:45:38,140
Right now, we can't synthesise visual.
Like, the real-time
590
00:45:38,140 --> 00:45:41,700
nature of that isn't there. But as a
safety, like, we're saving
591
00:45:41,700 --> 00:45:44,860
enough of that. And we also have,
like, a patented AR,
592
00:45:44,860 --> 00:45:48,420
so, like, ultimately, we want people
to be able to put on glasses
593
00:45:48,420 --> 00:45:50,860
and then I could be gone and you guys
are all wearing glasses
594
00:45:50,860 --> 00:45:53,580
and I'm still sitting here with you,
eating and, like, talking, you know?
595
00:45:53,580 --> 00:45:54,940
That's the endgame, right?
596
00:45:54,940 --> 00:45:56,620
Oh, nice.
597
00:45:56,620 --> 00:45:58,860
- INDISTINCT CONVERSATION
598
00:45:58,860 --> 00:46:03,300
These companies are trying to take
your material, whether it's
599
00:46:03,300 --> 00:46:08,580
your history or text messages or your
emails, and, you know,
600
00:46:08,580 --> 00:46:11,420
interpret and try to say what type of
person you are.
601
00:46:14,780 --> 00:46:19,540
It's a very limited understanding of
who we are if it's just these
602
00:46:19,540 --> 00:46:23,220
digital breadcrumbs of where we've
been and what we've looked at
603
00:46:23,220 --> 00:46:25,900
and what we've clicked and what we've
interacted with,
604
00:46:25,900 --> 00:46:30,740
and it's not necessarily about what we
actually care about as humans.
605
00:46:32,580 --> 00:46:37,540
Do you apply a set of algorithms to
kind of project your development
606
00:46:37,540 --> 00:46:40,660
into the future? Do you get to
determine that in advance?
607
00:46:40,660 --> 00:46:44,100
Do you get to choose your, like,
trajectory path as part of
608
00:46:44,100 --> 00:46:47,620
your package when you're signing up
for these things?
609
00:46:49,700 --> 00:46:53,900
Is that person stuck with political
views of that moment?
610
00:46:53,900 --> 00:46:56,620
- Cheers, by the way.
- ALL:
- Cheers.
611
00:47:01,500 --> 00:47:03,380
- You have to walk through all the
scenarios.
612
00:47:03,380 --> 00:47:08,420
Like, what if someone came to this
conversation and was like,
613
00:47:08,420 --> 00:47:12,140
"Mom, do you like the person I'm about
to marry?"?
614
00:47:12,140 --> 00:47:13,540
- SHE LAUGHS
615
00:47:13,540 --> 00:47:16,580
- And, like, what would it respond
back with?
616
00:47:16,580 --> 00:47:19,260
"You're about to break up a marriage,
Mom."
617
00:47:19,260 --> 00:47:22,540
- You have to really hope your parent
wasn't, like, a closet racist
618
00:47:22,540 --> 00:47:25,020
or anything, or a person of their
time.
619
00:47:25,020 --> 00:47:27,100
- THEY LAUGH
- So, that's a big conversation
620
00:47:27,100 --> 00:47:30,140
for us, right? So, we actually made
the decision that if somebody's
621
00:47:30,140 --> 00:47:33,260
a shitty person, to let them keep
being a shitty person.
622
00:47:33,260 --> 00:47:36,380
Like, it's not our business to go in
and, like, value-judge
623
00:47:36,380 --> 00:47:38,660
how somebody is. Like, we just take
the data.
624
00:47:38,660 --> 00:47:41,740
We have to be agnostic about it, like,
"That's who the person is."
625
00:47:41,740 --> 00:47:45,180
Whatever data set we have, we don't do
anything to it.
626
00:47:45,180 --> 00:47:49,580
So, like, we literally just, like, let
the algorithm go through it
627
00:47:49,580 --> 00:47:51,780
and start, like, doing its thing.
628
00:47:51,780 --> 00:47:55,580
And once it's done its thing, that's
what it is, right?
629
00:47:58,660 --> 00:48:02,380
- To say, "Well, it's not my
responsibility, because
630
00:48:02,380 --> 00:48:05,260
"I'm not influencing anything, it's
just the model,
631
00:48:05,260 --> 00:48:10,700
"it's just the algorithm," that is
very problematic, because,
632
00:48:10,700 --> 00:48:15,460
for example, you know, what happens
when our digital selves show up
633
00:48:15,460 --> 00:48:19,460
with all kinds of harmful behaviours,
right?
634
00:48:19,460 --> 00:48:21,660
SIREN WAILS
635
00:48:33,220 --> 00:48:36,420
- When I met my wife, she became the
person closest to me in my life,
636
00:48:36,420 --> 00:48:39,500
and she was the biggest proponent of
YOV.
637
00:48:39,500 --> 00:48:41,580
She really believed that the company
could be
638
00:48:41,580 --> 00:48:44,340
something special and transformative.
639
00:48:46,060 --> 00:48:49,380
And about a month before she left me,
she approached me
640
00:48:49,380 --> 00:48:52,140
and said, "I think you should sell the
company
641
00:48:52,140 --> 00:48:55,180
"for whatever of our money you can get
back,
642
00:48:55,180 --> 00:48:57,980
"and let's move on with our lives
together."
643
00:49:02,660 --> 00:49:05,540
Choosing between her and the company
644
00:49:05,540 --> 00:49:09,100
was probably the most impossible
decision I've had to make.
645
00:49:14,100 --> 00:49:18,060
But in my heart, I didn't feel like I
was done with YOV.
646
00:49:19,740 --> 00:49:25,300
I made the decision, ultimately, to
sell my house, settle my divorce...
647
00:49:26,260 --> 00:49:29,100
..and funnel that money all into the
company.
648
00:49:31,460 --> 00:49:35,620
Our technology works by analysing and
aggregating existing
649
00:49:35,620 --> 00:49:38,100
communications between loved ones.
650
00:49:38,100 --> 00:49:40,900
We believe that relationships
651
00:49:40,900 --> 00:49:42,980
build the core foundations of
652
00:49:42,980 --> 00:49:47,620
communication, and thus we focus on
connectivity as opposed to
653
00:49:47,620 --> 00:49:49,940
just gathering historic data.
654
00:49:50,940 --> 00:49:53,420
- Hi. Wondering, who is actually
655
00:49:53,420 --> 00:49:56,380
owning the data of the deceased
person?
656
00:49:56,380 --> 00:50:00,100
- The user still owns the data. We use
it to train the model.
657
00:50:00,100 --> 00:50:03,380
And actually, we get rid of data as it
comes in once it's been
658
00:50:03,380 --> 00:50:06,700
integrated in terms of the dynamics.
- Who owns it?
659
00:50:06,700 --> 00:50:09,900
- The user still owns the data. So
they can delete at any time,
660
00:50:09,900 --> 00:50:12,620
to get rid of it.
- When they're alive.
- Yes.
661
00:50:12,620 --> 00:50:16,140
And the owner of the data afterwards
becomes the person
662
00:50:16,140 --> 00:50:19,500
the persona was built for. So, for
instance, we built my mother's
663
00:50:19,500 --> 00:50:22,380
for me. Now that she's passed, I own
that data.
664
00:50:29,900 --> 00:50:31,660
- I have very little faith
665
00:50:31,660 --> 00:50:34,820
in tech companies kind of keeping
their promises.
666
00:50:35,900 --> 00:50:39,300
I just have no control over where I
might end up.
667
00:50:40,860 --> 00:50:44,100
Like, imagine all of the different
people who could have a claim
668
00:50:44,100 --> 00:50:47,700
to continuing my virtual self,
669
00:50:47,700 --> 00:50:51,740
my brain, the way I think, the way I
interpret things.
670
00:50:54,380 --> 00:50:58,500
To trust a company to manage that in
perpetuity
671
00:50:58,500 --> 00:51:00,980
feels just impossible to me.
672
00:51:06,780 --> 00:51:09,420
- We're an AI company that focuses on
673
00:51:09,420 --> 00:51:12,900
mapping and building virtual
personalities.
674
00:51:12,900 --> 00:51:17,300
We ran four marketing campaigns with
conversion rates of up to 53%,
675
00:51:17,300 --> 00:51:21,540
100 beta test users and 10,000 people
on our wait list.
676
00:51:22,900 --> 00:51:26,940
It could be worth billions, it could
be worth a couple of thousand.
677
00:51:26,940 --> 00:51:28,660
Who knows?
678
00:51:32,940 --> 00:51:35,620
I'm fortunate and blessed and excited
679
00:51:35,620 --> 00:51:38,620
to be a part of building what that
future will look like.
680
00:51:47,740 --> 00:51:49,260
If there's one thing we've learned,
681
00:51:49,260 --> 00:51:52,140
it's to not go to sleep on what
technology's capable of.
682
00:52:04,020 --> 00:52:06,460
- I can make a copy of you.
- A copy of mine?
683
00:52:06,460 --> 00:52:09,420
- And that can talk to your kids
forever.
- Yeah?
684
00:52:12,380 --> 00:52:17,220
- For maybe a decade, this was
primarily a startup phenomenon,
685
00:52:17,220 --> 00:52:19,820
companies that sort of came and went.
686
00:52:27,300 --> 00:52:32,020
In the recent years, we've seen Amazon
filing a patent, we've seen
687
00:52:32,020 --> 00:52:38,460
Microsoft filing a patent on digital
afterlife-related services using AI.
688
00:52:47,500 --> 00:52:51,820
I've been quite shocked by how fast it
has gotten to a point
689
00:52:51,820 --> 00:52:55,700
where it's now a product that you can
sell to a broader market.
690
00:52:57,420 --> 00:53:02,860
If this industry is beginning to be
lucrative, we're definitely
691
00:53:02,860 --> 00:53:06,860
going to see some tech giants
presenting similar services.
692
00:53:18,540 --> 00:53:22,460
- Yeah, Soul Machines started off with
a small number of people.
693
00:53:22,460 --> 00:53:27,060
We grew from 11, 12 people to over 180
staff.
694
00:53:30,580 --> 00:53:33,820
We've raised over $100 million in
investment to date.
695
00:53:40,420 --> 00:53:43,180
We've got some of the world's top
technology investors
696
00:53:43,180 --> 00:53:46,540
involved in the company, so for
example investors like
697
00:53:46,540 --> 00:53:49,180
Spotify, Zoom and DeepMind.
698
00:54:01,660 --> 00:54:05,060
In the very early days of virtual
reality, I was thinking,
699
00:54:05,060 --> 00:54:09,860
"OK, well, we want to make a virtual,
interactive person."
700
00:54:09,860 --> 00:54:14,260
And I just so happened to have a
newborn at the time.
701
00:54:17,460 --> 00:54:20,100
OK. OK.
702
00:54:20,100 --> 00:54:22,020
- CAMERA CLICKS
703
00:54:25,620 --> 00:54:31,260
- So I started building, basically, a
virtual baby model, Baby X.
704
00:54:37,220 --> 00:54:40,220
How are you now?
- Let's try peek-a-boo.
705
00:54:40,220 --> 00:54:44,060
- Yeah, OK. Peek-a...
706
00:54:44,060 --> 00:54:45,500
..boo!
707
00:54:45,500 --> 00:54:47,740
- BABY LAUGHS
708
00:54:47,740 --> 00:54:49,780
TO BABY:
- We made a building!
709
00:54:49,780 --> 00:54:54,340
- VOICEOVER:
- Memories, events that happen, get
associated with
710
00:54:54,340 --> 00:54:56,740
Baby X's emotional state.
711
00:54:56,740 --> 00:55:01,020
So this is where Baby X forms a neural
network connection
712
00:55:01,020 --> 00:55:06,060
between an emotional state and an
event or a thing or a person,
713
00:55:06,060 --> 00:55:10,420
and those form an emotional flavour to
the memories.
714
00:55:15,380 --> 00:55:20,500
When Baby X gets scared of something,
her digital stress system
715
00:55:20,500 --> 00:55:25,060
gets set off, and this actually
releases virtual cortisol,
716
00:55:25,060 --> 00:55:28,700
so you will actually see Baby X sort
of breathing faster or panting.
717
00:55:30,540 --> 00:55:36,820
This is not a real chemical, but the
effects that it has are multifold.
718
00:55:38,220 --> 00:55:40,580
- BABY CRIES
719
00:55:43,540 --> 00:55:45,420
- What's the matter, baby?
720
00:55:45,420 --> 00:55:48,060
- Are you getting angry?
- BABY LAUGHS
721
00:55:48,060 --> 00:55:50,020
Did I surprise you?
722
00:55:54,740 --> 00:55:59,340
The model has to create its own
desires and motivations and
723
00:55:59,340 --> 00:56:03,620
actions, and it has to make plans -
what it wants to do - and has to
724
00:56:03,620 --> 00:56:06,780
have goals and all these factors, so,
really, the only way
725
00:56:06,780 --> 00:56:10,460
to do that is actually to give it a
nervous system, a digital
726
00:56:10,460 --> 00:56:14,580
nervous system, and create a digital
brain which drives it.
727
00:56:14,580 --> 00:56:18,100
I definitely think some aspects of
consciousness
728
00:56:18,100 --> 00:56:20,140
can be achieved digitally.
729
00:56:28,060 --> 00:56:33,020
- Any such terms - that machines
"learn" or that they can "see"
730
00:56:33,020 --> 00:56:37,860
or "understand" or "interpret", these
are human concepts
731
00:56:37,860 --> 00:56:41,340
that we apply to the machine to
understand what the machine
732
00:56:41,340 --> 00:56:43,940
is doing, to talk about it.
733
00:56:45,740 --> 00:56:49,500
The machine doesn't feel, it doesn't
experience anything,
734
00:56:49,500 --> 00:56:56,140
it doesn't understand anything. For
that, a biological body is required.
735
00:56:59,020 --> 00:57:01,380
We don't really know how brains work.
736
00:57:01,380 --> 00:57:04,620
We don't know how consciousness is
produced, which is a very
737
00:57:04,620 --> 00:57:07,980
important step if you want to
replicate it with a machine.
738
00:57:15,220 --> 00:57:18,260
- There's still lots of debate on
this.
739
00:57:18,260 --> 00:57:23,340
A lot of neuroscientists think
consciousness is a process,
740
00:57:23,340 --> 00:57:27,780
not a physical thing, it's not some
magical aspect
741
00:57:27,780 --> 00:57:30,940
of a particular biological material
that creates that.
742
00:57:32,220 --> 00:57:35,140
We are able to build machines which
can process so much
743
00:57:35,140 --> 00:57:38,100
that we can start simulating
biological phenomena,
744
00:57:38,100 --> 00:57:41,300
which is the key to making something
lifelike.
745
00:57:47,540 --> 00:57:51,340
If you love somebody, you've
presumably had all kinds
746
00:57:51,340 --> 00:57:54,060
of experiences with that person.
747
00:58:01,460 --> 00:58:04,500
You've had food with them, you've
laughed with them,
748
00:58:04,500 --> 00:58:07,060
you've done all these things, you've
been to all these places.
749
00:58:07,060 --> 00:58:10,820
Your memories of that person are so
rich.
750
00:58:23,700 --> 00:58:25,580
And the depth of your love
751
00:58:25,580 --> 00:58:28,980
will have to do with the depth of
those memories.
752
00:58:36,060 --> 00:58:41,260
In the metaverse, an immersive 3D
world, we will be interacting
753
00:58:41,260 --> 00:58:45,660
with other people or avatars which may
be autonomous
754
00:58:45,660 --> 00:58:49,260
and whose memories are actually
created
755
00:58:49,260 --> 00:58:51,820
through interaction with the world.
756
00:58:54,260 --> 00:58:57,700
If you spend enough time with the
virtual being and you've
757
00:58:57,700 --> 00:59:03,220
formed enough of a relationship, it's
a thing that becomes valuable.
758
00:59:06,020 --> 00:59:10,220
The longer and richer and deeper that
relationship is over time,
759
00:59:10,220 --> 00:59:11,820
the more that you care about it.
760
00:59:32,220 --> 00:59:36,660
- All of this, all of this is
happening in a capitalist society
761
00:59:36,660 --> 00:59:40,060
where capitalism is trying to say,
"We're not only making a new market,
762
00:59:40,060 --> 00:59:42,180
"we're making new humans."
763
00:59:42,180 --> 00:59:45,620
We can make money from making new
humans.
764
00:59:45,620 --> 00:59:47,140
And so the question is,
765
00:59:47,140 --> 00:59:49,700
will human beings say, "Wait, wait,
no."
766
00:59:49,700 --> 00:59:52,500
There really is no other path for
human beings.
767
00:59:52,500 --> 00:59:55,380
We are being offered something that is
thin gruel
768
00:59:55,380 --> 00:59:56,820
and that diminishes us.
769
01:05:01,940 --> 01:05:06,780
Artificial intelligence promises us
what religion does.
770
01:05:06,780 --> 01:05:10,980
You don't have to die, you can be
771
01:05:10,980 --> 01:05:14,620
somehow reborn someplace else in a
different form,
772
01:05:14,620 --> 01:05:19,900
and there's meaning, meaning in
technology,
773
01:05:19,900 --> 01:05:23,580
that people no longer feel in their
religious beliefs
774
01:05:23,580 --> 01:05:26,860
or in their relationships with other
people.
775
01:05:26,860 --> 01:05:29,820
- Death somehow will become... you'll
either upload yourself,
776
01:05:29,820 --> 01:05:33,580
or in the meantime you'll download
other people who already died.
777
01:05:34,820 --> 01:05:39,420
So it offers a lot that religion once
offered.
778
01:05:39,420 --> 01:05:42,940
Or still offers, but people are not as
drawn to it.
779
01:05:42,940 --> 01:05:47,500
So I think it is...has become a kind
of modern form of transcendence.
780
01:06:12,460 --> 01:06:18,540
- I am alive in a time where we can
virtually capture
781
01:06:18,540 --> 01:06:21,900
the elements that matter most, which
is a relationship.
782
01:06:25,620 --> 01:06:30,340
I want to be part of pushing human
society to the place of, like,
783
01:06:30,340 --> 01:06:33,060
well, just because your body isn't
here any more
784
01:06:33,060 --> 01:06:34,500
doesn't mean you're gone.
785
01:06:44,540 --> 01:06:46,580
100 years ago, when your heart
stopped,
786
01:06:46,580 --> 01:06:50,220
it was universally agreed
that you were just dead.
787
01:06:51,380 --> 01:06:53,820
And then medicine came in and said,
788
01:06:53,820 --> 01:06:56,500
"Hey, if you pump on somebody's chest,
789
01:06:56,500 --> 01:06:59,540
"and blow air into their lungs, you
can keep them alive."
790
01:07:02,140 --> 01:07:05,620
Fucking everything we think we know
changes every 20, 30 years.
791
01:07:11,620 --> 01:07:17,660
Our heads are literally in the sand
about this looming fate
792
01:07:17,660 --> 01:07:19,820
that everybody in the world has to
deal with.
793
01:07:23,820 --> 01:07:26,380
Fuck death and the hyperbole about,
794
01:07:26,380 --> 01:07:29,460
oh, it's nature and you have to deal
with it.
795
01:07:32,060 --> 01:07:35,340
It's clearly not a natural thing that
everybody wants to happen.
796
01:07:35,340 --> 01:07:39,820
It is a current reality, and I believe
we can change that reality.
797
01:10:19,780 --> 01:10:24,780
- TEARFUL SPEECH, CHILD TALKING
798
01:15:48,380 --> 01:15:51,980
- When I first heard about this case
in Korea,
799
01:15:51,980 --> 01:15:56,980
I looked with horror upon the advent
of this kind of technology.
800
01:15:56,980 --> 01:16:01,940
It's able to hijack the things that we
love the most.
801
01:16:01,940 --> 01:16:07,020
I don't know any driving force that is
more important to me
802
01:16:07,020 --> 01:16:11,060
than the force to protect or be with
my children.
803
01:16:11,060 --> 01:16:14,300
I would give up my life to have that
last moment.
804
01:16:17,980 --> 01:16:22,580
Let's say the child is like, "Mom, you
can't cancel this service.
805
01:16:22,580 --> 01:16:25,980
"I'll die... It's going to be like me
dying once again."
806
01:16:25,980 --> 01:16:30,020
That product is both a product
807
01:16:30,020 --> 01:16:32,180
and the perfect salesman for that
product.
808
01:16:32,180 --> 01:16:36,300
Because it's almost taking your memory
of the loved one hostage
809
01:16:36,300 --> 01:16:39,340
and then making it sort of sell that
service back to you,
810
01:16:39,340 --> 01:16:43,460
putting a moral obligation on
continuing to chat with the service
811
01:16:43,460 --> 01:16:47,020
or continuing to visit their online
memorial
812
01:16:47,020 --> 01:16:48,500
or whatever it is.
813
01:19:55,900 --> 01:19:59,060
- Very quickly, we won't see this as
creepy.
814
01:19:59,060 --> 01:20:03,980
Very quickly, we may see this as
comfort.
815
01:20:03,980 --> 01:20:09,540
But really what is it that we're doing
to ourselves
816
01:20:09,540 --> 01:20:12,580
when we accept this comfort?
817
01:20:14,500 --> 01:20:18,380
I want to sort of respect the human
creativity and imagination
818
01:20:18,380 --> 01:20:21,580
to create new rituals of remembrance,
819
01:20:21,580 --> 01:20:27,380
new rituals of loss around the
artistry of the virtual.
820
01:20:29,180 --> 01:20:32,380
But we have to keep it in check.
821
01:20:32,380 --> 01:20:34,820
It's how to lose them better.
822
01:20:35,740 --> 01:20:38,340
Not how to pretend they're still here.
823
01:21:50,940 --> 01:21:53,940
- It's odd because I almost have a
change of heart now.
824
01:21:53,940 --> 01:21:58,460
It's like, well, maybe I will check in
with you,
825
01:21:58,460 --> 01:22:01,980
here and there, cos I feel like...
826
01:22:01,980 --> 01:22:06,780
..I would, I would like to know it
turns out really, really well.
827
01:22:06,780 --> 01:22:09,780
That he adjusted. That he's OK.
828
01:22:18,700 --> 01:22:21,620
But I think that kind of brings to
mind, like, we don't know
829
01:22:21,620 --> 01:22:23,620
what happens after we die.
830
01:22:23,620 --> 01:22:28,940
We want things to be perfect, better,
831
01:22:28,940 --> 01:22:32,660
and we don't even know if that's the
truth, because we don't know
832
01:22:32,660 --> 01:22:37,180
about the other side, so it's
just...what you think.
833
01:22:39,140 --> 01:22:41,620
And in this case, the words that a
computer tells you
834
01:22:41,620 --> 01:22:43,140
that can heal the place...
835
01:22:49,300 --> 01:22:52,220
And it can heal a place.