1
00:00:00,566 --> 00:00:02,766
(bright music)
2
00:00:04,766 --> 00:00:07,533
- A man checks his smartwatch.
- Our modern,
3
00:00:07,533 --> 00:00:10,766
digital lives offer limitless
convenience, community,
4
00:00:10,766 --> 00:00:14,100
and easy-to-access everything.
5
00:00:14,100 --> 00:00:16,766
- [Speaker] The internet opens
unbelievable opportunities.
6
00:00:18,500 --> 00:00:21,633
- [Host] But there's a
trade-off. Your personal data.
7
00:00:21,633 --> 00:00:23,266
- [Speaker] Pretty much
everything you do on the web
8
00:00:23,266 --> 00:00:25,066
is being tracked and logged.
9
00:00:25,066 --> 00:00:26,833
- Who you are, what you're
doing, and where you're going.
10
00:00:26,833 --> 00:00:28,400
- This is the mothership.
11
00:00:28,400 --> 00:00:31,600
Can you show me what I've
inadvertently given up?
12
00:00:31,600 --> 00:00:34,833
And that personal data
is worth a fortune.
13
00:00:34,833 --> 00:00:37,166
- People expect it to
grow to $400 billion.
14
00:00:38,300 --> 00:00:40,400
- To completely
avoid data collection
15
00:00:41,466 --> 00:00:45,066
requires you to
go live in a cave.
16
00:00:45,066 --> 00:00:46,666
- [Host] Thankfully,
there are tools
17
00:00:46,666 --> 00:00:50,433
to help maintain your
privacy and security online.
18
00:00:50,433 --> 00:00:54,466
- Let's create some strong
and unique passwords.
19
00:00:54,466 --> 00:00:56,500
- Everything has
privacy settings.
20
00:00:56,500 --> 00:00:59,600
We can dial to whatever
level we feel comfortable.
21
00:00:59,600 --> 00:01:03,200
- [Host] And some coders are
rewriting the rules of the web.
22
00:01:03,200 --> 00:01:05,366
- The decentralized
web is our antidote
23
00:01:05,366 --> 00:01:07,466
to the web as we
now think of it.
24
00:01:07,466 --> 00:01:09,533
- New technologies
can proliferate
25
00:01:09,533 --> 00:01:11,800
without centralized control.
26
00:01:11,800 --> 00:01:14,600
- This is about technological
self-determination.
27
00:01:14,600 --> 00:01:19,366
- [Host] "Secrets in Your
Data," right now on "NOVA."
28
00:01:19,366 --> 00:01:48,133
(triumphant upbeat music)
29
00:01:49,200 --> 00:01:51,666
(phone camera clicking)
30
00:01:51,666 --> 00:01:54,166
PATEL:
Our world runs on data.
31
00:01:54,166 --> 00:01:56,133
When you're shopping online,
32
00:01:56,133 --> 00:01:58,100
streaming
your favorite shows...
33
00:01:58,100 --> 00:01:59,166
(mouse clicking)
34
00:01:59,166 --> 00:02:00,666
posting family pics--
35
00:02:00,666 --> 00:02:02,200
that's all your personal data
36
00:02:02,200 --> 00:02:04,133
leaving your device.
37
00:02:04,133 --> 00:02:06,600
(on phone):
What's up, everyone?
38
00:02:06,600 --> 00:02:07,800
I'm about to head to
the hospital
39
00:02:07,800 --> 00:02:08,800
for a night shift...
40
00:02:08,800 --> 00:02:11,166
(voiceover):
I'm Alok Patel, a pediatrician,
41
00:02:11,166 --> 00:02:12,033
medical journalist,
42
00:02:12,033 --> 00:02:15,266
and all-around
social media enthusiast.
43
00:02:15,266 --> 00:02:16,566
(on phone):
Just got here at the
children's hospital...
44
00:02:16,566 --> 00:02:18,233
(voiceover):
I kinda love our
connected lives,
45
00:02:18,233 --> 00:02:20,133
and I haven't been shy at all
46
00:02:20,133 --> 00:02:23,333
about sharing mine
pretty publicly.
47
00:02:23,333 --> 00:02:25,133
(on phone):
30 million children...
48
00:02:25,133 --> 00:02:27,200
(voiceover):
And I know that sharing
personal data
49
00:02:27,200 --> 00:02:28,233
can have major upsides.
50
00:02:28,233 --> 00:02:29,566
The media's reported
51
00:02:29,566 --> 00:02:30,666
on some of
the most dramatic examples.
52
00:02:30,666 --> 00:02:34,566
NARRATOR (archival):
...is conducting several studies
53
00:02:34,566 --> 00:02:37,133
to see how far wearables can go
in detecting disease.
54
00:02:37,133 --> 00:02:38,833
A California man,
rescued yesterday
55
00:02:38,833 --> 00:02:41,066
after going missing
credits a social media post
56
00:02:41,066 --> 00:02:42,400
for helping save his life.
57
00:02:42,400 --> 00:02:44,366
Six out of ten people
with Alzheimer's or dementia
58
00:02:44,366 --> 00:02:45,633
will wander away from home,
59
00:02:45,633 --> 00:02:48,000
and that's where
this little box comes in.
60
00:02:48,900 --> 00:02:51,700
PATEL:
There's no doubt,
our data is out there,
61
00:02:51,700 --> 00:02:54,200
but did you ever
wonder where it goes
62
00:02:54,200 --> 00:02:56,166
and how it's used?
63
00:02:56,166 --> 00:02:57,233
I do.
64
00:02:57,233 --> 00:02:59,833
I want to know: what are the
benefits of sharing
65
00:02:59,833 --> 00:03:01,666
and the risks?
66
00:03:01,666 --> 00:03:05,200
And what can we do to
make our connected world safer
67
00:03:05,200 --> 00:03:08,100
and just generally better
for everyone?
68
00:03:12,466 --> 00:03:14,700
First, the basics.
69
00:03:14,700 --> 00:03:16,200
I want to know:
70
00:03:16,200 --> 00:03:19,500
how much personal data
have I been sharing?
71
00:03:19,500 --> 00:03:22,633
Meet Hayley Tsukayama,
tech journalist
72
00:03:22,633 --> 00:03:24,466
and data privacy advocate.
73
00:03:24,466 --> 00:03:26,300
(chuckles)
PATEL:
You must be Hayley.
74
00:03:26,300 --> 00:03:27,633
Hi, nice to meet you.
75
00:03:27,633 --> 00:03:29,766
What is going on?
76
00:03:29,766 --> 00:03:32,133
(voiceover):
Okay, hold on,
wasn't expecting this.
77
00:03:32,133 --> 00:03:35,533
She's already got
my whole life up on the screen.
78
00:03:35,533 --> 00:03:38,533
So this is a data report
gathered about you
79
00:03:38,533 --> 00:03:40,766
from social media posts,
80
00:03:40,766 --> 00:03:42,666
from the images of you
that may be on the internet,
81
00:03:42,666 --> 00:03:45,166
from publicly available
information.
82
00:03:45,166 --> 00:03:47,166
This is like
a family business.
83
00:03:47,166 --> 00:03:48,666
That's my dad's
cell phone number.
84
00:03:48,666 --> 00:03:50,500
Why is this available?
85
00:03:50,500 --> 00:03:52,700
(groaning):
Oh, this is my address!
86
00:03:52,700 --> 00:03:54,066
This is just available?
87
00:03:54,066 --> 00:03:55,200
Just available online.
88
00:03:55,200 --> 00:03:56,666
Okay,
you guys are going to
89
00:03:56,666 --> 00:03:58,600
blur all this, right?
Please?
90
00:03:58,600 --> 00:04:01,166
There's-- oh, okay, cool,
there's my cell phone number.
91
00:04:01,166 --> 00:04:03,833
Oh, this is concerning.
92
00:04:03,833 --> 00:04:06,633
As a medical doctor,
93
00:04:06,633 --> 00:04:09,100
we all have
license numbers
94
00:04:09,100 --> 00:04:11,333
and issue dates
and expiration dates.
95
00:04:11,333 --> 00:04:13,166
(chuckling):
And it's literally right here.
96
00:04:13,166 --> 00:04:15,066
This is so weird,
97
00:04:15,066 --> 00:04:16,500
these are the names
of my neighbors
98
00:04:16,500 --> 00:04:18,033
from my childhood home.
99
00:04:18,966 --> 00:04:21,200
(whispering):
Why do you have
this information?
100
00:04:21,200 --> 00:04:23,600
Like, I haven't
seen these names in...
101
00:04:23,600 --> 00:04:25,100
years.
102
00:04:25,100 --> 00:04:26,033
(voiceover):
Not gonna lie,
103
00:04:26,033 --> 00:04:28,700
I expected some of
my data to be out there,
104
00:04:28,700 --> 00:04:29,700
but this is...
105
00:04:29,700 --> 00:04:31,833
this is just creepy.
106
00:04:31,833 --> 00:04:33,366
How is there a report
107
00:04:33,366 --> 00:04:36,466
about who my neighbors were
in the '80s?
108
00:04:36,466 --> 00:04:38,333
LEVY:
You can be assured that
pretty much
109
00:04:38,333 --> 00:04:39,600
everything you do on the web
110
00:04:39,600 --> 00:04:40,600
and with a browser
111
00:04:40,600 --> 00:04:42,266
is being tracked and logged.
112
00:04:42,266 --> 00:04:44,566
Where you go, what you look at,
113
00:04:44,566 --> 00:04:46,066
what apps you use.
114
00:04:46,066 --> 00:04:48,700
SAFIYA NOBLE:
Modern life has been arranged
115
00:04:48,700 --> 00:04:51,133
so that every aspect of
116
00:04:51,133 --> 00:04:52,633
what we do and how we live
can be captured.
117
00:04:52,633 --> 00:04:55,833
MEREDITH WHITTAKER:
They pick up our faces
as we walk down the street.
118
00:04:55,833 --> 00:05:00,333
They track our keystrokes
while we're at work
119
00:05:00,333 --> 00:05:02,133
on behalf of our employer.
120
00:05:02,133 --> 00:05:03,033
(keys clacking)
121
00:05:03,033 --> 00:05:04,766
GALPERIN:
And ad networks use a bunch of
122
00:05:04,766 --> 00:05:07,200
creepy and nefarious ways
of figuring out
123
00:05:07,200 --> 00:05:09,000
who you are, what you're doing,
and where you're going.
124
00:05:09,900 --> 00:05:12,233
PATEL:
What happened to the web
I grew up with?
125
00:05:12,233 --> 00:05:15,166
♪ ♪
126
00:05:15,166 --> 00:05:18,200
I came of age when the internet
was a fun and weird party,
127
00:05:18,200 --> 00:05:19,233
and everyone was invited.
128
00:05:19,233 --> 00:05:21,300
Remember GeoCities,
129
00:05:21,300 --> 00:05:23,200
AOL Instant Messenger,
130
00:05:23,200 --> 00:05:26,100
or the dancing "ooga chacka,
ooga chacka" baby?
131
00:05:26,100 --> 00:05:27,600
♪ ♪
132
00:05:27,600 --> 00:05:29,466
Today, it's clear
that innocence is gone.
133
00:05:30,700 --> 00:05:31,733
Is the party over?
134
00:05:34,233 --> 00:05:38,700
To understand the present,
I want a refresher on the past.
135
00:05:38,700 --> 00:05:40,166
(machinery whirring)
136
00:05:40,166 --> 00:05:42,633
So I'm visiting
the Computer History Museum
137
00:05:42,633 --> 00:05:44,200
in Silicon Valley.
138
00:05:44,200 --> 00:05:45,733
(video game music,
Patel chuckling)
139
00:05:45,733 --> 00:05:48,433
MARC WEBER:
Okay, I'm headed
straight for the sun...
140
00:05:48,433 --> 00:05:50,833
PATEL:
I'm meeting
computer historian Marc Weber.
141
00:05:50,833 --> 00:05:52,800
What a maneuver!
There's a straight line
142
00:05:52,800 --> 00:05:54,333
of bullets coming
right at you--
143
00:05:54,333 --> 00:05:55,500
(exploding sound effect,
Patel groans)
144
00:05:55,500 --> 00:05:56,733
(voiceover):
...Who just wasted me
145
00:05:56,733 --> 00:05:59,200
in the cosmic battlefield of
"Space War,"
146
00:05:59,200 --> 00:06:00,633
one of the first video games.
147
00:06:00,633 --> 00:06:03,100
(laughing)
This is actually really fun.
148
00:06:03,100 --> 00:06:04,666
♪ ♪
149
00:06:04,666 --> 00:06:06,733
This is cool.
150
00:06:06,733 --> 00:06:10,200
ICBM computer from
a nuclear missile.
151
00:06:10,200 --> 00:06:11,333
Whoa.
152
00:06:11,333 --> 00:06:13,133
Human-computer interaction.
153
00:06:13,133 --> 00:06:15,533
PATEL:
Now, this looks fascinating.
154
00:06:15,533 --> 00:06:16,700
I can't tell if this is like,
155
00:06:16,700 --> 00:06:19,600
this universal clock
apparatus,
156
00:06:19,600 --> 00:06:21,133
or what's happening here?
157
00:06:21,133 --> 00:06:24,533
This is the first
punch card machine.
158
00:06:24,533 --> 00:06:27,133
And this was an invention
159
00:06:27,133 --> 00:06:29,766
to make the U.S. census
faster.
160
00:06:29,766 --> 00:06:30,800
PATEL (voiceover):
It's kind of wild,
161
00:06:30,800 --> 00:06:34,200
but the dawn of automated
personal data collection
162
00:06:34,200 --> 00:06:36,066
looks like this--
163
00:06:36,066 --> 00:06:39,366
census info poked into
humble little punch cards.
164
00:06:39,366 --> 00:06:42,166
Marc, can you explain
punch cards to me?
165
00:06:42,166 --> 00:06:43,200
Because when
I think punch cards,
166
00:06:43,200 --> 00:06:45,600
I'm thinking
lottery tickets,
167
00:06:45,600 --> 00:06:49,200
or "buy nine cups of coffee,
and your tenth is free."
168
00:06:49,200 --> 00:06:54,066
Well, it was the first way to
record machine-readable data.
169
00:06:54,066 --> 00:06:58,100
PATEL:
Okay, let's make sense
of the census.
170
00:06:58,100 --> 00:07:00,466
This kind of data collection
171
00:07:00,466 --> 00:07:02,166
has been going on
for centuries.
172
00:07:02,166 --> 00:07:04,566
But starting in 1890,
173
00:07:04,566 --> 00:07:06,666
data was recorded
in a pattern of holes
174
00:07:06,666 --> 00:07:08,833
"punched" into a census card.
175
00:07:08,833 --> 00:07:11,500
So the cards were used to
collect census data.
176
00:07:11,500 --> 00:07:14,200
What exact questions
were on there?
177
00:07:14,200 --> 00:07:18,033
So things like age, gender,
number of children.
178
00:07:18,900 --> 00:07:20,833
PATEL:
When a card passes through
the machine,
179
00:07:20,833 --> 00:07:24,166
the holes allow pins to
make electrical connections
180
00:07:24,166 --> 00:07:25,166
through the card.
181
00:07:25,166 --> 00:07:27,533
A counter keeps a running tally
182
00:07:27,533 --> 00:07:28,533
in each census category
183
00:07:28,533 --> 00:07:31,133
as more cards pass through.
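(Editor's aside: to make the tallying idea concrete, here is a minimal Python sketch. The hole positions and category names are invented for illustration; they are not the actual 1890 Hollerith card layout.)

```python
# Illustrative sketch: each "card" is a set of punched hole positions,
# and each position stands for one answer in a census category.
# The field names and positions below are made up for demonstration.
from collections import Counter

HOLE_MEANINGS = {
    0: ("gender", "male"),
    1: ("gender", "female"),
    2: ("age_band", "under 21"),
    3: ("age_band", "21 or over"),
    4: ("children", "none"),
    5: ("children", "one or more"),
}

def tabulate(cards):
    """Pass each card 'through the machine': every punched hole
    closes a circuit and bumps the counter for its category."""
    tallies = Counter()
    for holes in cards:
        for position in holes:
            tallies[HOLE_MEANINGS[position]] += 1
    return tallies

cards = [{0, 3, 5}, {1, 3, 4}, {1, 2, 4}]   # three punched cards
for (category, answer), count in sorted(tabulate(cards).items()):
    print(f"{category:>8}: {answer:<11} {count}")
```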
184
00:07:31,133 --> 00:07:33,233
Census data
is how the government
185
00:07:33,233 --> 00:07:35,633
allocates resources
for schools and hospitals;
186
00:07:35,633 --> 00:07:38,166
draws the lines for districts;
187
00:07:38,166 --> 00:07:40,100
and decides
how many representatives
188
00:07:40,100 --> 00:07:42,100
each state will send to
Congress.
189
00:07:42,100 --> 00:07:46,366
It's a reminder of the value of
collecting data,
190
00:07:46,366 --> 00:07:47,766
but it's a far cry from
191
00:07:47,766 --> 00:07:49,466
having my every move
tracked on the internet.
192
00:07:51,400 --> 00:07:53,100
To help connect the dots,
193
00:07:53,100 --> 00:07:54,833
Marc shows me
the next surprising step
194
00:07:54,833 --> 00:07:57,533
in the evolution
of data collection.
195
00:07:57,533 --> 00:08:00,200
♪ ♪
196
00:08:00,200 --> 00:08:01,333
I look at this
and I'm almost like,
197
00:08:01,333 --> 00:08:03,700
"Oh, this looks like a
modern office building to me."
198
00:08:03,700 --> 00:08:06,066
WEBER:
Exactly. Each one of those
199
00:08:06,066 --> 00:08:08,733
is one of these
giant terminals here.
200
00:08:08,733 --> 00:08:10,066
(static hissing)
201
00:08:10,066 --> 00:08:12,133
PATEL:
During the Cold War,
202
00:08:12,133 --> 00:08:13,833
IBM figured out
how to hook up computers
203
00:08:13,833 --> 00:08:16,066
from all across the country
to monitor
204
00:08:16,066 --> 00:08:18,366
U.S. airspace in real time.
205
00:08:18,366 --> 00:08:20,133
(rocket blasting off)
206
00:08:20,133 --> 00:08:24,433
So, this is a terminal
from SAGE.
207
00:08:24,433 --> 00:08:28,000
(beeping, static)
208
00:08:29,600 --> 00:08:33,166
PATEL:
It was the first real-time
networked data system.
209
00:08:33,166 --> 00:08:35,366
Real-time is just that:
210
00:08:35,366 --> 00:08:36,600
instantaneous.
211
00:08:36,600 --> 00:08:40,066
And "networked" means connected.
212
00:08:40,066 --> 00:08:41,633
Computers from all across
the country
213
00:08:41,633 --> 00:08:42,833
were hooked up to each other,
214
00:08:42,833 --> 00:08:45,133
so they could share data
in real time,
215
00:08:45,133 --> 00:08:48,000
sort of like the internet.
216
00:08:48,900 --> 00:08:53,266
WEBER:
The whole point was to track
incoming Soviet bombers,
217
00:08:53,266 --> 00:08:55,633
and they built, arguably,
218
00:08:55,633 --> 00:08:58,566
the first computer network
in the world.
219
00:08:58,566 --> 00:09:00,133
PATEL:
IBM soon realized
220
00:09:00,133 --> 00:09:02,666
that their
aircraft-tracking computers
221
00:09:02,666 --> 00:09:04,500
could be used commercially.
222
00:09:04,500 --> 00:09:08,200
♪ ♪
223
00:09:08,200 --> 00:09:11,200
So in 1960,
along came SABRE.
224
00:09:11,200 --> 00:09:13,733
(beeping, static)
225
00:09:13,733 --> 00:09:15,800
A real-time networked
data system
226
00:09:15,800 --> 00:09:19,200
that shared personal data.
227
00:09:19,200 --> 00:09:21,366
WEBER:
The story goes that
228
00:09:21,366 --> 00:09:24,833
the head of American Airlines
met with IBM,
229
00:09:24,833 --> 00:09:27,200
and they decided to do
a partnership,
230
00:09:27,200 --> 00:09:29,700
the key idea being real time.
231
00:09:29,700 --> 00:09:32,833
They could look at
your age, your name,
232
00:09:32,833 --> 00:09:34,533
financial information,
233
00:09:34,533 --> 00:09:35,833
what flights
you'd been on before,
234
00:09:35,833 --> 00:09:39,133
your dietary preferences
for your meal,
235
00:09:39,133 --> 00:09:41,666
all of this right then
in real time.
236
00:09:41,666 --> 00:09:44,666
PATEL:
We went from tracking bombs to
237
00:09:44,666 --> 00:09:45,666
tracking butts on planes.
238
00:09:45,666 --> 00:09:46,733
WEBER:
Exactly.
239
00:09:48,666 --> 00:09:50,733
PATEL (voiceover):
Between "Pong," punch cards,
240
00:09:50,733 --> 00:09:52,200
and planes...
Whoa, whoa, whoa!
241
00:09:52,200 --> 00:09:55,266
PATEL (voiceover):
I could see how data started
connecting the world--
242
00:09:55,266 --> 00:09:57,400
the military to our airspace
243
00:09:57,400 --> 00:09:59,466
and companies
to their customers.
244
00:10:00,333 --> 00:10:02,766
But what if those customers
all over the world
245
00:10:02,766 --> 00:10:05,166
were linked up to each other?
246
00:10:05,166 --> 00:10:06,833
What would we even call that?
247
00:10:06,833 --> 00:10:10,300
Oh yeah,
the World Wide Web.
248
00:10:10,300 --> 00:10:12,500
PRESENTER:
The World Wide Web
is a part of our lives
249
00:10:12,500 --> 00:10:14,766
and getting bigger every day.
250
00:10:14,766 --> 00:10:17,300
But what about the man
who dreamt it all up?
251
00:10:17,300 --> 00:10:19,733
PATEL:
News reports opened our eyes
252
00:10:19,733 --> 00:10:21,633
to this innovation
and its creator,
253
00:10:21,633 --> 00:10:23,533
Tim Berners-Lee.
254
00:10:23,533 --> 00:10:24,800
PRESENTER:
He simply wanted
255
00:10:24,800 --> 00:10:27,366
to give it away
for the benefit of others.
256
00:10:28,133 --> 00:10:29,600
TIM BERNERS-LEE:
When the web started,
257
00:10:29,600 --> 00:10:32,166
anybody could make a website.
258
00:10:32,166 --> 00:10:34,200
You could make your own style,
259
00:10:34,200 --> 00:10:36,200
your own poems, your own blogs,
260
00:10:36,200 --> 00:10:37,633
and you could link to
other people.
261
00:10:37,633 --> 00:10:41,633
The feeling and the sense
was of tremendous empowerment.
262
00:10:41,633 --> 00:10:43,200
(modem dialing up)
263
00:10:43,200 --> 00:10:45,366
NOBLE:
Some of us remember
the internet
264
00:10:45,366 --> 00:10:48,366
when it was managed
and curated by people.
265
00:10:48,366 --> 00:10:49,800
Where it was, you know,
266
00:10:49,800 --> 00:10:51,666
bulletin board systems
and chat rooms.
267
00:10:51,666 --> 00:10:54,166
♪ ♪
268
00:10:54,166 --> 00:10:55,633
PATEL:
In the late '90s,
small start-ups
269
00:10:55,633 --> 00:10:58,200
were maturing into
big tech companies--
270
00:10:58,200 --> 00:11:01,766
with lots of users
and user data.
271
00:11:02,833 --> 00:11:05,066
KAHLE:
As the web grew,
272
00:11:05,066 --> 00:11:06,533
we ended up with
a very few companies
273
00:11:06,533 --> 00:11:08,200
going and doing
all of the hosting.
274
00:11:08,200 --> 00:11:11,133
It wasn't your actual
personal computer doing it.
275
00:11:11,133 --> 00:11:15,800
NOBLE:
The large platforms
that have emerged are dependent
276
00:11:15,800 --> 00:11:18,233
upon user-generated content
277
00:11:18,233 --> 00:11:19,800
and user engagement.
278
00:11:19,800 --> 00:11:24,400
The modern internet is a wholly
commercial kind of internet.
279
00:11:24,400 --> 00:11:27,700
PATEL:
And as web traffic started
to flow more and more
280
00:11:27,700 --> 00:11:29,533
through just a few companies,
281
00:11:29,533 --> 00:11:32,566
they began to realize the value
of all this personal data
282
00:11:32,566 --> 00:11:35,066
piling up on their servers.
283
00:11:35,066 --> 00:11:37,433
ALEKSANDRA KOROLOVA:
Google started to understand
284
00:11:37,433 --> 00:11:40,166
the power
of individuals' data.
285
00:11:41,100 --> 00:11:43,200
They have information about
286
00:11:43,200 --> 00:11:44,333
not just how the websites
287
00:11:44,333 --> 00:11:45,600
are interconnected,
288
00:11:45,600 --> 00:11:49,166
but also information
about the individuals.
289
00:11:49,166 --> 00:11:51,100
HILL:
The big technology companies.
290
00:11:51,100 --> 00:11:53,233
They're looking at
where you are.
291
00:11:53,233 --> 00:11:54,600
They're wanting to know
what your gender is,
292
00:11:54,600 --> 00:11:55,733
what your age is,
293
00:11:55,733 --> 00:11:57,400
you know,
how much you have to spend,
294
00:11:57,400 --> 00:12:00,200
what your political beliefs are.
295
00:12:00,200 --> 00:12:02,300
They have a lot of power
to determine
296
00:12:02,300 --> 00:12:03,633
what our privacy is.
297
00:12:03,633 --> 00:12:06,100
RASKAR:
That's trillions of dollars
298
00:12:06,100 --> 00:12:08,166
of economic value
that's out there,
299
00:12:08,166 --> 00:12:09,533
and there's a constant struggle
300
00:12:09,533 --> 00:12:11,100
over who owns the data.
301
00:12:11,100 --> 00:12:12,100
We create the data,
302
00:12:12,100 --> 00:12:14,166
but somebody else monetizes it.
303
00:12:15,800 --> 00:12:17,633
PATEL:
As big tech was maturing,
304
00:12:17,633 --> 00:12:20,400
early data collection on
websites was pretty limited.
305
00:12:21,433 --> 00:12:23,400
So what happened
to supercharge it?
306
00:12:23,400 --> 00:12:26,066
To create the vast
tracking infrastructure
307
00:12:26,066 --> 00:12:29,100
that allowed Hayley
to find out so much about me?
308
00:12:29,100 --> 00:12:31,600
It turns out things
really got going
309
00:12:31,600 --> 00:12:33,766
when someone invented...
310
00:12:33,766 --> 00:12:35,100
Cookies?
311
00:12:36,000 --> 00:12:38,633
A cookie is not
actually a tasty snack.
312
00:12:38,633 --> 00:12:40,766
When you go to a website,
313
00:12:40,766 --> 00:12:43,833
the website sends
314
00:12:43,833 --> 00:12:45,700
a small file to your browser.
315
00:12:45,700 --> 00:12:46,733
(keys clacking)
316
00:12:46,733 --> 00:12:49,066
PATEL:
That small file
allows the websites
317
00:12:49,066 --> 00:12:50,766
to collect information
about your visit,
318
00:12:50,766 --> 00:12:53,733
and update it every time
you come back.
319
00:12:53,733 --> 00:12:55,833
You might have
heard of cookies
320
00:12:55,833 --> 00:12:57,266
because a lot of websites
321
00:12:57,266 --> 00:12:58,733
pop up with messages
prompting you
322
00:12:58,733 --> 00:13:00,300
to accept theirs.
323
00:13:00,300 --> 00:13:04,300
But what happens
after you hit "accept"?
324
00:13:04,300 --> 00:13:06,133
GALPERIN:
It then uses cookies
325
00:13:06,133 --> 00:13:08,700
to track who you are
and what you're doing,
326
00:13:08,700 --> 00:13:10,233
including in many cases,
327
00:13:10,233 --> 00:13:12,366
all of the other websites
that you go to.
328
00:13:12,366 --> 00:13:15,266
RUMMAN CHOWDHURY:
Cookies have the cutest name,
329
00:13:15,266 --> 00:13:17,800
but they're actually
a really important
330
00:13:17,800 --> 00:13:20,533
piece of information
about you.
331
00:13:20,533 --> 00:13:22,133
Every time you click on a menu,
332
00:13:22,133 --> 00:13:23,333
every time you look at an item
333
00:13:23,333 --> 00:13:24,666
or put it in your cart,
334
00:13:24,666 --> 00:13:27,166
every time you
want to buy something
335
00:13:27,166 --> 00:13:29,666
and click out of the website
and come back in.
336
00:13:29,666 --> 00:13:31,300
A cookie is what's
enabling you to do that.
337
00:13:31,300 --> 00:13:33,166
It's a piece of
tracking information.
338
00:13:33,166 --> 00:13:36,400
You can impute data
and information about you
339
00:13:36,400 --> 00:13:39,366
from the seemingly
innocent behavior online
340
00:13:39,366 --> 00:13:41,133
that's tracked via cookies.
341
00:13:41,133 --> 00:13:43,566
Do you have certain
lifestyle conditions?
342
00:13:43,566 --> 00:13:45,200
What age are you?
343
00:13:45,200 --> 00:13:46,366
What gender or race are you?
344
00:13:46,366 --> 00:13:48,300
Do you have children?
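(Editor's aside: as a rough illustration of the mechanism described above, here is a minimal Python sketch, using only the standard library, of the cookie round trip: the site hands the browser a small identifier on the first visit, and the browser sends it back on every later visit. The cookie name and lifetime here are hypothetical.)

```python
# Minimal sketch of a cookie round trip (names and values are made up).
from http.cookies import SimpleCookie
import uuid

# First visit: the site's HTTP response includes a Set-Cookie header.
response_cookie = SimpleCookie()
response_cookie["visitor_id"] = uuid.uuid4().hex                 # hypothetical ID cookie
response_cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365   # keep it for a year
print(response_cookie.output())  # -> Set-Cookie: visitor_id=...; Max-Age=31536000

# Every later visit: the browser echoes the value back in its Cookie header,
# which is what lets the site recognize the same visitor and update its record.
request_cookie = SimpleCookie("visitor_id=" + response_cookie["visitor_id"].value)
print("returning visitor:", request_cookie["visitor_id"].value)
```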
345
00:13:50,200 --> 00:13:52,066
PATEL:
Okay, so I'm starting to get
346
00:13:52,066 --> 00:13:53,666
that these tech companies
347
00:13:53,666 --> 00:13:55,400
use things like cookies
to keep track of
348
00:13:55,400 --> 00:13:56,833
what sites I'm on,
what I'm shopping for,
349
00:13:56,833 --> 00:13:58,200
and what I'm watching.
350
00:13:58,200 --> 00:14:01,066
♪ ♪
351
00:14:01,066 --> 00:14:02,566
(doorbell chimes)
352
00:14:02,566 --> 00:14:03,733
But do they have other ways
353
00:14:03,733 --> 00:14:05,566
of learning the secrets
in my data?
354
00:14:06,566 --> 00:14:08,200
To learn more,
355
00:14:08,200 --> 00:14:10,100
I'm tracking down
a data security expert
356
00:14:10,100 --> 00:14:11,500
in his hacking
high-tech superhero lair...
357
00:14:11,500 --> 00:14:13,800
naturally.
358
00:14:13,800 --> 00:14:17,100
(door rolling open)
359
00:14:17,100 --> 00:14:19,100
Are you Patrick Jackson?
My name's Alok.
360
00:14:19,100 --> 00:14:20,100
Yes, I am.
361
00:14:20,100 --> 00:14:21,333
Awesome,
I'm in the right place.
362
00:14:21,333 --> 00:14:23,733
(door rolls shut)
363
00:14:23,733 --> 00:14:25,333
Your phone is sending
this extra data
364
00:14:25,333 --> 00:14:26,333
to these people that
365
00:14:26,333 --> 00:14:27,566
you don't know exist.
366
00:14:27,566 --> 00:14:29,166
PATEL:
Patrick Jackson has been
367
00:14:29,166 --> 00:14:31,166
all over the news for
his work as
368
00:14:31,166 --> 00:14:33,066
a former NSA research scientist,
369
00:14:33,066 --> 00:14:34,633
and is now chief tech officer
370
00:14:34,633 --> 00:14:37,200
for the internet security firm
Disconnect.
371
00:14:37,200 --> 00:14:39,166
And if anyone can help me
372
00:14:39,166 --> 00:14:41,833
understand how
my personal data is leaking,
373
00:14:41,833 --> 00:14:43,533
it's him.
374
00:14:43,533 --> 00:14:45,833
Patrick,
when I think about
375
00:14:45,833 --> 00:14:50,433
the main control of my
work, life, play,
376
00:14:50,433 --> 00:14:52,400
what I'm doing at home,
when I'm on the road,
377
00:14:52,400 --> 00:14:53,666
this is the mothership.
378
00:14:53,666 --> 00:14:56,133
If I were to
relinquish control
379
00:14:56,133 --> 00:14:58,100
of this precious device
for a few moments,
380
00:14:58,100 --> 00:15:00,100
can you show me, like,
381
00:15:00,100 --> 00:15:01,400
what I've
inadvertently given up?
382
00:15:01,400 --> 00:15:02,566
Yes, yeah.
383
00:15:02,566 --> 00:15:04,200
I can show you what
these data companies
384
00:15:04,200 --> 00:15:06,100
know about you and
what they're collecting.
385
00:15:06,900 --> 00:15:08,666
PATEL (voiceover):
Using a hacking technique
386
00:15:08,666 --> 00:15:10,066
called a
"man-in-the-middle attack,"
387
00:15:10,066 --> 00:15:12,633
Patrick can intercept
the personal data
388
00:15:12,633 --> 00:15:14,233
that's leaving my device,
389
00:15:14,233 --> 00:15:17,100
essentially eavesdropping
on a conversation
390
00:15:17,100 --> 00:15:18,600
between me and the websites
391
00:15:18,600 --> 00:15:21,200
and applications I use.
392
00:15:21,200 --> 00:15:22,566
JACKSON:
Contact information,
393
00:15:22,566 --> 00:15:24,466
name, phone number,
email address.
394
00:15:24,466 --> 00:15:26,600
When you agree to allow them
to have your location,
395
00:15:26,600 --> 00:15:30,100
I can see not only where
you're at on a map,
396
00:15:30,100 --> 00:15:32,133
but also where you're at
in your house,
397
00:15:32,133 --> 00:15:33,666
whether you're
at the back of the house
398
00:15:33,666 --> 00:15:35,366
or the front of the house.
399
00:15:35,366 --> 00:15:37,500
When you look at the data
400
00:15:37,500 --> 00:15:39,700
leaving the phone,
every time you open an app,
401
00:15:39,700 --> 00:15:42,166
you send about, maybe,
402
00:15:42,166 --> 00:15:44,366
500 kilobytes of data
to their servers.
403
00:15:44,366 --> 00:15:48,433
That is equivalent to
about 125 pages of text
404
00:15:48,433 --> 00:15:50,400
that you would print
in a printer.
405
00:15:54,066 --> 00:15:57,466
PATEL (voiceover):
It's one thing if the apps
on my phone are collecting data,
406
00:15:57,466 --> 00:15:59,166
I chose to download them.
407
00:15:59,166 --> 00:16:02,066
But Patrick says
that other companies
408
00:16:02,066 --> 00:16:04,266
have even sneakier ways
to get my data.
409
00:16:04,266 --> 00:16:05,766
Like something as harmless
410
00:16:05,766 --> 00:16:07,400
as a marketing email.
411
00:16:07,400 --> 00:16:09,600
To demonstrate,
Patrick sends me
412
00:16:09,600 --> 00:16:11,633
a mock promotional email.
413
00:16:11,633 --> 00:16:13,600
I have an email that says
414
00:16:13,600 --> 00:16:15,166
"Shoe Sale."
415
00:16:15,166 --> 00:16:17,266
It says "open
to view the shoe sale,"
416
00:16:17,266 --> 00:16:19,833
and I want to open it,
so I'm going to.
417
00:16:19,833 --> 00:16:23,400
"Hi, Alok, get ready for
upcoming shoe sale."
418
00:16:23,400 --> 00:16:25,466
I like this graphic,
and there's like,
419
00:16:25,466 --> 00:16:27,633
social media icons.
This looks legit.
420
00:16:27,633 --> 00:16:28,833
(voiceover):
It turns out,
421
00:16:28,833 --> 00:16:31,600
emails can include
a tracking pixel--
422
00:16:31,600 --> 00:16:35,400
also known as
a spy or invisible pixel.
423
00:16:35,400 --> 00:16:38,100
JACKSON:
An invisible pixel is
a type of image
424
00:16:38,100 --> 00:16:40,100
you're never intended to see,
425
00:16:40,100 --> 00:16:43,400
but it still initiates
a handshake of data
426
00:16:43,400 --> 00:16:46,233
from your device
to these data companies.
427
00:16:46,233 --> 00:16:47,600
(keys clacking)
428
00:16:47,600 --> 00:16:49,433
Most of the emails
that you receive
429
00:16:49,433 --> 00:16:51,333
likely have these
tracking pixels in them.
430
00:16:51,333 --> 00:16:54,133
Most?
Most.
431
00:16:55,266 --> 00:16:56,833
This invisible pixel
is actually
432
00:16:56,833 --> 00:16:58,566
disguised as the banner
433
00:16:58,566 --> 00:17:00,166
in that email.
434
00:17:00,166 --> 00:17:04,200
In other cases, the tracking
pixel will be a one-pixel,
435
00:17:04,200 --> 00:17:05,633
very, very tiny image
436
00:17:05,633 --> 00:17:07,266
that you would never see
with your eyes,
437
00:17:07,266 --> 00:17:09,166
and it's not meant to be seen.
438
00:17:09,166 --> 00:17:10,466
(zooming in)
439
00:17:10,466 --> 00:17:12,066
PATEL:
That little pixel,
that little spy.
440
00:17:12,066 --> 00:17:14,133
See it?
Right there.
441
00:17:14,133 --> 00:17:16,600
It can hide in images
embedded in an email,
442
00:17:16,600 --> 00:17:19,533
like a banner, or even an
"unsubscribe" button.
443
00:17:19,533 --> 00:17:21,333
And when you open the email,
444
00:17:21,333 --> 00:17:25,100
it contacts a tracking website
to collect data about you.
445
00:17:25,100 --> 00:17:27,133
It can suck up
your device model,
446
00:17:27,133 --> 00:17:29,133
location, and even
some browsing history.
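(Editor's aside: here is a small, hypothetical Python sketch of what a tracking pixel boils down to: an image URL in the email carries an identifier, so the act of loading the image tells the tracker's server who opened the message and from what device. The domain, path, and parameters are invented for illustration.)

```python
# Hypothetical sketch of an email tracking pixel. Real trackers differ,
# but the idea is the same: fetching the "image" delivers identifiers
# about you to the tracker's server.
from urllib.parse import urlencode, urlparse, parse_qs

def tracking_pixel_html(recipient_id: str, campaign: str) -> str:
    """Build the <img> tag a marketer might embed in an email."""
    params = urlencode({"r": recipient_id, "c": campaign})
    url = f"https://tracker.example.com/pixel.gif?{params}"
    # 1x1 size and no alt text: the reader never notices it.
    return f'<img src="{url}" width="1" height="1" alt="">'

def log_pixel_request(url: str, user_agent: str, client_ip: str) -> dict:
    """What the tracker's server can record the moment the email is opened."""
    query = parse_qs(urlparse(url).query)
    return {
        "recipient": query["r"][0],
        "campaign": query["c"][0],
        "opened_from_ip": client_ip,   # rough location
        "device": user_agent,          # device / mail client model
    }

html = tracking_pixel_html("alok-1234", "shoe-sale")
print(html)
print(log_pixel_request(html.split('"')[1], "ExampleMail/1.0 (iPhone)", "203.0.113.7"))
```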
447
00:17:30,033 --> 00:17:33,133
JACKSON:
This is essentially
a digital fingerprint
448
00:17:33,133 --> 00:17:35,266
of your device.
PATEL:
No!
449
00:17:35,266 --> 00:17:38,266
JACKSON:
That fingerprint would
identify you as who you are,
450
00:17:38,266 --> 00:17:39,733
and somebody else in
the future
451
00:17:39,733 --> 00:17:41,066
could look at
that fingerprint,
452
00:17:41,066 --> 00:17:43,200
make you do
a new fingerprint,
453
00:17:43,200 --> 00:17:46,200
and they would know that
this is the same person.
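(Editor's aside: a rough sketch of the fingerprinting idea; the attribute list is illustrative, not anyone's actual method. Combine enough stable traits of a device and the combination becomes effectively unique, so the same hash seen later means the same device.)

```python
# Illustrative device fingerprint: hash a handful of traits that rarely change.
# Real fingerprinting uses many more signals; this just shows why the
# combination can identify a device even without a name attached.
import hashlib, json

def fingerprint(traits: dict) -> str:
    canonical = json.dumps(traits, sort_keys=True)   # stable ordering
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {
    "user_agent": "Mobile Safari 17.1",
    "screen": "393x852",
    "timezone": "America/Los_Angeles",
    "language": "en-US",
    "fonts_installed": 214,
}

print(fingerprint(device))    # same device traits -> same fingerprint later
device["timezone"] = "America/New_York"
print(fingerprint(device))    # change a trait and the fingerprint changes
```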
454
00:17:46,200 --> 00:17:49,066
PATEL:
Companies are snatching copies
of my digital fingerprints
455
00:17:49,066 --> 00:17:51,100
wherever I go online.
456
00:17:51,100 --> 00:17:53,166
And by following that trail...
457
00:17:53,166 --> 00:17:55,166
JACKSON:
Companies are, over time,
458
00:17:55,166 --> 00:17:57,566
collecting
everything about you.
459
00:17:57,566 --> 00:18:00,100
That's how they
build up this,
460
00:18:00,100 --> 00:18:02,066
this digital profile about you.
461
00:18:04,100 --> 00:18:07,333
HILL:
The internet is this
data collection machine,
462
00:18:07,333 --> 00:18:09,366
and as we're moving
through the internet,
463
00:18:09,366 --> 00:18:10,533
there are these companies
464
00:18:10,533 --> 00:18:11,700
whose whole job is
465
00:18:11,700 --> 00:18:13,266
to reassemble that trail.
466
00:18:14,300 --> 00:18:17,633
PATEL:
So now I know how my digital
profile is out there,
467
00:18:17,633 --> 00:18:19,200
but why?
468
00:18:19,200 --> 00:18:20,733
Who really cares that
469
00:18:20,733 --> 00:18:22,366
I take pictures doing
handstands
470
00:18:22,366 --> 00:18:24,133
and own way too many
sneakers?
471
00:18:26,133 --> 00:18:28,200
Turns out,
there's a whole industry
472
00:18:28,200 --> 00:18:29,833
devoted to selling my data.
473
00:18:29,833 --> 00:18:32,366
It's the data brokers.
474
00:18:32,366 --> 00:18:34,500
♪ ♪
475
00:18:34,500 --> 00:18:36,633
Data brokers are companies
that are set up to
476
00:18:36,633 --> 00:18:38,766
collect information,
repackage it,
477
00:18:38,766 --> 00:18:40,333
and resell it or re-share it
478
00:18:40,333 --> 00:18:42,266
to other companies
that may want to know
479
00:18:42,266 --> 00:18:44,200
information about you.
480
00:18:44,200 --> 00:18:47,566
So what do data brokers want
with all our personal data?
481
00:18:47,566 --> 00:18:50,366
Why do they care that
I love chicken mole
482
00:18:50,366 --> 00:18:51,633
and that "Cool Runnings"
483
00:18:51,633 --> 00:18:54,100
is in my top five movies
of all time?
484
00:18:54,100 --> 00:18:55,800
TSUKAYAMA:
They collect all this
information,
485
00:18:55,800 --> 00:18:57,733
and then they're really
trying to slice and dice it
486
00:18:57,733 --> 00:18:59,466
in ways that are
appealing to customers.
487
00:18:59,466 --> 00:19:01,233
You know,
it could be an employer
488
00:19:01,233 --> 00:19:02,833
who's doing research on you,
489
00:19:02,833 --> 00:19:05,166
it could be a retailer
who wants to know
490
00:19:05,166 --> 00:19:07,533
what kind of customers
they can target with ads.
491
00:19:08,800 --> 00:19:10,200
That kind of information
492
00:19:10,200 --> 00:19:12,200
is really valuable
for advertisers,
493
00:19:12,200 --> 00:19:13,300
because they want
to target advertising
494
00:19:13,300 --> 00:19:14,633
to a certain type of person.
495
00:19:15,600 --> 00:19:18,366
PATEL:
Our data ends up eventually
getting compiled
496
00:19:18,366 --> 00:19:20,066
into reports like these,
497
00:19:20,066 --> 00:19:22,133
that are then sold to...
498
00:19:22,133 --> 00:19:23,533
whoever's interested?
499
00:19:23,533 --> 00:19:25,233
TSUKAYAMA:
Data brokers are able to say,
500
00:19:25,233 --> 00:19:27,166
"Look,
we have a group of people
501
00:19:27,166 --> 00:19:29,666
"that will fit the audience
of your product.
502
00:19:29,666 --> 00:19:33,200
"We really are happy to serve
this list of people to you,"
503
00:19:33,200 --> 00:19:35,400
or to make a score about
how likely they might be
504
00:19:35,400 --> 00:19:37,366
to purchase your product.
505
00:19:39,166 --> 00:19:41,300
Okay,
I know people say
506
00:19:41,300 --> 00:19:43,066
"Oh, your phones
aren't listening to you,"
507
00:19:43,066 --> 00:19:45,833
but we were just talking
about retro video games,
508
00:19:45,833 --> 00:19:48,066
and here is an ad for
509
00:19:48,066 --> 00:19:50,100
home mini-arcade machines--
510
00:19:50,100 --> 00:19:51,433
which looks kind of cool--
but how did it know
511
00:19:51,433 --> 00:19:53,200
we were just
talking about that?
512
00:19:53,200 --> 00:19:54,333
Is my phone listening?
513
00:19:54,333 --> 00:19:55,533
Is it a psychic?
514
00:19:55,533 --> 00:19:57,400
What's happening?
515
00:19:57,400 --> 00:19:58,566
People always say,
516
00:19:58,566 --> 00:19:59,633
"I was talking about
this product,
517
00:19:59,633 --> 00:20:01,133
and then I saw an ad for it.
518
00:20:01,133 --> 00:20:03,833
"Matt, are they listening
through my phone?"
519
00:20:03,833 --> 00:20:05,166
And I'm like,
"Well, they didn't hear you,
520
00:20:05,166 --> 00:20:06,666
they didn't listen
to anything."
521
00:20:06,666 --> 00:20:07,833
The truth is actually
more frightening.
522
00:20:07,833 --> 00:20:09,833
You talked about that thing
523
00:20:09,833 --> 00:20:11,733
because they influenced you
524
00:20:11,733 --> 00:20:14,133
into having this conversation.
525
00:20:14,133 --> 00:20:16,833
It's because the algorithm
took you to this place.
526
00:20:16,833 --> 00:20:18,733
And that's the situation
with our data.
527
00:20:19,500 --> 00:20:22,166
PATEL:
There have been some outlier
examples
528
00:20:22,166 --> 00:20:24,366
of ad tech companies
listening without consent,
529
00:20:24,366 --> 00:20:25,766
but that's against the law.
530
00:20:25,766 --> 00:20:28,700
For the most part,
advertising algorithms
531
00:20:28,700 --> 00:20:30,833
know you so well
that they can predict
532
00:20:30,833 --> 00:20:32,500
what you would find interesting.
533
00:20:32,500 --> 00:20:34,733
They are so good
because they've studied
534
00:20:34,733 --> 00:20:36,633
all the data
gathered about you--
535
00:20:36,633 --> 00:20:38,333
from the data brokers.
536
00:20:40,133 --> 00:20:41,133
There's this kind of dossier
537
00:20:41,133 --> 00:20:42,600
that's being created
about you
538
00:20:42,600 --> 00:20:43,833
as you're wandering around
the internet,
539
00:20:43,833 --> 00:20:45,833
and it's being used to,
540
00:20:45,833 --> 00:20:48,333
you know,
decide what ad you see.
541
00:20:50,200 --> 00:20:53,233
PATEL:
Advertising fuels
the economics of the internet.
542
00:20:53,233 --> 00:20:55,233
And advertising is fueled by
543
00:20:55,233 --> 00:20:57,266
you, me, us--
544
00:20:57,266 --> 00:20:59,533
our personal data.
545
00:20:59,533 --> 00:21:03,633
So how do the algorithms
actually work?
546
00:21:03,633 --> 00:21:05,533
You land on a webpage,
547
00:21:05,533 --> 00:21:07,566
and that webpage
548
00:21:07,566 --> 00:21:09,233
gets your I.P. address.
549
00:21:09,233 --> 00:21:11,633
A unique identifier is
returned to a data broker.
550
00:21:11,633 --> 00:21:13,066
That data broker
551
00:21:13,066 --> 00:21:14,333
looks up all the facts
552
00:21:14,333 --> 00:21:15,700
that were ever
gleaned about you.
553
00:21:15,700 --> 00:21:19,233
A dossier that describes you.
554
00:21:19,233 --> 00:21:21,733
PATEL:
And then advertisers
literally bid,
555
00:21:21,733 --> 00:21:24,166
they bid on you,
in a real-time auction
556
00:21:24,166 --> 00:21:26,300
based on the only lesson
I remember
557
00:21:26,300 --> 00:21:30,100
from Econ 101:
supply and demand.
558
00:21:30,100 --> 00:21:31,066
DOCTOROW:
The demand-side platform
559
00:21:31,066 --> 00:21:34,333
that the advertiser is on
says, "I have here
560
00:21:34,333 --> 00:21:36,200
"an 18- to 34-year-old
man-child
561
00:21:36,200 --> 00:21:37,766
"with an Xbox
in Southern California.
562
00:21:37,766 --> 00:21:40,533
"Who wants to pay to
cram something
563
00:21:40,533 --> 00:21:42,266
nonconsensually
into his eyeballs?"
564
00:21:42,266 --> 00:21:45,166
And that takes place in
a marketplace,
565
00:21:45,166 --> 00:21:47,066
and you have advertisers
on the other side
566
00:21:47,066 --> 00:21:48,400
who have standing orders.
567
00:21:48,400 --> 00:21:49,833
They're like,
"I want to advertise to
568
00:21:49,833 --> 00:21:52,300
"18- to 34-year-old man-children
in Southern California
569
00:21:52,300 --> 00:21:53,600
who own Xboxes,"
570
00:21:53,600 --> 00:21:56,333
and the marketplace conducts
571
00:21:56,333 --> 00:21:58,766
an automated high-speed auction
where it's just like,
572
00:21:58,766 --> 00:22:01,333
"I'll pay so many
fractions of a cent,"
573
00:22:01,333 --> 00:22:02,833
"I'll pay that
plus ten percent."
574
00:22:02,833 --> 00:22:04,266
(auction bell rings)
575
00:22:04,266 --> 00:22:06,133
The high bidder
gets to show you an ad.
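(Editor's aside: as a toy illustration of the auction just described, the matching-and-bidding step might look like the sketch below. The segments, bids, and the simple highest-bid-wins rule are invented; real ad exchanges are far more elaborate.)

```python
# Toy real-time ad auction: advertisers place standing orders against audience
# segments, and when a matching visitor loads a page, the highest bid wins.
# All numbers and segment names are invented for illustration.
visitor = {"age_band": "18-34", "region": "Southern California", "owns_xbox": True}

standing_orders = [
    {"advertiser": "GameGearCo", "bid_cents": 0.42,
     "wants": {"age_band": "18-34", "owns_xbox": True}},
    {"advertiser": "TeaClub",    "bid_cents": 0.30,
     "wants": {"region": "Southern California"}},
    {"advertiser": "ShoeSale",   "bid_cents": 0.55,
     "wants": {"age_band": "35-54"}},          # doesn't match this visitor
]

def matches(order, profile):
    return all(profile.get(key) == value for key, value in order["wants"].items())

def run_auction(profile, orders):
    eligible = [o for o in orders if matches(o, profile)]
    return max(eligible, key=lambda o: o["bid_cents"]) if eligible else None

winner = run_auction(visitor, standing_orders)
print(f"{winner['advertiser']} wins and shows the ad for {winner['bid_cents']} cents")
```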
576
00:22:07,533 --> 00:22:09,600
PATEL:
All of this happens
nearly instantly.
577
00:22:09,600 --> 00:22:12,533
And it's possible
because data brokers
578
00:22:12,533 --> 00:22:14,433
sort and cull personal data
579
00:22:14,433 --> 00:22:16,533
into super detailed
consumer groups.
580
00:22:16,533 --> 00:22:19,300
This is the process
that delivers
581
00:22:19,300 --> 00:22:21,233
relevant personalized ads
582
00:22:21,233 --> 00:22:22,833
for cute shoes
583
00:22:22,833 --> 00:22:24,166
or that new tea club--
584
00:22:24,166 --> 00:22:25,800
if you're into
that sort of thing--
585
00:22:25,800 --> 00:22:27,633
which all the ads
seem to know about you.
586
00:22:29,100 --> 00:22:30,733
TSUKAYAMA:
Here we have,
for example, Soccer Mom.
587
00:22:30,733 --> 00:22:32,133
PATEL:
Okay.
588
00:22:32,133 --> 00:22:33,300
I might be able
to identify with that.
589
00:22:34,333 --> 00:22:38,066
A sporting goods retailer could
then go and check out a file
590
00:22:38,066 --> 00:22:40,766
on what a Soccer Mom's
online activity is like.
591
00:22:40,766 --> 00:22:43,433
TSUKAYAMA:
What are the things that
they tend to spend money on,
592
00:22:43,433 --> 00:22:44,566
and do they like to save?
593
00:22:44,566 --> 00:22:45,700
Are there
certain times of year
594
00:22:45,700 --> 00:22:47,500
where they might
spend more or less?
595
00:22:47,500 --> 00:22:51,833
PATEL:
With all this data about
Soccer Mom's consumer habits,
596
00:22:51,833 --> 00:22:55,433
retailers can send personalized
ads with uncanny timing.
597
00:22:56,300 --> 00:22:59,833
Some people like the results--
soccer moms get deals.
598
00:22:59,833 --> 00:23:03,100
Advertisers keep websites
we love in business.
599
00:23:03,100 --> 00:23:04,400
But...
600
00:23:04,400 --> 00:23:07,066
some consumer categories
can be more concerning.
601
00:23:07,066 --> 00:23:09,100
HILL:
Sometimes they'll
have categories
602
00:23:09,100 --> 00:23:10,200
that are like,
603
00:23:10,200 --> 00:23:13,133
"This person has erectile
dysfunction,
604
00:23:13,133 --> 00:23:15,166
"this person
has a gambling problem.
605
00:23:15,166 --> 00:23:17,433
"This is a list of, kind of,
606
00:23:17,433 --> 00:23:20,333
people that are likely
to fall for frauds."
607
00:23:20,333 --> 00:23:23,166
So these can be really powerful
and damaging lists.
608
00:23:24,700 --> 00:23:26,066
TSUKAYAMA:
So this one, for example,
609
00:23:26,066 --> 00:23:28,166
diabetes.
PATEL:
Wow.
610
00:23:28,166 --> 00:23:30,200
Okay,
healthcare demographic...
611
00:23:30,200 --> 00:23:32,666
The thing about that,
right, is like,
612
00:23:32,666 --> 00:23:35,100
when you're thinking
particularly about health data,
613
00:23:35,100 --> 00:23:36,233
that indicates
that you're probably
614
00:23:36,233 --> 00:23:38,066
in a pretty sensitive
situation.
615
00:23:38,066 --> 00:23:42,200
PATEL:
I'm now thinking about
all the patients I take care of.
616
00:23:42,200 --> 00:23:43,666
And I shudder to think
617
00:23:43,666 --> 00:23:45,366
at the targeted ads
that are
618
00:23:45,366 --> 00:23:47,133
deliberately targeting
these individuals.
619
00:23:47,133 --> 00:23:48,533
(keys clacking)
620
00:23:48,533 --> 00:23:51,100
TSUKAYAMA:
Data brokers have lists
about sexual assault victims,
621
00:23:51,100 --> 00:23:53,400
they have lists
about dementia sufferers.
622
00:23:53,400 --> 00:23:55,300
Those are some really
concerning types of categories,
623
00:23:55,300 --> 00:23:57,333
and you could
definitely see how
624
00:23:57,333 --> 00:23:59,700
an advertising company
could take advantage
625
00:23:59,700 --> 00:24:01,300
of people on those lists.
626
00:24:03,100 --> 00:24:05,100
So I think a lot of
people's first reaction
627
00:24:05,100 --> 00:24:06,466
on seeing these
data broker reports
628
00:24:06,466 --> 00:24:08,200
is like, where has
all of this come from?
629
00:24:08,200 --> 00:24:11,766
There are a lot of places where
data brokers get information.
630
00:24:11,766 --> 00:24:13,366
Most commonly apps that
631
00:24:13,366 --> 00:24:14,833
you download that have deals
632
00:24:14,833 --> 00:24:17,266
with data brokers
to share information.
633
00:24:17,266 --> 00:24:20,233
PATEL:
All right, I knew that apps
were getting my data,
634
00:24:20,233 --> 00:24:23,366
but I had no idea some were
sharing it with data brokers.
635
00:24:23,366 --> 00:24:26,233
How is that even legal?
636
00:24:26,233 --> 00:24:28,366
TSUKAYAMA:
How often have you seen
a terms and conditions screen
637
00:24:28,366 --> 00:24:29,833
just like this?
638
00:24:29,833 --> 00:24:31,733
All the time.
Anytime I download
639
00:24:31,733 --> 00:24:34,433
anything, basically.
(both chuckle)
640
00:24:34,433 --> 00:24:37,800
I download apps
all the time
641
00:24:37,800 --> 00:24:40,100
for work, play, life,
convenience.
642
00:24:40,100 --> 00:24:42,166
I just scroll all the way down
and hit accept,
643
00:24:42,166 --> 00:24:42,833
because I'm over it.
Right.
644
00:24:42,833 --> 00:24:44,366
And I think that's how
645
00:24:44,366 --> 00:24:46,166
most people feel about it.
646
00:24:46,166 --> 00:24:48,166
PATEL:
"By accepting these terms,
647
00:24:48,166 --> 00:24:50,166
you allow the platform
to freely..."
648
00:24:50,166 --> 00:24:51,233
"...content for demonstration,
promotion, and..."
649
00:24:51,233 --> 00:24:52,766
TSUKAYAMA:
Terms and conditions
are really long.
650
00:24:52,766 --> 00:24:54,700
Cool, I'm in a commercial
651
00:24:54,700 --> 00:24:57,133
and don't even know about it.
652
00:24:57,133 --> 00:24:58,633
TSUKAYAMA:
Carnegie Mellon once did a study
that said it would take
653
00:24:58,633 --> 00:25:01,300
days for somebody to get through
all of the terms and conditions.
654
00:25:01,300 --> 00:25:03,100
They actually
use the word "exploiting."
655
00:25:03,100 --> 00:25:04,833
"You represent and warrant..."
656
00:25:04,833 --> 00:25:07,666
TSUKAYAMA:
That puts people in
a really difficult position
657
00:25:07,666 --> 00:25:10,633
when we're supposed
to manage our own privacy,
658
00:25:10,633 --> 00:25:12,733
but we're also supposed
to use all these things
659
00:25:12,733 --> 00:25:14,833
that are products that
will make our lives better.
660
00:25:14,833 --> 00:25:17,100
(papers rustling,
teacup clatters)
661
00:25:19,633 --> 00:25:22,433
Data brokers
are a big industry.
662
00:25:22,433 --> 00:25:25,100
It's about a $200 billion
industry right now.
663
00:25:25,100 --> 00:25:27,200
I think a lot of people
expect it to grow to,
664
00:25:27,200 --> 00:25:29,533
you know, $400 billion
in the next few years.
665
00:25:31,233 --> 00:25:34,566
NOBLE:
That level of data
from our past,
666
00:25:34,566 --> 00:25:37,066
all the things
we've done being used
667
00:25:37,066 --> 00:25:38,200
and put into systems
668
00:25:38,200 --> 00:25:39,233
to help predict our futures,
669
00:25:39,233 --> 00:25:41,100
that's unprecedented,
670
00:25:41,100 --> 00:25:45,466
and that's rife with the
potential for discrimination,
671
00:25:45,466 --> 00:25:47,200
for harm,
for exclusion.
672
00:25:49,133 --> 00:25:50,266
PATEL:
Okay, I know I said
673
00:25:50,266 --> 00:25:52,500
that I'm not the type
to log in to...
674
00:25:52,500 --> 00:25:54,166
(voiceover):
Look, I love being online.
675
00:25:54,166 --> 00:25:56,133
I like sharing what I'm up to
on social media,
676
00:25:56,133 --> 00:25:58,533
and I'm not afraid to
share my thoughts.
677
00:25:58,533 --> 00:26:01,133
Okay, bugs, I don't bite you,
678
00:26:01,133 --> 00:26:02,133
you don't bite me.
679
00:26:02,133 --> 00:26:02,900
It's that easy.
680
00:26:02,900 --> 00:26:05,533
But some of
this personal data
681
00:26:05,533 --> 00:26:07,333
is, well, personal.
682
00:26:07,333 --> 00:26:08,833
So, now I need to know:
683
00:26:08,833 --> 00:26:11,100
What can I do
to make my digital life
684
00:26:11,100 --> 00:26:13,100
more private and secure?
685
00:26:13,100 --> 00:26:17,366
GALPERIN:
Privacy and security
are not the same thing.
686
00:26:17,366 --> 00:26:18,766
For example, uh,
687
00:26:18,766 --> 00:26:20,833
Facebook is extremely interested
688
00:26:20,833 --> 00:26:22,400
in protecting your security.
689
00:26:22,400 --> 00:26:25,233
They want to make sure
that it is always you
690
00:26:25,233 --> 00:26:26,666
logging in to your account.
691
00:26:26,666 --> 00:26:28,733
They will go through
a great deal of trouble
692
00:26:28,733 --> 00:26:31,433
to keep your account secure.
693
00:26:32,233 --> 00:26:35,366
But you enter all kinds
of data into that account.
694
00:26:35,366 --> 00:26:37,100
You tell it
where you are located.
695
00:26:37,100 --> 00:26:38,500
You send it all
of your pictures.
696
00:26:38,500 --> 00:26:40,200
You send messages
697
00:26:40,200 --> 00:26:43,166
and Facebook collects
all of that data.
698
00:26:43,166 --> 00:26:45,100
They don't want you
to keep it private.
699
00:26:45,100 --> 00:26:48,533
They want you to hand it to them
so that they can use it in order
700
00:26:48,533 --> 00:26:51,566
to serve you targeted ads
and make them money.
701
00:26:51,566 --> 00:26:53,600
PATEL (voiceover):
My accounts are mostly secure
702
00:26:53,600 --> 00:26:55,700
when I control access to them.
703
00:26:55,700 --> 00:26:58,566
But that doesn't mean
the data I put in them
704
00:26:58,566 --> 00:27:01,500
stays private, far from it.
705
00:27:01,500 --> 00:27:05,066
Privacy and security
are not the same.
706
00:27:05,066 --> 00:27:07,700
But they are two sides
of the same coin,
707
00:27:07,700 --> 00:27:09,266
and I have to understand both
708
00:27:09,266 --> 00:27:10,566
if I'm gonna protect
709
00:27:10,566 --> 00:27:12,100
my personal data.
710
00:27:12,100 --> 00:27:14,566
MITCHELL:
When your privacy
is taken from you,
711
00:27:14,566 --> 00:27:16,466
your agency is taken from you.
712
00:27:16,466 --> 00:27:19,100
Privacy is that whisper.
713
00:27:19,100 --> 00:27:21,766
When you think you're whispering
to your friend,
714
00:27:21,766 --> 00:27:24,200
but you're shouting
in a crowded elevator,
715
00:27:24,200 --> 00:27:26,100
you're robbed of something.
716
00:27:26,100 --> 00:27:29,000
And that's why privacy
is so important.
717
00:27:29,966 --> 00:27:33,100
PATEL:
What can I do right now
to protect my privacy
718
00:27:33,100 --> 00:27:34,200
and my security?
719
00:27:35,366 --> 00:27:38,766
To learn tips and tools to
preserve my data on both fronts,
720
00:27:38,766 --> 00:27:42,200
I'm chatting with hacker
and educator Matt Mitchell
721
00:27:42,200 --> 00:27:46,233
and cybersecurity
expert Eva Galperin.
722
00:27:46,233 --> 00:27:49,100
First up: privacy.
723
00:27:49,100 --> 00:27:50,300
Sorry.
724
00:27:50,300 --> 00:27:52,200
(whispering):
Privacy.
725
00:27:52,200 --> 00:27:54,366
Matt is a privacy advocate
726
00:27:54,366 --> 00:27:56,600
at CryptoHarlem,
as in cryptography,
727
00:27:56,600 --> 00:28:00,666
the process of hiding
or coding information.
728
00:28:00,666 --> 00:28:03,400
Hey. I'm Matt. Thanks for coming
to CryptoHarlem.
729
00:28:03,400 --> 00:28:07,633
MITCHELL (voiceover):
CryptoHarlem is
anti-surveillance workshops
730
00:28:07,633 --> 00:28:09,366
for the Black community
731
00:28:09,366 --> 00:28:11,400
and all marginalized communities
around the world.
732
00:28:12,200 --> 00:28:15,500
And our mission is to develop
people's digital skills
733
00:28:15,500 --> 00:28:17,133
so they can actually
help us in the fight
734
00:28:17,133 --> 00:28:19,500
to push back on digital harms.
735
00:28:19,500 --> 00:28:22,133
MITCHELL:
Privacy is not secrecy.
736
00:28:22,133 --> 00:28:23,066
Privacy is just a door.
737
00:28:23,066 --> 00:28:25,100
A door in your home.
738
00:28:25,100 --> 00:28:26,233
There's a sense of, like,
739
00:28:26,233 --> 00:28:28,333
I want to control
what can be seen,
740
00:28:28,333 --> 00:28:30,000
what can't be seen just for me.
741
00:28:30,800 --> 00:28:32,766
PATEL (voiceover):
And I want to close that door.
742
00:28:32,766 --> 00:28:33,833
So how do I do this?
743
00:28:33,833 --> 00:28:35,633
Even just a little?
744
00:28:35,633 --> 00:28:37,100
PATEL:
Okay, what do I do
745
00:28:37,100 --> 00:28:39,566
to make all this safer for me?
746
00:28:39,566 --> 00:28:42,133
What do I do, who do I talk to,
how do I start?
747
00:28:42,133 --> 00:28:43,333
You have to ask yourself,
748
00:28:43,333 --> 00:28:45,300
"Is this a problem
that needs to be fixed?"
749
00:28:45,300 --> 00:28:47,100
Privacy isn't a switch.
750
00:28:47,100 --> 00:28:48,400
It's a dial.
751
00:28:48,400 --> 00:28:50,633
You get to control
how much you share
752
00:28:50,633 --> 00:28:52,166
with who and with what.
753
00:28:52,166 --> 00:28:54,500
PATEL (voiceover):
Let's start with a big one,
754
00:28:54,500 --> 00:28:57,066
something I use all the time,
every day,
755
00:28:57,066 --> 00:28:59,133
and I bet you do, too.
756
00:28:59,133 --> 00:29:02,266
Have you ever used
this website called Google?
757
00:29:02,266 --> 00:29:03,533
I've heard of it.
Yeah.
758
00:29:03,533 --> 00:29:04,766
Well, let's check this out.
759
00:29:04,766 --> 00:29:06,100
Well, if we go here
760
00:29:06,100 --> 00:29:11,400
to myactivity.google.com...
761
00:29:13,200 --> 00:29:16,366
...it'll show us all the things
that you've been doing.
762
00:29:16,366 --> 00:29:17,566
So for example,
763
00:29:17,566 --> 00:29:19,133
when we go here,
764
00:29:19,133 --> 00:29:22,733
we see all the different
Google services that you use.
765
00:29:22,733 --> 00:29:24,766
I don't think they make
a service you don't use.
766
00:29:25,566 --> 00:29:28,633
PATEL (voiceover):
These platforms are so deeply
embedded in many of our lives
767
00:29:28,633 --> 00:29:30,133
and for good reason.
768
00:29:30,133 --> 00:29:33,133
They make products
that can be really useful.
769
00:29:33,133 --> 00:29:34,566
It's hard for me to imagine
770
00:29:34,566 --> 00:29:36,300
going a single day
without searching the web
771
00:29:36,300 --> 00:29:38,466
or using a navigation app.
772
00:29:38,466 --> 00:29:41,200
But they also suck up
a lot of our data.
773
00:29:41,200 --> 00:29:43,466
It's a trade-off
that I'm comfortable making,
774
00:29:43,466 --> 00:29:45,100
within reason.
775
00:29:45,100 --> 00:29:49,333
I've used literally everything
Google has ever offered.
776
00:29:49,333 --> 00:29:51,633
And it knows what you've used.
777
00:29:51,633 --> 00:29:53,700
And it records everything
you've ever used
778
00:29:53,700 --> 00:29:55,166
and how you've used it.
779
00:29:55,166 --> 00:29:56,600
Every search term you've used,
780
00:29:56,600 --> 00:30:00,100
every, uh, you know,
shopping item you've used.
781
00:30:00,100 --> 00:30:05,066
But this is the dashboard
to delete it.
782
00:30:05,066 --> 00:30:07,100
PATEL (voiceover):
I didn't know this,
but I can literally just delete
783
00:30:07,100 --> 00:30:10,100
huge amounts of data
that Google is storing about me.
784
00:30:10,100 --> 00:30:13,400
And the same is true
for a lot of other services.
785
00:30:13,400 --> 00:30:16,100
We can dial to whatever
level we feel comfortable.
786
00:30:16,100 --> 00:30:17,633
For example, on LinkedIn,
787
00:30:17,633 --> 00:30:19,100
you would just click on me,
788
00:30:19,100 --> 00:30:23,333
and then you would go to
your settings and privacy.
789
00:30:23,333 --> 00:30:25,066
Here, when we go to
manage your activity,
790
00:30:25,066 --> 00:30:27,266
it tells you that, you know,
791
00:30:27,266 --> 00:30:28,533
you started sharing
your LinkedIn data
792
00:30:28,533 --> 00:30:30,566
with a permitted application.
793
00:30:30,566 --> 00:30:35,066
PATEL (voiceover):
Treating privacy like a dial
means it's not all or nothing.
794
00:30:35,066 --> 00:30:39,200
You can have your data cake
and eat it, too.
795
00:30:39,200 --> 00:30:41,466
Like now I'm thinking I want
to log into everything--
796
00:30:41,466 --> 00:30:44,500
all my social media, my email,
my LinkedIn, everything--
797
00:30:44,500 --> 00:30:48,133
regularly and look to
see who is using these.
798
00:30:48,133 --> 00:30:50,566
Exactly, it is about awareness.
799
00:30:50,566 --> 00:30:52,733
Furthermore, the companies,
800
00:30:52,733 --> 00:30:55,433
they know how many people
actually use
801
00:30:55,433 --> 00:30:56,433
the privacy controls.
802
00:30:56,433 --> 00:30:58,066
And by you even
peeking in it,
803
00:30:58,066 --> 00:31:02,166
you're saying
I believe privacy matters.
804
00:31:02,166 --> 00:31:05,400
At the Electronic
Frontier Foundation,
805
00:31:05,400 --> 00:31:08,200
Eva shows me how I can take
my privacy game
806
00:31:08,200 --> 00:31:09,433
to the next level...
807
00:31:09,433 --> 00:31:13,166
learning some
Surveillance Self-Defense.
808
00:31:13,833 --> 00:31:19,333
Although, apparently not
that kind of self-defense.
809
00:31:19,333 --> 00:31:21,266
GALPERIN:
So, the next step in
your privacy journey
810
00:31:21,266 --> 00:31:23,366
is fighting back against
811
00:31:23,366 --> 00:31:25,166
types of corporate surveillance.
812
00:31:25,166 --> 00:31:27,100
I-- and one of the things
813
00:31:27,100 --> 00:31:29,700
that websites really like
to do when, uh,
814
00:31:29,700 --> 00:31:32,333
is not just to track what
you are doing on their website,
815
00:31:32,333 --> 00:31:34,733
but to track all the other
websites that you go to.
816
00:31:34,733 --> 00:31:36,466
And they do this using cookies.
817
00:31:36,466 --> 00:31:37,766
There's some companies I trust,
818
00:31:37,766 --> 00:31:39,666
and I'm like, fine,
you have these cookies.
819
00:31:39,666 --> 00:31:41,566
They're chocolate chip.
I know where they were made.
820
00:31:41,566 --> 00:31:42,800
I know what
you're doing with them.
821
00:31:42,800 --> 00:31:44,066
But then there's
these third party companies,
822
00:31:44,066 --> 00:31:46,166
I don't want them
around me.
823
00:31:46,166 --> 00:31:48,166
You can use a
browser extension
824
00:31:48,166 --> 00:31:50,566
to eat these cookies, uh,
825
00:31:50,566 --> 00:31:51,833
and fight back against
this kind of tracking
826
00:31:51,833 --> 00:31:54,466
and keep those websites from
seeing where else you're going.
827
00:31:54,466 --> 00:31:58,100
PATEL (voiceover):
Browser extensions are
add-ons for your web browser
828
00:31:58,100 --> 00:32:01,266
that give it extra features
and functionality,
829
00:32:01,266 --> 00:32:03,333
like eating cookies.
830
00:32:03,333 --> 00:32:04,566
I'm imagining this,
831
00:32:04,566 --> 00:32:07,833
like, digital Cookie Monster
that's eating up
832
00:32:07,833 --> 00:32:11,233
all these pieces
of my online activity
833
00:32:11,233 --> 00:32:14,133
so that companies don't know
what I'm doing online.
834
00:32:14,133 --> 00:32:16,200
And it reduces the amount
of privacy tracking.
835
00:32:16,200 --> 00:32:17,566
Am I understanding that?
836
00:32:17,566 --> 00:32:19,366
What these
browser extensions
837
00:32:19,366 --> 00:32:24,333
do is they get rid of,
uh, the tracking cookies
838
00:32:24,333 --> 00:32:26,166
that these websites use to see
839
00:32:26,166 --> 00:32:28,066
all the other sites
that you're going to,
840
00:32:28,066 --> 00:32:29,100
which is none of their business.
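As a purely illustrative aside on how the tracking Eva describes works: the Python sketch below (every class, domain, and browser in it is hypothetical) shows how a single third-party cookie lets a tracker stitch visits to unrelated sites into one profile, and how refusing to send that cookie, as a cookie-eating extension does, leaves the tracker with disconnected one-off visits.

```python
# A toy simulation of cross-site tracking via a third-party cookie,
# and what happens when an extension refuses to send that cookie.
# All names, domains, and classes here are made up for illustration.
import uuid
from collections import defaultdict

class Tracker:
    """A third-party ad/analytics server embedded on many sites."""
    def __init__(self):
        self.profiles = defaultdict(list)  # cookie ID -> sites visited

    def serve(self, site, cookie):
        # No cookie sent? Assign a fresh ID, i.e. treat this as a new visitor.
        if cookie is None:
            cookie = str(uuid.uuid4())
        self.profiles[cookie].append(site)  # log this visit against the ID
        return cookie                       # sent back as Set-Cookie

class Browser:
    def __init__(self, block_third_party=False):
        self.block_third_party = block_third_party
        self.jar = {}  # cookies keyed by the tracker's domain

    def visit(self, site, tracker):
        # The page embeds the tracker, so the browser also calls the tracker.
        sent = None if self.block_third_party else self.jar.get("tracker.example")
        self.jar["tracker.example"] = tracker.serve(site, sent)

tracker = Tracker()
normal, blocked = Browser(), Browser(block_third_party=True)
for site in ["news.example", "shoes.example", "clinic.example"]:
    normal.visit(site, tracker)
    blocked.visit(site, tracker)

# With the cookie allowed, one ID accumulates the whole browsing history;
# with it blocked, every visit looks like an unrelated one-off visitor.
print(max(len(v) for v in tracker.profiles.values()))            # 3 sites, one ID
print(sum(1 for v in tracker.profiles.values() if len(v) == 1))  # 3 singletons
```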
841
00:32:29,100 --> 00:32:31,166
(knuckles crack)
842
00:32:31,166 --> 00:32:33,100
PATEL (voiceover):
My personal digital world
is starting to feel
843
00:32:33,100 --> 00:32:36,033
comfy, controlled
and a lot more private.
844
00:32:37,000 --> 00:32:39,633
I've learned how to close
the doors and blinds.
845
00:32:39,633 --> 00:32:43,166
But still, someone could
open them right back up.
846
00:32:43,166 --> 00:32:44,733
I could get hacked.
847
00:32:44,733 --> 00:32:47,133
How do I lock my doors?
848
00:32:47,133 --> 00:32:49,466
Time to talk about security.
849
00:32:49,466 --> 00:32:50,833
MITCHELL:
The tragedy of
850
00:32:50,833 --> 00:32:52,266
this recent pandemic teaches us
851
00:32:52,266 --> 00:32:55,133
that we're all
pretty vulnerable.
852
00:32:55,133 --> 00:32:56,833
And without that herd immunity,
853
00:32:56,833 --> 00:32:59,200
without that community response,
854
00:32:59,200 --> 00:33:01,566
you can't just say,
"I'm okay,
855
00:33:01,566 --> 00:33:05,066
and therefore I don't
have to worry about this."
856
00:33:05,066 --> 00:33:07,300
This is the first time
that I've ever heard
857
00:33:07,300 --> 00:33:10,100
someone compare data privacy
858
00:33:10,100 --> 00:33:14,066
to epidemiology
and herd immunity.
859
00:33:14,066 --> 00:33:17,666
All you need is one person
to make a mistake
860
00:33:17,666 --> 00:33:19,366
and it's game over.
861
00:33:19,366 --> 00:33:23,700
That's what happens to so many
other corporations, businesses,
862
00:33:23,700 --> 00:33:25,766
even hospitals during
a ransomware attack
863
00:33:25,766 --> 00:33:28,300
that'll freeze all machines
and stop them from working
864
00:33:28,300 --> 00:33:30,600
until someone pays
a bunch of hackers.
865
00:33:30,600 --> 00:33:33,100
My one error could
compromise the security
866
00:33:33,100 --> 00:33:35,266
of my entire institution.
867
00:33:35,266 --> 00:33:36,533
MITCHELL:
Human beings will make mistakes.
868
00:33:36,533 --> 00:33:39,200
Folks won't wash
their hands sometimes, right?
869
00:33:39,200 --> 00:33:43,300
But we try to teach best
practices to prevent the worst.
870
00:33:43,300 --> 00:33:44,766
Okay, Matt, you have
my wheels spinning.
871
00:33:44,766 --> 00:33:47,466
I do not want
to be patient zero
872
00:33:47,466 --> 00:33:50,300
in this, like, massive
infectious data leak.
873
00:33:50,300 --> 00:33:52,666
What can I do to start
pulling it back
874
00:33:52,666 --> 00:33:54,466
and better protecting
my information?
875
00:33:54,466 --> 00:33:56,600
Well, I got something
for that, okay?
876
00:33:56,600 --> 00:33:58,666
Oh, you have, like,
a bag of tricks.
I've got a bag.
877
00:33:58,666 --> 00:34:00,600
I thought you were just gonna
say, like, change your password.
878
00:34:00,600 --> 00:34:02,533
We got to go all the way.
879
00:34:02,533 --> 00:34:04,200
This is a privacy screen.
880
00:34:04,200 --> 00:34:06,733
And this will keep
people from being able
881
00:34:06,733 --> 00:34:08,166
to look over your shoulder.
882
00:34:08,166 --> 00:34:09,400
Shoulder surfers beware.
883
00:34:09,400 --> 00:34:10,566
Can't look through that.
884
00:34:10,566 --> 00:34:11,733
This is for like,
using my computer
885
00:34:11,733 --> 00:34:13,300
in a public space,
on an airplane.
886
00:34:13,300 --> 00:34:15,466
Everywhere.
887
00:34:15,466 --> 00:34:17,233
This is a Faraday bag.
888
00:34:17,233 --> 00:34:18,800
You got to keep
your phone in here.
889
00:34:18,800 --> 00:34:20,500
This blocks RF signals.
890
00:34:20,500 --> 00:34:21,833
It's basically aluminum foil...
PATEL (voiceover):
Yo, Matt?
891
00:34:21,833 --> 00:34:23,833
Can we slow this down
a little bit?
892
00:34:23,833 --> 00:34:25,700
You can't get hacked when
you're not attached to anything.
893
00:34:25,700 --> 00:34:28,200
Is this really necessary,
a Faraday bag?
894
00:34:28,200 --> 00:34:31,200
PATEL (voiceover):
Okay, this looks a little more
complicated than I expected.
895
00:34:31,200 --> 00:34:33,400
I guess I just have to become
Dr. 007.
896
00:34:33,400 --> 00:34:35,666
We also have a Wi-Fi pineapple.
897
00:34:35,666 --> 00:34:38,200
This is a portable access point.
898
00:34:38,200 --> 00:34:39,833
And we also could
use it to detect
899
00:34:39,833 --> 00:34:42,200
rogue access points
and stop them.
900
00:34:42,200 --> 00:34:44,400
PATEL (voiceover):
Okay, Matt,
now you've just gone rogue.
901
00:34:44,400 --> 00:34:46,400
MITCHELL:
Let's say you
have a hard drive
902
00:34:46,400 --> 00:34:48,300
with some important
files from work on it.
903
00:34:48,300 --> 00:34:49,500
What happens if you lose it?
904
00:34:49,500 --> 00:34:50,833
Someone can plug that
in and have access
905
00:34:50,833 --> 00:34:51,633
to all your stuff.
906
00:34:51,633 --> 00:34:54,100
Not if it's an
encrypted hard drive.
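As an aside on Matt's encrypted-drive point: real drives use full-disk encryption such as BitLocker, FileVault, or LUKS, but the idea can be sketched at file level with the third-party Python `cryptography` package (an assumption, installed separately); the file contents and key handling below are made up for illustration.

```python
# A minimal sketch of the idea behind an encrypted drive, using file-level
# encryption from the `cryptography` package (pip install cryptography).
# The principle is the same as full-disk encryption: without the key,
# the bytes on the lost hardware are unreadable.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice derived from a passphrase and
                              # kept somewhere other than the drive itself
cipher = Fernet(key)

secret_report = b"work files you would not want a stranger reading"
encrypted = cipher.encrypt(secret_report)     # what actually sits on the disk

# Someone who finds the drive sees only ciphertext...
assert secret_report not in encrypted

# ...while the owner, holding the key, gets everything back intact.
assert cipher.decrypt(encrypted) == secret_report
print("readable only with the key")
```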
907
00:34:54,100 --> 00:34:56,400
I understand--
There's more.
908
00:34:56,400 --> 00:34:58,166
How do you know that
you're not being bugged?
909
00:34:58,166 --> 00:34:59,533
This is nuts.
910
00:34:59,533 --> 00:35:01,400
This is something
I could use to find out
911
00:35:01,400 --> 00:35:04,433
if there is an
audio bug in the room.
(beeping)
912
00:35:04,433 --> 00:35:06,100
I can find Wi-Fi signals...
(device beeps)
913
00:35:06,100 --> 00:35:07,733
Oops, something's here,
right?
914
00:35:08,566 --> 00:35:11,733
PATEL (voiceover):
Honestly, I thought being
a spy would be more fun.
915
00:35:11,733 --> 00:35:14,433
But I'm starting
to feel shaken, not stirred.
916
00:35:14,433 --> 00:35:17,300
How am I supposed to keep track
of all this stuff?
917
00:35:17,300 --> 00:35:20,366
I'm no hacker genius like Matt.
918
00:35:21,200 --> 00:35:25,233
What if I just left all
of this behind and just quit?
919
00:35:25,233 --> 00:35:27,433
Goodbye, digital world.
920
00:35:32,133 --> 00:35:37,166
(clock ticking)
921
00:35:40,400 --> 00:35:42,133
(yawning)
922
00:35:45,533 --> 00:35:46,666
Do you know if there's
breaking news?
(cat meows)
923
00:35:46,666 --> 00:35:47,733
Look in my eyes and tell me.
924
00:35:47,733 --> 00:35:49,166
What are we doing here?
925
00:35:49,166 --> 00:35:50,400
What are we doing?
926
00:35:50,400 --> 00:35:51,400
Like, I don't even know
927
00:35:51,400 --> 00:35:52,666
how many steps I've taken today.
928
00:35:52,666 --> 00:35:54,533
Because my step counter
is not on my hand.
929
00:35:54,533 --> 00:35:57,400
I'm technically disconnected,
so this isn't cheating,
930
00:35:57,400 --> 00:35:59,500
but can someone tell me
what the Suns score is?
931
00:35:59,500 --> 00:36:01,266
CREW MEMBER:
Uh, down by 11 at the half.
932
00:36:01,266 --> 00:36:02,266
(groans)
933
00:36:02,266 --> 00:36:04,166
(cat purring)
934
00:36:04,166 --> 00:36:06,400
Oh, I am supposed to
pick up my daughter.
935
00:36:06,400 --> 00:36:09,366
How long do
I have to do this for?
936
00:36:10,166 --> 00:36:12,366
(voiceover):
All right,
maybe it's not that easy.
937
00:36:12,366 --> 00:36:14,066
But there's got to be
some middle ground
938
00:36:14,066 --> 00:36:16,533
between Inspector Gadget
and Fred Flintstone
939
00:36:16,533 --> 00:36:18,700
that's right for me.
940
00:36:18,700 --> 00:36:21,533
Instead of going completely
off the grid,
941
00:36:21,533 --> 00:36:23,766
I'm checking back in with Eva
942
00:36:23,766 --> 00:36:26,166
for some more surveillance
self-defense.
943
00:36:26,166 --> 00:36:31,500
And the best place to start is
with the basics: passwords.
944
00:36:31,500 --> 00:36:33,533
I'll be honest,
945
00:36:33,533 --> 00:36:34,800
my current password
946
00:36:34,800 --> 00:36:36,066
is basically a combination
947
00:36:36,066 --> 00:36:38,566
of my first pet's name
plus my birthday
948
00:36:38,566 --> 00:36:40,833
or a favorite
video game or song.
949
00:36:40,833 --> 00:36:43,066
I don't think any hacker's
really gonna guess that.
950
00:36:43,066 --> 00:36:46,200
But you're telling me that there
is still a vulnerability there.
951
00:36:46,200 --> 00:36:48,133
Yes.
952
00:36:48,133 --> 00:36:49,733
Hackers can go
find out information
953
00:36:49,733 --> 00:36:50,933
about you from data brokers,
954
00:36:50,933 --> 00:36:54,233
including, you know, the name
of the street you grew up on
955
00:36:54,233 --> 00:36:57,266
or the city where
you went to college.
956
00:36:57,266 --> 00:37:01,233
So you wouldn't want a password
like the name of your pet.
957
00:37:01,233 --> 00:37:02,766
Or password123.
958
00:37:02,766 --> 00:37:07,300
Well, I did have a pet fish,
and his name was Password123.
959
00:37:07,300 --> 00:37:09,066
So, I guess that
kind of, like,
960
00:37:09,066 --> 00:37:11,433
checks both boxes and
makes me more vulnerable.
961
00:37:11,433 --> 00:37:16,700
Let's start by, uh, creating
some strong and unique passwords
962
00:37:16,700 --> 00:37:18,800
for all of your, uh,
your different accounts.
963
00:37:18,800 --> 00:37:20,833
And that will make them
much more hacker-proof.
964
00:37:20,833 --> 00:37:22,633
I'm all ears.
Fantastic.
965
00:37:22,633 --> 00:37:28,100
Well, we're going to start
by rolling these five dice.
966
00:37:29,400 --> 00:37:32,133
GALPERIN (voiceover):
There are easy ways
to create passwords
967
00:37:32,133 --> 00:37:34,566
that are long and strong
968
00:37:34,566 --> 00:37:37,100
and easy to remember that
are not based on information
969
00:37:37,100 --> 00:37:40,266
that attackers
can easily find about you.
970
00:37:40,266 --> 00:37:43,366
And the way that we do that
is we use word lists and dice.
971
00:37:44,266 --> 00:37:48,100
The word list is essentially
just a long list
972
00:37:48,100 --> 00:37:52,100
of dictionary words
with numbers attached.
973
00:37:52,100 --> 00:37:54,133
So what we're going
to do is we're going
974
00:37:54,133 --> 00:37:55,833
to write down the
numbers on these dice.
975
00:37:55,833 --> 00:37:58,766
45263, my new lucky numbers.
Okay.
976
00:37:58,766 --> 00:38:01,733
And now we are
going to look up
977
00:38:01,733 --> 00:38:05,600
45263 in this book.
978
00:38:05,600 --> 00:38:08,100
So there are words that
correlate to the numbers
979
00:38:08,100 --> 00:38:09,700
that I just randomly rolled.
Yes.
980
00:38:09,700 --> 00:38:12,066
Okay. I want a cool
word like "tiger."
981
00:38:12,066 --> 00:38:14,533
The word is
"presoak."
982
00:38:14,533 --> 00:38:16,033
"Presoak?"
"Presoak."
983
00:38:17,166 --> 00:38:21,400
34115.
984
00:38:21,400 --> 00:38:22,666
"Henna."
985
00:38:22,666 --> 00:38:24,333
Okay, I like this.
986
00:38:24,333 --> 00:38:26,200
PATEL (voiceover):
The idea behind this dice game
987
00:38:26,200 --> 00:38:29,133
is that
it's both random and long.
988
00:38:29,133 --> 00:38:32,133
Hackers using fast,
powerful computers
989
00:38:32,133 --> 00:38:34,366
and sophisticated
password-cracking software
990
00:38:34,366 --> 00:38:37,700
are able to try many,
many passwords a second.
991
00:38:37,700 --> 00:38:40,066
A recent study
by a security firm
992
00:38:40,066 --> 00:38:42,600
showed that a seven-character
password can be cracked
993
00:38:42,600 --> 00:38:44,700
in just a few seconds.
994
00:38:44,700 --> 00:38:46,266
But a 12-character password
995
00:38:46,266 --> 00:38:48,100
with uppercase letters
and symbols
996
00:38:48,100 --> 00:38:50,500
is much more difficult to break.
997
00:38:50,500 --> 00:38:52,100
Studies show
it could take hackers
998
00:38:52,100 --> 00:38:54,366
well over 100 years to crack.
999
00:38:54,366 --> 00:38:56,133
That's safe.
1000
00:38:56,133 --> 00:38:59,133
But also more difficult
to type or remember.
1001
00:38:59,133 --> 00:39:00,733
Eva's six-word random passphrase
1002
00:39:00,733 --> 00:39:04,066
is easier to remember than a
random string of 12 characters
1003
00:39:04,066 --> 00:39:06,466
and practically
impossible to break.
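To put rough numbers behind that comparison, here is some back-of-the-envelope arithmetic; the guessing rate is an assumed figure for illustration, not the security firm's methodology, and the exact times vary with the attacker's hardware.

```python
# Back-of-the-envelope arithmetic comparing password schemes.
# The guessing rate is an assumption for illustration, not a measured figure.
GUESSES_PER_SECOND = 1e10       # assumed: a well-resourced offline attacker

def seconds_to_crack(combinations):
    return combinations / GUESSES_PER_SECOND

schemes = [
    ("7 lowercase chars", 26 ** 7),      # 7 letters, lowercase only
    ("12 mixed chars", 95 ** 12),        # 12 characters from printable ASCII
    ("6 diceware words", 7776 ** 6),     # 6 words from a 7,776-word list
]

for label, space in schemes:
    secs = seconds_to_crack(space)
    years = secs / (60 * 60 * 24 * 365)
    print(f"{label:20s} {space:.2e} combinations, "
          f"~{secs:,.0f} s (~{years:,.1f} years) to exhaust")
```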
1004
00:39:06,466 --> 00:39:07,500
"Stinging."
1005
00:39:07,500 --> 00:39:08,700
"Ignition."
1006
00:39:11,166 --> 00:39:13,333
Six is "Clutch."
"Clutch," okay.
1007
00:39:13,333 --> 00:39:17,833
"Presoak Henna Stinging
Ignition Clutch Handbrake."
1008
00:39:17,833 --> 00:39:19,300
(bell dings)
Fantastic.
1009
00:39:19,300 --> 00:39:20,500
Now you have a passphrase.
1010
00:39:20,500 --> 00:39:21,500
These six words.
1011
00:39:21,500 --> 00:39:23,533
It is long. It's unique.
1012
00:39:23,533 --> 00:39:25,266
It's difficult to guess.
1013
00:39:25,266 --> 00:39:26,833
And it's relatively
easy to remember,
1014
00:39:26,833 --> 00:39:29,166
considering how long it is.
1015
00:39:29,166 --> 00:39:31,266
It is a very good way of
creating random passwords,
1016
00:39:31,266 --> 00:39:34,733
but a lot of password managers
will automatically
1017
00:39:34,733 --> 00:39:36,566
do this for you.
1018
00:39:36,566 --> 00:39:39,600
PATEL (voiceover):
Once you have a passphrase,
you can use this
1019
00:39:39,600 --> 00:39:42,800
to unlock a password manager
that generates
1020
00:39:42,800 --> 00:39:47,100
strong random passwords
and stores them securely.
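For anyone who would rather let software roll the dice, Eva's method is easy to mimic in a few lines of Python; the short word list below is a made-up stand-in for the real 7,776-entry diceware list, and the `secrets` module supplies the randomness the physical dice provide.

```python
# A minimal diceware-style passphrase generator. `secrets` provides
# cryptographically strong randomness, playing the role of the dice.
import secrets

# Stand-in word list: the real diceware list has 7,776 entries, one for
# every possible roll of five dice (11111 through 66666).
WORDLIST = ["presoak", "henna", "stinging", "ignition", "clutch",
            "handbrake", "tiger", "asteroid", "lantern", "pretzel"]

def diceware_passphrase(n_words=6, wordlist=WORDLIST):
    # Each word is picked uniformly at random, so the phrase's strength
    # comes from (list length) ** n_words possible combinations.
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))

print(diceware_passphrase())
# With the full 7,776-word list there are 7776 ** 6 (about 2.2e23)
# possible six-word phrases, which is what makes them so hard to guess.
```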
1021
00:39:47,100 --> 00:39:51,233
Now I have this beautiful
random password or passphrase,
1022
00:39:51,233 --> 00:39:53,500
but what happens
if someone steals this?
1023
00:39:53,500 --> 00:39:55,700
Is it just game over,
efforts are gone?
1024
00:39:55,700 --> 00:39:56,833
No, it is not.
1025
00:39:56,833 --> 00:40:00,100
Uh, that leads us to the
next step on your surveillance
1026
00:40:00,100 --> 00:40:03,400
self-defense journey, and that
is two-factor authentication.
1027
00:40:04,400 --> 00:40:07,800
PATEL (voiceover):
This may sound
like a hacker ninja term,
1028
00:40:07,800 --> 00:40:10,100
but it's just
an added layer of security,
1029
00:40:10,100 --> 00:40:13,833
requiring an additional method
of confirming your identity.
1030
00:40:13,833 --> 00:40:16,433
Most of the time,
it's just a simple text message.
1031
00:40:16,433 --> 00:40:17,633
GALPERIN:
Because it requires
1032
00:40:17,633 --> 00:40:19,233
that you have two things
1033
00:40:19,233 --> 00:40:20,600
in order to log
into your account.
1034
00:40:20,600 --> 00:40:24,200
You will need both your password
and also this code,
1035
00:40:24,200 --> 00:40:27,800
so the site
will send you a text message.
1036
00:40:27,800 --> 00:40:30,600
There is also
an authenticator app.
1037
00:40:30,600 --> 00:40:35,166
The site will ask you to take
a photo of a QR code,
1038
00:40:35,166 --> 00:40:37,566
which links the authenticator
app to the site.
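As an illustration of what the authenticator app is doing after that QR code is scanned: the sketch below uses the third-party `pyotp` package (an assumption, installed separately) with a made-up account name to show the shared secret and the rolling six-digit codes that both sides derive from it.

```python
# A minimal sketch of the time-based one-time passwords (TOTP) behind
# most authenticator apps, using the third-party `pyotp` package.
# The secret and account label below are made up for illustration.
import pyotp

# When you scan the QR code, the site is really handing the app
# a shared secret like this one, encoded in the QR image.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# The QR code itself encodes a provisioning URI such as:
print(totp.provisioning_uri(name="alok@example.com", issuer_name="ExampleSite"))

# From then on, the app and the site independently compute the same
# six-digit code from the secret plus the current time, no text message needed.
code = totp.now()
print("current code:", code)
print("site accepts it:", totp.verify(code))   # True within the time window
```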
1039
00:40:37,566 --> 00:40:40,166
The best way to be
a security superhero
1040
00:40:40,166 --> 00:40:43,100
is to find the stuff
that works for you
1041
00:40:43,100 --> 00:40:44,833
and implement it
as sort of smoothly
1042
00:40:44,833 --> 00:40:47,433
and seamlessly as you can.
1043
00:40:48,500 --> 00:40:50,800
PATEL (voiceover):
These data self-defense
tweaks are great
1044
00:40:50,800 --> 00:40:52,833
for me and my own data.
1045
00:40:52,833 --> 00:40:55,066
But what about everyone else?
1046
00:40:55,066 --> 00:40:58,833
I'm wondering, is there a way
for all of us to share data
1047
00:40:58,833 --> 00:41:00,833
and get all the sweet benefits,
1048
00:41:00,833 --> 00:41:02,733
without sacrificing
privacy and security?
1049
00:41:05,100 --> 00:41:06,600
At M.I.T.'s Media Lab,
1050
00:41:06,600 --> 00:41:09,300
I'm meeting up
with Ramesh Raskar,
1051
00:41:09,300 --> 00:41:12,066
who is working on a way
to access the benefits
1052
00:41:12,066 --> 00:41:13,533
of personal health data--
1053
00:41:13,533 --> 00:41:15,533
finally, a topic I understand--
1054
00:41:15,533 --> 00:41:18,266
without compromising
our private information.
1055
00:41:19,400 --> 00:41:21,433
At Ramesh's Camera Culture Lab,
1056
00:41:21,433 --> 00:41:23,500
they're building
some data collecting tools
1057
00:41:23,500 --> 00:41:26,233
that seem straight out of
a science fiction movie.
1058
00:41:26,233 --> 00:41:29,333
PATEL:
Love when a sign says
"Danger: Laser."
1059
00:41:29,333 --> 00:41:31,266
You know important things
are happening in here.
1060
00:41:31,266 --> 00:41:35,766
(voiceover):
They've got cameras that can
literally see around corners.
1061
00:41:35,766 --> 00:41:37,566
PATEL:
Get out!
(both laughing)
1062
00:41:37,566 --> 00:41:40,033
RASKAR:
We are not violating
the laws of physics.
1063
00:41:40,900 --> 00:41:44,233
PATEL (voiceover):
And cameras that can see
inside our own bodies.
1064
00:41:44,233 --> 00:41:47,133
That's about as personal
as it gets.
1065
00:41:47,133 --> 00:41:51,566
For Ramesh, protecting
patient privacy is paramount.
1066
00:41:51,566 --> 00:41:54,066
But he's found a drawback
in how we do that.
1067
00:41:54,066 --> 00:41:56,200
Since data is locked up
for privacy,
1068
00:41:56,200 --> 00:42:00,066
advancement in medical science
is not as fast as it could be.
1069
00:42:00,066 --> 00:42:03,466
Because, think about it, access
to tons of personal health data
1070
00:42:03,466 --> 00:42:06,233
could help researchers make
major medical breakthroughs.
1071
00:42:06,233 --> 00:42:10,166
But today, researchers have
to ask for your consent
1072
00:42:10,166 --> 00:42:13,133
to peek at your data
in order to learn from it.
1073
00:42:13,133 --> 00:42:14,566
RASKAR:
When we talk about consent,
1074
00:42:14,566 --> 00:42:17,066
someone's still peeking
into your data.
1075
00:42:17,066 --> 00:42:19,433
A no-peek privacy,
on the other hand,
1076
00:42:19,433 --> 00:42:21,133
is where nobody
can peek at your data.
1077
00:42:21,933 --> 00:42:23,633
PATEL (voiceover):
No-peek privacy?
1078
00:42:23,633 --> 00:42:25,466
But how could
researchers learn anything?
1079
00:42:25,466 --> 00:42:27,233
RASKAR:
There are many ways
1080
00:42:27,233 --> 00:42:29,500
to create no-peek privacy.
1081
00:42:29,500 --> 00:42:33,300
First one is the notion
of smashing information.
1082
00:42:36,733 --> 00:42:39,200
PATEL (voiceover):
No, not literally
smashing your phone.
1083
00:42:39,200 --> 00:42:41,133
But I'm not gonna lie...
1084
00:42:42,700 --> 00:42:44,833
...that was really fun.
1085
00:42:44,833 --> 00:42:47,366
Smashing is basically
the idea of taking raw data
1086
00:42:47,366 --> 00:42:49,666
and smashing it
into just the wisdom.
1087
00:42:50,733 --> 00:42:53,066
PATEL (voiceover):
According to Ramesh,
smashing data
1088
00:42:53,066 --> 00:42:54,266
is simply the process
1089
00:42:54,266 --> 00:42:56,466
of extracting
the useful information,
1090
00:42:56,466 --> 00:42:58,166
which he calls the "wisdom,"
1091
00:42:58,166 --> 00:43:00,466
while obscuring
the private data.
1092
00:43:00,466 --> 00:43:03,833
In other words, a collection
of private health records
1093
00:43:03,833 --> 00:43:06,133
contains two kinds of data.
1094
00:43:06,133 --> 00:43:08,700
One kind is the personal data--
1095
00:43:08,700 --> 00:43:10,833
the names,
conditions, and histories,
1096
00:43:10,833 --> 00:43:13,800
the stuff we absolutely
want to protect.
1097
00:43:13,800 --> 00:43:19,133
But collectively, the records
may contain valuable information
1098
00:43:19,133 --> 00:43:20,166
about patterns in health care.
1099
00:43:20,166 --> 00:43:22,533
The wisdom.
1100
00:43:22,533 --> 00:43:24,733
I'm thinking about
examples in health care,
1101
00:43:24,733 --> 00:43:28,300
and the individual data,
the patient data
1102
00:43:28,300 --> 00:43:31,133
is protected, and
you can't reverse engineer it.
1103
00:43:31,133 --> 00:43:33,666
Let's take a concrete example
of COVID.
1104
00:43:35,200 --> 00:43:37,666
Imagine during the early days
of the pandemic,
1105
00:43:37,666 --> 00:43:39,266
we could have sent
medical experts
1106
00:43:39,266 --> 00:43:42,466
from here in Boston
to China, to Korea, to Italy,
1107
00:43:42,466 --> 00:43:45,333
and embedded them in
those hospitals to understand
1108
00:43:45,333 --> 00:43:48,700
what COVID pneumonia looks like
on a chest X-ray.
1109
00:43:48,700 --> 00:43:50,433
And they could have
come back to Boston,
1110
00:43:50,433 --> 00:43:52,066
all those medical experts,
1111
00:43:52,066 --> 00:43:55,700
and said, "Hey, together,
we can figure out what this is."
1112
00:43:56,600 --> 00:44:00,300
PATEL (voiceover):
So, the experts would return
back with just the wisdom,
1113
00:44:00,300 --> 00:44:02,066
an understanding of COVID
1114
00:44:02,066 --> 00:44:04,700
derived from being exposed
to large data sets.
1115
00:44:04,700 --> 00:44:08,100
But they wouldn't come back
with specific patient info.
1116
00:44:08,100 --> 00:44:12,166
None of the raw, sensitive,
and private data.
1117
00:44:12,166 --> 00:44:14,133
But of course,
that would require
1118
00:44:14,133 --> 00:44:16,700
initially sharing patient data;
1119
00:44:16,700 --> 00:44:19,166
a non-starter.
1120
00:44:19,166 --> 00:44:21,166
Because of privacy,
because of regulations,
1121
00:44:21,166 --> 00:44:23,266
because of national security
issues, they couldn't.
1122
00:44:24,433 --> 00:44:27,433
PATEL (voiceover):
This is where A.I.,
Artificial Intelligence,
1123
00:44:27,433 --> 00:44:28,666
comes in.
1124
00:44:28,666 --> 00:44:31,100
Instead of sending
experts to each hospital,
1125
00:44:31,100 --> 00:44:34,400
Ramesh is working on a way
to send A.I. to learn instead.
1126
00:44:35,200 --> 00:44:38,166
An A.I. model could be trained
on patients' lung X-rays
1127
00:44:38,166 --> 00:44:40,266
to learn what signs of COVID
look like.
1128
00:44:40,266 --> 00:44:44,166
The private health data
would never be copied
1129
00:44:44,166 --> 00:44:46,666
and would never leave the
hospital or the patient's file.
1130
00:44:46,666 --> 00:44:49,366
The A.I. would only
transmit its conclusions,
1131
00:44:49,366 --> 00:44:52,066
the wisdom, the smashed data.
1132
00:44:52,066 --> 00:44:53,300
RASKAR:
It's not enough to just
1133
00:44:53,300 --> 00:44:56,133
remove the name
from the chest X-ray,
1134
00:44:56,133 --> 00:44:58,066
but make sure
that you don't send
1135
00:44:58,066 --> 00:45:00,066
any of the pixels
of the chest X-ray at all.
1136
00:45:01,000 --> 00:45:04,066
PATEL (voiceover):
The A.I. can learn patterns
and gain knowledge
1137
00:45:04,066 --> 00:45:06,200
without having to keep hold
of everyone's data.
1138
00:45:06,200 --> 00:45:09,200
So the "wisdom"
from one hospital
1139
00:45:09,200 --> 00:45:11,700
can be combined
with the "wisdom" of others.
1140
00:45:11,700 --> 00:45:16,300
RASKAR:
Achieving privacy and benefits
of the data simultaneously,
1141
00:45:16,300 --> 00:45:19,500
it's like having a cake and
eating it, too; it's available.
1142
00:45:19,500 --> 00:45:22,233
And it's just a matter of
convincing large companies
1143
00:45:22,233 --> 00:45:24,566
to play by those rules.
1144
00:45:24,566 --> 00:45:28,366
PATEL:
Smashed data is one way
we can learn to stop worrying
1145
00:45:28,366 --> 00:45:31,100
and love sharing
our personal data.
1146
00:45:31,100 --> 00:45:34,066
But every corporate network
would have to agree
1147
00:45:34,066 --> 00:45:36,666
to smash our raw data.
1148
00:45:38,066 --> 00:45:40,100
I'm not convinced
we should wait around
1149
00:45:40,100 --> 00:45:42,400
for big tech companies
to do this.
1150
00:45:43,200 --> 00:45:44,833
HILL:
The big technology companies,
1151
00:45:44,833 --> 00:45:47,800
they have built
the infrastructure
1152
00:45:47,800 --> 00:45:48,833
of the whole internet.
1153
00:45:48,833 --> 00:45:51,333
You're not just interacting
with these companies
1154
00:45:51,333 --> 00:45:53,033
when you know you're interacting
with them.
1155
00:45:54,266 --> 00:45:56,133
PATEL:
Even if it seems like you're on
1156
00:45:56,133 --> 00:46:00,200
a totally independent website,
big tech still sees you.
1157
00:46:00,200 --> 00:46:03,100
For example, tons of websites,
1158
00:46:03,100 --> 00:46:06,233
like some of the ones owned
by NASA, Pfizer, BMW,
1159
00:46:06,233 --> 00:46:09,466
even PBS,
use Amazon Web Services,
1160
00:46:09,466 --> 00:46:13,633
which means Amazon gets
some data when you visit them.
1161
00:46:13,633 --> 00:46:16,233
Or when you open
any app on your iPhone,
1162
00:46:16,233 --> 00:46:18,100
Apple knows.
1163
00:46:18,100 --> 00:46:20,233
Also, every time
you've been to a webpage
1164
00:46:20,233 --> 00:46:22,466
that has
a Facebook "like" button,
1165
00:46:22,466 --> 00:46:25,600
Facebook gets to gobble up
your personal data.
1166
00:46:25,600 --> 00:46:27,166
KAHLE:
So it may seem
1167
00:46:27,166 --> 00:46:28,400
like the web is decentralized
1168
00:46:28,400 --> 00:46:29,833
because it comes from
many different places.
1169
00:46:29,833 --> 00:46:30,733
But in fact,
1170
00:46:30,733 --> 00:46:33,466
there's centralized points
of control.
1171
00:46:34,266 --> 00:46:37,666
CHOWDHURY:
Right now, for example, when
you are in a social media app,
1172
00:46:37,666 --> 00:46:38,833
you're sort of locked in, right?
1173
00:46:38,833 --> 00:46:40,500
All your information
is on there.
1174
00:46:40,500 --> 00:46:42,200
It's hard to leave a social
media app
1175
00:46:42,200 --> 00:46:43,300
because you're like,
"All my friends are there,
1176
00:46:43,300 --> 00:46:44,700
all my photos are there.
1177
00:46:44,700 --> 00:46:46,400
"If I move to another app,
1178
00:46:46,400 --> 00:46:47,733
I have to rebuild
that community."
1179
00:46:48,666 --> 00:46:52,066
GALPERIN:
Those social media websites
are controlled
1180
00:46:52,066 --> 00:46:53,566
by a very small
number of companies.
1181
00:46:53,566 --> 00:46:56,100
And the rules about
using those websites,
1182
00:46:56,100 --> 00:47:00,566
who has access to them and what
kind of behavior is acceptable
1183
00:47:00,566 --> 00:47:04,100
on them, are set by
the websites themselves.
1184
00:47:05,233 --> 00:47:10,266
PATEL:
This centralized data monopoly,
how do we start to dismantle it?
1185
00:47:10,266 --> 00:47:12,566
That's exactly what
proponents of an idea
1186
00:47:12,566 --> 00:47:16,333
called the decentralized web
want to do.
1187
00:47:17,200 --> 00:47:19,833
GALPERIN:
The decentralized web
is sort of our antidote,
1188
00:47:19,833 --> 00:47:24,166
the antithesis of the web
as we now think of it.
1189
00:47:24,166 --> 00:47:26,533
CHRISTINE LEMMER-WEBBER:
The decentralized web
can be seen in contrast
1190
00:47:26,533 --> 00:47:28,700
to the centralized web,
of course.
1191
00:47:28,700 --> 00:47:31,433
Instead of what the internet has become,
1192
00:47:31,433 --> 00:47:35,233
with a few big
players kind of controlling
1193
00:47:35,233 --> 00:47:36,733
a lot of the web
and the internet space,
1194
00:47:36,733 --> 00:47:38,733
we're really kind of
rolling things back
1195
00:47:38,733 --> 00:47:40,500
to kind of what
the vision of the internet was.
1196
00:47:41,500 --> 00:47:44,133
PATEL:
I'm wondering
what a decentralized web
1197
00:47:44,133 --> 00:47:46,300
would look and feel like?
1198
00:47:46,300 --> 00:47:49,733
So I'm meeting up with coder
Christine Lemmer-Webber,
1199
00:47:49,733 --> 00:47:51,500
who is working on writing
the standards and rules
1200
00:47:51,500 --> 00:47:55,633
for how social media could work
on a decentralized web.
1201
00:47:55,633 --> 00:47:57,300
LEMMER-WEBBER:
Most people are familiar
1202
00:47:57,300 --> 00:47:59,300
with social networks:
you know, they've used,
1203
00:47:59,300 --> 00:48:01,266
you know, X-slash-Twitter;
1204
00:48:01,266 --> 00:48:03,333
they've used Facebook;
they've used Instagram.
1205
00:48:03,333 --> 00:48:06,100
The decentralized social web is
1206
00:48:06,100 --> 00:48:09,066
like that, but no one company,
1207
00:48:09,066 --> 00:48:10,833
no one gatekeeper
controls the thing.
1208
00:48:10,833 --> 00:48:13,100
We want to be able to have
1209
00:48:13,100 --> 00:48:15,066
all of our different
social media sites,
1210
00:48:15,066 --> 00:48:17,300
all of our different
online communication tools
1211
00:48:17,300 --> 00:48:19,400
be able to talk to each other.
1212
00:48:19,400 --> 00:48:20,633
DOCTOROW:
A decentralized web
1213
00:48:20,633 --> 00:48:23,200
is one that is focused on
1214
00:48:23,200 --> 00:48:25,166
the self-determination of users.
1215
00:48:25,166 --> 00:48:26,300
You're not locked in.
1216
00:48:27,533 --> 00:48:30,266
So you find a service,
and you like it.
1217
00:48:30,266 --> 00:48:31,566
(duck quacks)
1218
00:48:31,566 --> 00:48:33,333
And then you and they
start to part ways.
1219
00:48:34,133 --> 00:48:37,833
The art you made,
the stories you told are there.
1220
00:48:37,833 --> 00:48:41,133
In a decentralized web,
you go somewhere else.
1221
00:48:41,133 --> 00:48:43,766
You go somewhere else,
and you don't lose any of that.
1222
00:48:47,666 --> 00:48:51,166
PATEL (voiceover):
But imagine, instead of having
one Facebook-type company,
1223
00:48:51,166 --> 00:48:54,100
we each had our own data
live on our own devices,
1224
00:48:54,100 --> 00:48:56,700
and it was easy to join
or merge with others,
1225
00:48:56,700 --> 00:48:58,433
or disconnect from them.
1226
00:48:58,433 --> 00:49:00,700
Leaving one group
and joining another
1227
00:49:00,700 --> 00:49:02,833
wouldn't mean rebuilding
everything from scratch
1228
00:49:02,833 --> 00:49:06,100
because
your data moves with you.
1229
00:49:07,200 --> 00:49:09,066
It sounds like a tech fantasy.
1230
00:49:09,066 --> 00:49:11,133
But it's actually something
we've already proven
1231
00:49:11,133 --> 00:49:12,433
works online.
1232
00:49:12,433 --> 00:49:16,066
Consider something you may use
every day,
1233
00:49:16,066 --> 00:49:19,100
or if you're like me,
every minute: email.
1234
00:49:19,100 --> 00:49:21,066
LEMMER-WEBBER:
So what a lot of people
don't realize
1235
00:49:21,066 --> 00:49:22,833
is that they use decentralized
networks every day.
1236
00:49:22,833 --> 00:49:24,833
It doesn't matter
if somebody's on Gmail,
1237
00:49:24,833 --> 00:49:26,366
or if they're on Hotmail,
1238
00:49:26,366 --> 00:49:28,100
or they're on
their university email.
1239
00:49:28,100 --> 00:49:29,333
They don't even
have to think about it.
1240
00:49:29,333 --> 00:49:31,133
They type an email
to their friend,
1241
00:49:31,133 --> 00:49:33,266
and they send it off,
and it gets there, right?
1242
00:49:33,266 --> 00:49:35,200
At the heart of it,
that's kind of the basics
1243
00:49:35,200 --> 00:49:36,233
of what we're doing.
1244
00:49:37,533 --> 00:49:41,166
PATEL:
In other words, you can get
an email sent from Hotmail
1245
00:49:41,166 --> 00:49:42,766
even if you have
a Gmail account.
1246
00:49:42,766 --> 00:49:46,466
Or you can create your own
email server, for that matter.
1247
00:49:46,466 --> 00:49:49,566
It doesn't matter what
email service you use
1248
00:49:49,566 --> 00:49:52,233
because the protocol,
the rules, are written
1249
00:49:52,233 --> 00:49:55,666
to allow the email servers
to talk to one another.
1250
00:49:56,500 --> 00:49:58,733
LEMMER-WEBBER:
That's because there's
a shared protocol that says,
1251
00:49:58,733 --> 00:50:01,333
here's how we get the messages
from place to place.
1252
00:50:01,333 --> 00:50:03,766
Social networks
could just work like email.
1253
00:50:04,566 --> 00:50:07,066
PATEL:
And in fact, there are plenty
of decentralized projects
1254
00:50:07,066 --> 00:50:08,700
out there like Mastodon
1255
00:50:08,700 --> 00:50:09,800
and Bluesky
1256
00:50:09,800 --> 00:50:11,700
putting users'
profiles and data
1257
00:50:11,700 --> 00:50:13,066
back into the users' hands.
1258
00:50:13,966 --> 00:50:15,666
LEMMER-WEBBER:
In the systems we're
moving towards
1259
00:50:15,666 --> 00:50:17,366
that will be much more the case
1260
00:50:17,366 --> 00:50:19,633
where private communication
is much more private.
1261
00:50:19,633 --> 00:50:22,666
They're, by default,
secure in that way.
1262
00:50:22,666 --> 00:50:25,700
We're building new foundations
1263
00:50:25,700 --> 00:50:28,166
for the internet
that allow for healthier,
1264
00:50:28,166 --> 00:50:32,400
safer, more decentralized
systems, gatekeeper-free.
1265
00:50:32,400 --> 00:50:35,400
That's the vision and the future
we're trying to build.
1266
00:50:36,833 --> 00:50:40,366
PATEL:
No doubt,
there are secrets in your data.
1267
00:50:40,366 --> 00:50:43,700
NOBLE:
Everything we touch
is increasingly datafied,
1268
00:50:43,700 --> 00:50:46,066
every dimension of our lives.
1269
00:50:46,066 --> 00:50:49,200
CHOWDHURY:
All of this data
goes into feed algorithms.
1270
00:50:49,200 --> 00:50:51,666
And algorithms make predictions
1271
00:50:51,666 --> 00:50:53,133
about how long you will live,
1272
00:50:53,133 --> 00:50:55,200
or your health,
or your lifestyle.
1273
00:50:56,033 --> 00:50:58,166
PATEL (voiceover):
And while I learned
how to protect
1274
00:50:58,166 --> 00:51:00,300
my personal privacy
and security,
1275
00:51:00,300 --> 00:51:03,133
there's only so much
I can do myself.
1276
00:51:03,133 --> 00:51:04,533
HILL:
There are things
that individuals
1277
00:51:04,533 --> 00:51:06,466
can do to protect their privacy.
1278
00:51:06,466 --> 00:51:08,266
Ultimately, in order to
1279
00:51:08,266 --> 00:51:12,000
avoid a huge crisis,
it requires systemic change.
1280
00:51:12,900 --> 00:51:17,100
GALPERIN:
There are limits to
what we can do as individuals.
1281
00:51:17,100 --> 00:51:19,333
There are also things
that need to be addressed
1282
00:51:19,333 --> 00:51:21,700
through regulation,
through litigation,
1283
00:51:21,700 --> 00:51:26,333
and through legislation in
order to make all of us safer.
1284
00:51:26,333 --> 00:51:29,566
PATEL:
It's comforting to know
there are techies out there
1285
00:51:29,566 --> 00:51:32,466
trying to protect us
and our data,
1286
00:51:32,466 --> 00:51:35,100
designing new ways to enjoy
the fun and convenience
1287
00:51:35,100 --> 00:51:38,333
of the digital world
without being exploited.
1288
00:51:38,333 --> 00:51:42,133
KAHLE:
New technologies,
new games can proliferate
1289
00:51:42,133 --> 00:51:43,833
without centralized
points of control.
1290
00:51:43,833 --> 00:51:47,200
That's the excitement
of the decentralized web.
1291
00:51:47,200 --> 00:51:50,566
LEMMER-WEBBER:
I can't wait for people
to see this vision
1292
00:51:50,566 --> 00:51:54,100
we're building for
collaborative, more consensual,
1293
00:51:54,100 --> 00:51:55,700
decentralized social networks.
1294
00:51:55,700 --> 00:51:58,600
That's going to be
really exciting.
1295
00:51:59,400 --> 00:52:01,733
PATEL:
The future we're building
might not look too different
1296
00:52:01,733 --> 00:52:04,566
from the internet of today.
1297
00:52:04,566 --> 00:52:07,133
But it could be much more
private and secure,
1298
00:52:07,133 --> 00:52:10,266
if we get it right.
1299
00:52:10,266 --> 00:52:13,233
DOCTOROW:
This is about how technological
self-determination
1300
00:52:13,233 --> 00:52:15,500
and regulation work together
1301
00:52:15,500 --> 00:52:19,066
to make a world that is
more safe for human habitation.
1302
00:52:41,100 --> 00:52:44,166
♪ ♪
1303
00:52:45,100 --> 00:52:52,433
♪ ♪
1304
00:52:56,266 --> 00:53:04,066
♪ ♪
1305
00:53:07,700 --> 00:53:15,233
♪ ♪
1306
00:53:17,066 --> 00:53:24,400
♪ ♪
1307
00:53:25,833 --> 00:53:33,566
♪ ♪