All language subtitles for A.Brief.History.of.the.Future.S01E02.1080p.WEBRip.x264-BAE_track3_[eng]

[Waves crashing]

Man, voice-over: A few summers ago, I was at the beach with my family. And right when we got there, we took off our shoes and ran to the ocean's edge. And we've all experienced this before. The waves kind of come in. And as they wash out, we feel ourselves sinking because the sand beneath us is getting wet. But at the same time, if we're looking out to the horizon line, it doesn't seem like much changes. It's very discombobulating.

Marine biologists call this area that is sometimes above water and sometimes below water the intertidal zone. In many ways, that's where we are right now as a society. We're in an intertidal moment. You feel like everything's in some ways the same. You still wake up. You're still you. But there's a shift, and you can't quite put your finger on it. It's really a crossroads moment for humanity.

Imagine, if you will, sitting down to your morning coffee, turning on your home computer to read the day's newspaper. Well, it's not as far-fetched as it may seem.
Woman 2: Well, I think what we're seeing is a new digital wild west where no one is in charge.

Thank you for the likes! Let's get to 40,000.

[All clamoring]

Man: This is freaking crazy.

Woman: More than 3 billion people in almost 70 countries and territories have been asked to stay at home.

Joe Biden: The question is--
Donald Trump: The radical left--
Biden: Will you shut up, man?
Trump: Listen.

[All clamoring]

Woman: Abnormal behaviors mean more panic, aggression, confusion, or anxiety during waking hours.

♪

Man: I don't believe we've even seen the tip of the iceberg. I think we're really on the verge of something wonderful and terrifying.

[Thunder]

[Emergency alert system beeping]

Man: I can hear you. I think it's a filter.

Man 2: Yes. I'm here live. I'm not a cat.

[Rumbling]

Man: How come you're smoking weed in the Capitol?

Man 2: Because I can.

♪

Woman: Do you feel like too much is changing too fast?
[Grimes' "Oblivion" playing]

♪ Ooh ♪
♪ Ooh ♪
♪ I never walk about after dark ♪
♪ It's my point of view ♪
♪ Someone could break your neck ♪
♪ Coming up behind you ♪
♪ Always coming and you never have a clue ♪
♪ I never look behind all the time ♪
♪ I will wait forever ♪
♪ Always looking straight ♪
♪ Thinking, counting all the hours you wait ♪

[Car horns honking]

Man: I use the term the intertidal moment to kind of describe where we are in the current arc of human history. What sets this apart from almost any other intertidal that has come before is that this is probably the first time that we can actually recognize that we're in an intertidal. So, instead of it just kind of happening and everyone feeling discombobulated, we all feel something is not working. And at the same time, we're grappling and looking for something else.

Hello there. I'm curious about how we can find and embrace the creative potential of this moment.
But first, we have to get a bearing on what moment we're actually in. That's led me here to Columbia University, where I sat down with a group of graduate students training to become the leaders of tomorrow.

So, it's 2040, and you're being asked to kind of describe this moment in human history, in the big scope. Talk to each other. Talk to me about how you're going to describe what it is that we're going through right now.

Coming out of the pandemic, there's a lot of confusion around, well, if we're coming out of this, what are we going into?

The world that we knew before is just much different. What is, you know, that next step? What's the life that we're moving towards? Nobody really knows. Like, from individuals to business leaders to government leaders, we're all here just really trying to figure it out. And that sense of leadership, as this is who I want to follow to get there, is probably more unclear now than it maybe has ever been.
I think this is an interesting moment of, like, excitement and opportunity, but it's all founded in a level of uncertainty that I have not been in before.

There were a lot of things that I think we took for granted, a lot of certainties that we had taken for granted, that I think have all bubbled up to the top right now in terms of, you know, where is there security? What does it look like for something to function correctly for everyone? What is it that we are taking for granted?

Just revenue building can't be the bottom line. That individualistic mindset can't be it anymore. So, in 2040, if I looked back and I explained to my kids hopefully what this time looked like, I think it was a time of opportunity with responsibility.

I feel like also climate-wise, this is-- I mean, these years are really going to be, like, deciding about our future, right? So, by 2040, hopefully, we'll be looking back and say we made some very wise decisions in that regard, and we won't have to tell our kids, well, sorry, we, like, did it wrong. I really do hope that.
And I think with regards to that, there are a lot of sort of turning points that we are able to shape right now, which gives us the responsibility you mentioned and which means that you have to think about how you want to shape the future.

♪

Wallach, voice-over: When things are stable, it feels safe. But that's the opposite of what the world feels like right now. From our lives to our own jobs to our families to our country to the climate to the politics, it feels unsteady, it feels in flux. When that happens, your brain, your amygdala, is going to say, "This is not safe." I don't feel like we're in a stable place.

Man: You look at any long form of humanity over thousands and thousands or hundreds of thousands of years, if you go way back to the beginning, and it's flat. It's flat in terms of population. It's flat in terms of technology. It's flat in terms of communication. It's flat. And then for the last eye blink, it explodes.
♪

You've never had a period on a planet like the last 50 years, really half a century of unprecedented human progress. Education levels increased. Infant mortality reduced. Life expectancy grew extraordinarily all over the world. Now, there were costs to that globalization. We are now at the beginning of a new globalization in ways that even a year ago, never mind 20, seemed inconceivable, and, of course, I'm talking about the AI revolution.

We're living through a moment of extraordinary change. Even good change can be hard. The information environment that we're living in, it is completely surrounding us. It's coming at us 24/7. And unfortunately, a lot of times, this information that comes at us is negative. It stokes fear and anxiety. But for young people in particular, there is more and more data that we have that is telling us that many young people are in fact harmed. There are 3 numbers that really stick out to me. If you look at a high school with 1,000 kids in it, about 450 of those children are feeling persistently sad or hopeless.
200 of those children have considered taking their own life. And 100 of those kids have attempted suicide. As much as we're struggling right now, as much as our kids are in a mental health crisis, it does not have to be this way. There's a choice we have between a world where people are increasingly in despair and a world where people are connected to one another, where we look at the future and see possibility.

♪

Wallach, voice-over: Life has always been full of change. But the growing sense that we are entering uncharted waters is being felt around the world in unprecedented ways right now. We're living in a time between times, when what was is no longer working and what will be has yet to be born. But what happens in a moment when the usual shifts we experience all the time in one industry or culture become heightened and intertwined? What does it take to navigate through a period when the degree of complexity and confusion in our lives feels like it's turned up to 11?
And what kind of stress is this all putting on our brain's ability to make sense of what we're living through?

Man: So, Ari, we're going to do a neurofunctional assessment. This is called a TheraQ. It's picking up brainwave activity. There we go. Beautiful.

Voice: This will only take a few minutes. Your assessment is starting now. Close your eyes.

♪

Wallach, voice-over: Dr. Brown has spent his life following the effects the modern world is having on us, through traditional psychology as well as neurofeedback systems that are getting increasingly powerful at monitoring our brains' response to the pressures of this moment we find ourselves in.

One of the things that I'm curious about is how you see this current moment. And by that, I mean this current moment for humanity writ large.

Broadly, I think we're in a time that has moved more and more towards kind of the atomization of individuals.
There are stressors that are putting demands on our bodies and on our brains that are in turn affecting the way we live our lives and the way we live together.

♪

One way of thinking about this is that our bodies and our brains have a blueprint that was laid down for what was useful to survive 100,000 years ago. And back then, what you were trying to do was not be eaten by a bear or a saber-toothed tiger, depending on how far back you go.

♪

So, everything in our evolution was shaping us towards developing an effective fight or flight system, dealing with physical threat, immediate threat. And that's not very well suited to our life now. Instead, we have stressors and demands that are longer term, that are chronic, and our bodies aren't really designed for that. Our brains aren't really designed for that.
So, what happens when you take someone who's wired for fight or flight, you know, hundreds of thousands of years of evolution, but you stick them in a cubicle, or you stick them, you know, on a factory floor, doing the same thing over and over again, or you put them in a classroom for 8 hours a day, sitting at a desk?

I think we perceive a lot of threat right now in our world. And it's not often the kind of threat that we're designed to deal with. We're designed to deal with concrete, time-limited threat. What we have is diffuse, something bad's going to happen, I don't know what, kind of threat. And it's not time-limited. It's ongoing.

Wallach, voice-over: A lot of us feel that right now, this kind of low-grade sense of fear and uncertainty about where we're headed. And it can leave us feeling powerless over what comes next.

[Wind howling]

But what do we do when some of these threats are not imagined, but rather painfully real? One of those is a threat to the natural world around us.
The systems that sustain all life on this planet are warning us that things are not OK. And yet, if you're like me, it's so easy to feel helpless in the face of a challenge this big. That led me here to the northeast coast of Canada, where Valérie Courtois leads a growing group of people who refuse to ignore this threat right here, in one of the most ecologically important places on Earth.

Tell me about this land. Where are we right now?

We're in Nitassinan, which is the Innu word for our land, or the place of the Innu, specifically known as Labrador today. We're at the foot of the Mealy Mountains. Interestingly, the Mealy Mountains is a joint park between the Innu Nation and Parks Canada. It's the largest intact forest left on this planet. It is home to over 5 billion birds. It's got a quarter of the world's wetlands, a fifth of the world's freshwater. It actually absorbs twice as much carbon as tropical forest per hectare. And so, in terms of climate regulation, this is the most important terrestrial landscape on the planet.
There's a movement afoot, several movements afoot around the planet, but especially here for First Nations and obviously in Indigenous groups, to actually have a much stronger part in protecting and preserving these lands. Why is that so important?

Well, you know, 80% of the world's remaining biodiversity is on lands that are loved by Indigenous peoples. And that's not an accident. It's because we know that we're responsible for those landscapes. And unfortunately, Western society has lost its way. And so, we're finding that more people are looking to Indigenous peoples and looking to us for new ways of thinking about that relationship.

How do you feel about the biodiversity loss that these lands are going through right now?

I feel grief. I feel a loss of responsibility, and I feel guilt that we've gotten to this place. But I also know that not all is lost. You know, the world is resilient. The land is resilient. We are resilient.
♪

Wallach, voice-over: Valérie has spent years lobbying the Canadian government to invest in protecting these ecologically rich environments. And her work led to the creation of a group known as The Guardians, a First Nations-led initiative across the country tasked with defending the long-term health of the land. Together, they steward not only these fragile ecosystems but also an ancient way of seeing themselves in relationship with the land itself.

- Yeah.
- Cheers.

Man: This is the first time that we talked about working as a superintendent here in the National Park and also working with Innu Nation. We've been asked to do up a guideline...

- Mm.
- ...about how to protect it.

And I say to the government that we've been doing this for thousands of years. If we didn't manage the way we managed, there wouldn't be any animals. There wouldn't be any resources at all. And now you're expecting us to write it and put it on paper and have it stamped and say this is how it is?

- It's a way of thinking.
- Exactly. It's a mindset.
Man: In the Western world, you get people that want to overcut, want to overkill, want to kill every fish in the water, every caribou that walks on the Earth. But the Innu, they don't think like that. There's no clear-cutting of the whole area just for profit or to sell to somebody else. They just think, OK, I just need this for this long, or I need this to feed my family and my mother-in-law. So, I'll take this many salmon out of the river, and then I'm done. I think we need to-- we need to kind of go back to that relationship and make a conscious effort to do it. I think if more people were connected to the land, we'd be a much better world.

Mm.

♪

Courtois, voice-over: Our languages come from the land. Our practices, our laws, everything comes from the Earth. We can learn those things again.

I don't know about you, but I want to be here for a little while. Yeah. And I want my children to be here for a little while. And I want my grandson to be here for a little while.
We have a role to play, and we should be helping decide and taking care of this place.

♪

Wallach, voice-over: It's an unforgettable experience to spend time with people who simply refuse to give up, building on ancient wisdom to look beyond our modern moment to a future worth fighting for. These are the stories we need right now. It's so easy to see what's wrong and even easier to lose hope altogether. But the creativity comes in finding new ways to do something about it. I'm in Rotterdam to meet Boyan Slat, who's doing this very thing, inventing a technology designed to give our oceans a second chance.

Tell me how you got started.

I was 16 years old. I went scuba diving in Greece. And I was hoping to see all these beautiful things. Then I looked around me, and I just saw a garbage dump. I just saw more plastic bags than fish. And I was so dismayed and shocked by that that I asked myself a simple question. Why can't we just clean this up?
♪

The Great Pacific Garbage Patch is the largest accumulation of trash in the world's oceans. It's an area halfway between Hawaii and California. It spans twice the size of Texas, and it contains about 250 million pounds of trash. Plastic is one of the largest threats our oceans face today. There are now 700 species known to be directly impacted by plastic pollution. A few hundred of those are actually threatened with extinction. The most uncertain factor, but perhaps even the most impactful factor, is the health impact to us humans. Plastic breaks down into smaller and smaller pieces. They transport toxic chemicals into the food chain, and that's a food chain that includes more than 3 billion people that rely on fish as their key source of protein.

♪

Wallach, voice-over: Boyan and his team are working towards a goal to clean up 90% of floating plastic pollution. And here's the most powerful part. It took several attempts before they created something that even had a chance to achieve that.
439 00:19:27,500 --> 00:19:29,166 It was touch and go. 440 00:19:29,166 --> 00:19:30,633 People said the system he was inventing 441 00:19:30,633 --> 00:19:31,900 would never work. 442 00:19:31,900 --> 00:19:33,133 But then it did. 443 00:19:33,133 --> 00:19:35,633 ♪ 444 00:19:35,633 --> 00:19:37,300 Slat, voice-over: So, the system itself 445 00:19:37,300 --> 00:19:39,666 is a long, U-shaped floating barrier 446 00:19:39,666 --> 00:19:41,733 that we drag forth very slowly 447 00:19:41,733 --> 00:19:44,466 just to make sure that the fish can escape in time. 448 00:19:44,466 --> 00:19:47,333 It acts like a funnel. 449 00:19:44,466 --> 00:19:47,333 Plastic goes towards the center 450 00:19:47,333 --> 00:19:49,300 where we have what we call the retention zone, 451 00:19:49,300 --> 00:19:51,566 which is a collection bag. 452 00:19:51,566 --> 00:19:55,300 Every few days when it's full, we take the bag onto a ship. 453 00:19:55,300 --> 00:19:57,500 We empty it, sort the waste, 454 00:19:57,500 --> 00:20:00,633 and then ultimately, we bring it back to land for recycling. 455 00:20:00,633 --> 00:20:03,133 ♪ 456 00:20:03,133 --> 00:20:07,633 Actually, the oldest object we ever collected. 457 00:20:07,633 --> 00:20:10,500 And you can see how it's been degraded. 458 00:20:10,500 --> 00:20:13,000 Yeah. So, these flakes are coming off. 459 00:20:13,000 --> 00:20:14,466 The thing is because of UV light, 460 00:20:14,466 --> 00:20:15,833 because of the sun, 461 00:20:15,833 --> 00:20:17,666 the plastic becomes more brittle. 462 00:20:17,666 --> 00:20:21,166 So, then, layer by layer, like an onion, it kind of peels. 463 00:20:21,166 --> 00:20:23,666 So, this eventually can end up in the fish that we eat. 464 00:20:23,666 --> 00:20:25,133 Yeah, and this can turn into 465 00:20:25,133 --> 00:20:27,000 millions of pieces of microplastics. 466 00:20:27,000 --> 00:20:30,066 Yeah.
So, in here, what you see is the, 467 00:20:30,066 --> 00:20:33,166 essentially, the recycling process in steps. 468 00:20:33,166 --> 00:20:35,500 So, actually, half of what we get out of 469 00:20:35,500 --> 00:20:38,166 the Great Pacific Garbage Patch is fishing nets. 470 00:20:38,166 --> 00:20:39,666 - Mm-hmm. - So, it looks just like this. 471 00:20:39,666 --> 00:20:41,166 Yeah, yeah, yeah. 472 00:20:41,166 --> 00:20:43,133 I would say probably the most harmful type 473 00:20:43,133 --> 00:20:46,266 because this, of course, ensnares a lot of wildlife. 474 00:20:46,266 --> 00:20:48,566 So, what we then do is we wash it, 475 00:20:48,566 --> 00:20:50,666 and we shred it to get to this kind of pulp. 476 00:20:50,666 --> 00:20:52,233 Yep. 477 00:20:52,233 --> 00:20:55,333 And then ultimately, we injection-mold it, 478 00:20:55,333 --> 00:20:58,333 we compound it, so, we add some additives 479 00:20:58,333 --> 00:21:02,233 to make sure that the material is safe and high quality. 480 00:21:02,233 --> 00:21:05,133 And then it becomes this. 481 00:21:05,133 --> 00:21:07,466 So, these are what you call pellets. 482 00:21:07,466 --> 00:21:11,366 And these are the building blocks for any new object. 483 00:21:11,366 --> 00:21:14,533 So, you can just mold this into something new, 484 00:21:14,533 --> 00:21:18,200 and the idea is that we are producing durable, 485 00:21:18,200 --> 00:21:19,900 sustainable products out of this, 486 00:21:19,900 --> 00:21:23,400 and with that, help fund the cleanup. 487 00:21:23,400 --> 00:21:25,066 Actually, as a proof of concept, 488 00:21:25,066 --> 00:21:26,366 we made these sort of 489 00:21:26,366 --> 00:21:28,066 high-end designer sunglasses. 490 00:21:28,066 --> 00:21:31,533 Wow. So, this is 100% made from 491 00:21:31,533 --> 00:21:34,633 the plastic we took out of the Great Pacific Garbage Patch. 492 00:21:37,466 --> 00:21:39,800 They look good on you. These are great.
493 00:21:39,800 --> 00:21:44,733 So, if the Garbage Patch is cleaned up in 10 to 15 years, 494 00:21:44,733 --> 00:21:47,733 what's to prevent another one 20 years from now forming? 495 00:21:47,733 --> 00:21:51,066 Realistically, the amount of plastic that's being produced 496 00:21:51,066 --> 00:21:52,566 is not going down. 497 00:21:52,566 --> 00:21:56,400 In fact, the projections are that by 2060, 498 00:21:56,400 --> 00:21:59,233 the amount of plastic produced will increase threefold. 499 00:21:59,233 --> 00:22:01,300 So, really what we need to do is 500 00:22:01,300 --> 00:22:05,133 we need to decouple the plastic usage 501 00:22:05,133 --> 00:22:07,933 from the plastic flows into the ocean. 502 00:22:07,933 --> 00:22:10,133 ♪ 503 00:22:10,133 --> 00:22:13,600 We have interceptors now in 11 rivers, 504 00:22:13,600 --> 00:22:15,633 some of the most polluting rivers in the world. 505 00:22:15,633 --> 00:22:20,466 And we believe we can really stop most of 506 00:22:20,466 --> 00:22:24,300 the world's plastic emissions from leaking into the ocean. 507 00:22:24,300 --> 00:22:27,133 Wallach: What do you think when people say 508 00:22:27,133 --> 00:22:28,933 what you're doing is impossible? 509 00:22:28,933 --> 00:22:31,966 I think when somebody says something is impossible, 510 00:22:31,966 --> 00:22:36,766 I think the sheer absoluteness of that statement 511 00:22:36,766 --> 00:22:40,800 should make you suspicious of it. 512 00:22:40,800 --> 00:22:43,300 If you look at history, 513 00:22:43,300 --> 00:22:45,800 everything that we now take for granted 514 00:22:45,800 --> 00:22:47,966 used to be impossible at some point in time. 
515 00:22:47,966 --> 00:22:50,800 So, if you're an entrepreneur, 516 00:22:50,800 --> 00:22:53,633 if you're trying to make something, 517 00:22:53,633 --> 00:22:55,633 if you're trying to create something, yes, 518 00:22:55,633 --> 00:22:57,966 I think it's very important to listen 519 00:22:57,966 --> 00:23:00,366 and to listen to people's advice. 520 00:23:00,366 --> 00:23:02,500 But if there's one bit of advice 521 00:23:02,500 --> 00:23:05,366 that you should really ignore, 522 00:23:05,366 --> 00:23:08,000 it's people who say that something can't be done. 523 00:23:08,000 --> 00:23:11,033 ♪ 524 00:23:11,033 --> 00:23:15,200 Wallach, voice-over: The challenges facing Boyan and us all are daunting. 525 00:23:15,200 --> 00:23:17,366 Intertidal times are full of danger, 526 00:23:17,366 --> 00:23:20,366 but it's also where all the creative juice is. 527 00:23:20,366 --> 00:23:22,533 And for those of us who want to push the envelope 528 00:23:22,533 --> 00:23:24,500 on who we are and what's possible 529 00:23:24,500 --> 00:23:27,266 on this planet looking forward, 530 00:23:27,266 --> 00:23:28,766 this is our moment. 531 00:23:30,533 --> 00:23:32,633 [Explosions] 532 00:23:35,933 --> 00:23:39,500 Man, voice-over: Man, one of the most fragile of Earth's creatures, 533 00:23:39,500 --> 00:23:41,866 the builder of civilization, 534 00:23:41,866 --> 00:23:45,600 entrusted by nature with the unique but dangerous ability 535 00:23:45,600 --> 00:23:49,200 to alter the very conditions that gave him rise. 536 00:23:49,200 --> 00:23:51,033 Wallach, voice-over: We've been in moments like this before. 537 00:23:51,033 --> 00:23:53,533 The big one is moving from 538 00:23:53,533 --> 00:23:55,200 hunter-gatherer to agricultural. 539 00:23:55,200 --> 00:23:57,600 That was maybe 10,000, 12,000 years ago. 540 00:23:58,866 --> 00:24:01,766 We went from being in small clans and tribes 541 00:24:01,766 --> 00:24:04,200 to now actually starting to urbanize. 
542 00:24:05,733 --> 00:24:09,266 We also have things like the Gutenberg press. 543 00:24:09,266 --> 00:24:11,100 So, we went from an era 544 00:24:11,100 --> 00:24:14,433 where knowledge could only be held by a few people 545 00:24:14,433 --> 00:24:18,000 to now being able to mass-create knowledge 546 00:24:18,000 --> 00:24:22,166 in a way that would actually bring it out to people. 547 00:24:22,166 --> 00:24:24,433 Now, from that, you got the Reformation 548 00:24:24,433 --> 00:24:26,266 and all sorts of upheaval around Europe 549 00:24:26,266 --> 00:24:28,266 and around the world. 550 00:24:28,266 --> 00:24:30,733 Another one is kind of moving from this idea 551 00:24:30,733 --> 00:24:33,266 that the Earth is the center of everything, 552 00:24:33,266 --> 00:24:37,933 to moving towards heliocentric models of how the world works. 553 00:24:37,933 --> 00:24:40,600 Moving from us being the center of the universe 554 00:24:40,600 --> 00:24:44,433 to just being one kind of node in a multi-noded galaxy 555 00:24:44,433 --> 00:24:46,766 and universe was highly disruptive. 556 00:24:46,766 --> 00:24:50,666 The Industrial Revolution, moving from this idea 557 00:24:50,666 --> 00:24:52,933 that the power that we had in the world 558 00:24:52,933 --> 00:24:55,766 was just what we could do with our own hands and backs 559 00:24:55,766 --> 00:24:59,400 and change where we lived, what we ate, how we travel. 560 00:24:59,400 --> 00:25:02,133 It changed how we fought wars. 561 00:25:02,133 --> 00:25:06,133 And it also fundamentally changed the way we told stories 562 00:25:06,133 --> 00:25:09,300 about who we are and where we're going. 563 00:25:09,300 --> 00:25:11,166 When we go through these moments 564 00:25:11,166 --> 00:25:12,833 of flux and creativity, 565 00:25:12,833 --> 00:25:15,800 all sorts of new things start to arise. 566 00:25:15,800 --> 00:25:25,633 ♪ 567 00:25:25,633 --> 00:25:28,166 That's the potential before us right now. 
568 00:25:28,166 --> 00:25:30,833 These threats and challenges hold opportunities 569 00:25:30,833 --> 00:25:33,066 to remake the world. 570 00:25:33,066 --> 00:25:36,966 As the old ways break down and fall apart around us, 571 00:25:36,966 --> 00:25:40,133 what does it look like 572 00:25:36,966 --> 00:25:40,133 to reimagine what comes next? 573 00:25:41,300 --> 00:25:44,333 In Albany, Eben Bayer and his team at Ecovative 574 00:25:44,333 --> 00:25:46,800 are working to answer that question... 575 00:25:46,800 --> 00:25:47,833 using mushrooms. 576 00:25:47,833 --> 00:25:59,833 ♪ 577 00:25:59,833 --> 00:26:01,633 So, we're going to head right over here. 578 00:26:02,533 --> 00:26:04,233 All right. So, as I understand, 579 00:26:04,233 --> 00:26:06,900 this is kind of just a beautiful piece 580 00:26:06,900 --> 00:26:08,900 of mycelium, basically, in a sense. 581 00:26:08,900 --> 00:26:10,233 - Yeah, yeah. - Like foam-like substance. 582 00:26:10,233 --> 00:26:11,400 What is this? 583 00:26:11,400 --> 00:26:12,566 This is a leather-like material. 584 00:26:12,566 --> 00:26:14,033 We call it a forager hide. 585 00:26:14,033 --> 00:26:15,400 And it's made from this foam. 586 00:26:15,400 --> 00:26:17,233 We just squish it and tan it and color it 587 00:26:17,233 --> 00:26:18,566 and you get something that behaves, looks, 588 00:26:18,566 --> 00:26:20,233 and feels a lot like leather. 589 00:26:20,233 --> 00:26:21,800 So, how long will this last? 590 00:26:21,800 --> 00:26:23,466 So, this particular leather-like hide 591 00:26:23,466 --> 00:26:25,033 will last for a year or two 592 00:26:25,033 --> 00:26:26,533 in this application as tanned. 593 00:26:26,533 --> 00:26:28,033 You could make it last longer by putting 594 00:26:28,033 --> 00:26:29,466 other tannery chemistries in, 595 00:26:29,466 --> 00:26:30,633 like we use for conventional leather.
596 00:26:30,633 --> 00:26:32,300 So, this is where we actually 597 00:26:32,300 --> 00:26:33,733 started with something called mushroom packaging. 598 00:26:33,733 --> 00:26:35,400 These are just little corner blocks 599 00:26:35,400 --> 00:26:36,900 that would go on a box you might get in the mail, 600 00:26:36,900 --> 00:26:38,866 and, you know, they've got little breakaways on them. 601 00:26:38,866 --> 00:26:40,966 And then when it gets to you, unlike Styrofoam, 602 00:26:40,966 --> 00:26:43,033 big difference is you can just start to break this up. 603 00:26:43,033 --> 00:26:45,666 If you have a compost at home, you can put it in your compost, 604 00:26:45,666 --> 00:26:47,366 or you could put it in your yard waste bin. 605 00:26:47,366 --> 00:26:48,866 And within 30 days, 606 00:26:48,866 --> 00:26:51,366 this will turn into a nutrient, not a pollutant, 607 00:26:51,366 --> 00:26:53,566 in whatever your local ecosystem is. 608 00:26:53,566 --> 00:26:55,066 What are the biggest problems 609 00:26:55,066 --> 00:26:56,566 that you're trying to tackle here? 610 00:26:56,566 --> 00:26:58,200 The problems that we focus on at Ecovative 611 00:26:58,200 --> 00:27:00,766 are around plastic pollution, so, this idea that we just 612 00:27:00,766 --> 00:27:02,466 created this incredible miracle material 613 00:27:02,466 --> 00:27:03,933 that can't degrade, and it's, like, 614 00:27:03,933 --> 00:27:05,366 clogging up the lungs 615 00:27:05,366 --> 00:27:06,933 and everything of our Earth's ecosystem. 616 00:27:06,933 --> 00:27:08,766 And the other is around animal agriculture, 617 00:27:08,766 --> 00:27:12,300 so, the mass production of animals for food or materials. 618 00:27:12,300 --> 00:27:13,600 I grew up farming in Central Vermont. 619 00:27:13,600 --> 00:27:15,133 It's fine to raise animals. 620 00:27:15,133 --> 00:27:18,033 Doing it industrially is not ecologically responsible 621 00:27:18,033 --> 00:27:19,433 or ethically responsible. 
622 00:27:19,433 --> 00:27:21,600 ...here. Those units over there... 623 00:27:21,600 --> 00:27:23,933 Wallach, voice-over: Eben sees times of 624 00:27:21,600 --> 00:27:23,933 disorder and chaos like these 625 00:27:23,933 --> 00:27:26,600 as full of opportunity for transformation. 626 00:27:26,600 --> 00:27:28,766 Rather than simply seeing problems, 627 00:27:28,766 --> 00:27:31,466 he sees openings full of potential 628 00:27:31,466 --> 00:27:34,433 to invent entirely new ways of doing things 629 00:27:34,433 --> 00:27:36,266 that most of us take for granted. 630 00:27:36,266 --> 00:27:37,633 Welcome to the magic store. 631 00:27:37,633 --> 00:27:41,600 ♪ 632 00:27:41,600 --> 00:27:43,966 So, what you're seeing here are the aspects 633 00:27:43,966 --> 00:27:45,800 of a conventional vertical mushroom farm. 634 00:27:45,800 --> 00:27:47,266 You've got your shelving system here. 635 00:27:47,266 --> 00:27:48,966 You've got your environmental controls. 636 00:27:48,966 --> 00:27:51,800 And now we've modified it to use the soil we use 637 00:27:51,800 --> 00:27:53,933 to grow our special forms of mycelium. 638 00:27:53,933 --> 00:27:55,600 And rather than getting a bunch of mushrooms 639 00:27:55,600 --> 00:27:57,100 growing out of that bed, you're actually getting 640 00:27:57,100 --> 00:27:59,133 a slab of mycelium tissue. That's really the power. 641 00:27:59,133 --> 00:28:01,100 - There's no wasted space. - There's no wasted space. 642 00:28:01,100 --> 00:28:02,533 - Because it's sheer mycelium all the way across. 643 00:28:02,533 --> 00:28:03,866 - Yep. - Like in a slab. 644 00:28:03,866 --> 00:28:05,100 Well, one of these rooms could produce 645 00:28:05,100 --> 00:28:06,666 20,000 pounds of mycelium. 646 00:28:08,033 --> 00:28:09,533 So, a harvest machine will pull up. 647 00:28:09,533 --> 00:28:11,500 And it comes out like a conveyor.
648 00:28:11,500 --> 00:28:13,500 It comes across and squishes into basically 649 00:28:13,500 --> 00:28:15,600 like a pork belly, or a mush belly, we call it. 650 00:28:15,600 --> 00:28:17,033 And then those come off 651 00:28:17,033 --> 00:28:18,700 and those go to the bacon facility. 652 00:28:18,700 --> 00:28:19,666 - Wow. 653 00:28:19,666 --> 00:28:25,500 ♪ 654 00:28:25,500 --> 00:28:27,500 Wallach: So, now, this came out in a block. 655 00:28:27,500 --> 00:28:28,866 Bayer: Yeah. 656 00:28:28,866 --> 00:28:30,033 Wallach: And then you started slicing it? 657 00:28:30,033 --> 00:28:31,433 It rides along just like 658 00:28:31,433 --> 00:28:32,700 a piece of pork belly would be. 659 00:28:32,700 --> 00:28:34,033 So, run it through the slicer, 660 00:28:34,033 --> 00:28:35,933 you get your slices of bacon, 661 00:28:35,933 --> 00:28:39,366 you add salt and sugar and some natural flavorings. 662 00:28:39,366 --> 00:28:40,833 And then at the very end, 663 00:28:40,833 --> 00:28:42,533 we put the coconut oil on, which is the fat, 664 00:28:42,533 --> 00:28:44,600 because mushrooms don't have any fat in it. 665 00:28:44,600 --> 00:28:46,166 And this is minimally processed. 666 00:28:46,166 --> 00:28:47,500 Sliced and smoked, basically. 667 00:28:47,500 --> 00:28:49,833 Compare and contrast the inputs 668 00:28:49,833 --> 00:28:54,200 that would go into, you know, 5 pounds of this 669 00:28:54,200 --> 00:28:58,033 versus 5 pounds of bacon from a pig. 670 00:28:58,033 --> 00:29:00,166 So, to produce a million pounds of our product 671 00:29:00,166 --> 00:29:02,233 takes about an acre of land 672 00:29:00,166 --> 00:29:02,233 in one of our vertical farms, 673 00:29:02,233 --> 00:29:04,766 and it occurs over about 10 days. 674 00:29:04,766 --> 00:29:07,600 To produce the same amount of bacon using a pig, 675 00:29:07,600 --> 00:29:09,433 you'd need about a million acres of land. 676 00:29:09,433 --> 00:29:10,600 - Hmm.
677 00:29:10,600 --> 00:29:12,066 - And you would also need 678 00:29:12,066 --> 00:29:13,733 to feed that pig high-quality food, 679 00:29:13,733 --> 00:29:15,766 so, like, grains versus woodchips. 680 00:29:15,766 --> 00:29:17,433 And then you have to grow them 681 00:29:17,433 --> 00:29:19,100 for a period of 6 to 9 months. 682 00:29:19,100 --> 00:29:20,900 And so, in each of those dimensions, land use, 683 00:29:20,900 --> 00:29:23,066 the input material, and the time frame, 684 00:29:23,066 --> 00:29:24,766 we're massively, like by an order of magnitude, 685 00:29:24,766 --> 00:29:26,900 improving the equation. - Mm-hmm. 686 00:29:26,900 --> 00:29:32,100 ♪ 687 00:29:32,100 --> 00:29:34,333 Tastes like bacon. 688 00:29:34,333 --> 00:29:35,900 Now, tell me, where does this go, 689 00:29:35,900 --> 00:29:38,766 because so far, we've heard about packaging, right? 690 00:29:38,766 --> 00:29:41,733 Obviously, there's food, and you've mentioned leather. 691 00:29:41,733 --> 00:29:43,500 - Yeah. - Where do you take this? 692 00:29:43,500 --> 00:29:45,433 My dream is we grow everything. 693 00:29:45,433 --> 00:29:47,166 You know, I think we can grow almost everything around us 694 00:29:47,166 --> 00:29:48,600 from the buildings to the medicines we need 695 00:29:48,600 --> 00:29:50,433 to the food we eat. 696 00:29:50,433 --> 00:29:52,100 And we'll do that through structural materials, 697 00:29:52,100 --> 00:29:54,066 nutritional materials, 698 00:29:54,066 --> 00:29:56,766 and even things that might be alive when you use them. 699 00:29:56,766 --> 00:29:58,066 Such as? 700 00:29:58,066 --> 00:29:59,666 Well, you could imagine a building that 701 00:29:59,666 --> 00:30:01,400 senses the environmental conditions within it 702 00:30:01,400 --> 00:30:03,766 and maybe even releases beneficial compounds to, 703 00:30:03,766 --> 00:30:05,600 like, clean the air. 
704 00:30:05,600 --> 00:30:07,766 You can imagine buildings in an earthquake develop cracks 705 00:30:07,766 --> 00:30:09,266 and in those cracks, there are, like, 706 00:30:09,266 --> 00:30:12,766 embedded little water balls that break open. 707 00:30:12,766 --> 00:30:15,100 And the fungus is not dead but is dehydrated, 708 00:30:15,100 --> 00:30:16,733 which it can do, and it'll start growing 709 00:30:16,733 --> 00:30:19,933 and seal up all those cracks. 710 00:30:19,933 --> 00:30:23,100 Mushrooms are uniquely situated to save the world. 711 00:30:23,100 --> 00:30:26,900 Wallach, voice-over: What Eben and his team are doing here is inspiring 712 00:30:26,900 --> 00:30:29,100 and it gives me a renewed sense of possibility 713 00:30:29,100 --> 00:30:31,266 for the kind of futures we can choose 714 00:30:31,266 --> 00:30:32,400 to create in this moment. 715 00:30:32,400 --> 00:30:36,566 ♪ 716 00:30:36,566 --> 00:30:40,600 Here in New York, an architect named Bjarke Ingels 717 00:30:40,600 --> 00:30:42,733 is working with a similar perspective 718 00:30:42,733 --> 00:30:45,200 to reimagine the cities in which we live. 719 00:30:46,733 --> 00:30:49,733 Ingels, voice-over: I think 720 00:30:46,733 --> 00:30:49,733 maybe the best way to explain what's so special 721 00:30:49,733 --> 00:30:52,766 about architecture and the power of design 722 00:30:52,766 --> 00:30:57,900 is that the Danish word for design is formgivning, 723 00:30:57,900 --> 00:31:01,333 which literally means form-giving. 724 00:31:01,333 --> 00:31:05,066 When you design something, you're giving form 725 00:31:05,066 --> 00:31:08,666 to that which has not yet been given form. 726 00:31:08,666 --> 00:31:11,133 In other words, you are giving form to the future.
727 00:31:11,133 --> 00:31:14,000 So, when you're designing a place or a building, 728 00:31:14,000 --> 00:31:18,000 you are giving form 729 00:31:14,000 --> 00:31:18,000 to a little part of the world 730 00:31:18,000 --> 00:31:19,500 that you would like to find yourself 731 00:31:19,500 --> 00:31:21,233 living in in the future. 732 00:31:21,233 --> 00:31:23,633 ♪ 733 00:31:23,633 --> 00:31:25,000 How do you think about the moment of time 734 00:31:25,000 --> 00:31:26,666 that we're in right now? 735 00:31:26,666 --> 00:31:28,300 It's kind of chaos and flux. 736 00:31:28,300 --> 00:31:31,833 And a lot of people will want to look backwards, 737 00:31:31,833 --> 00:31:33,666 where others will want to stick their head in the sand 738 00:31:33,666 --> 00:31:35,000 to keep what they have. 739 00:31:35,000 --> 00:31:38,633 How do you think about this moment? 740 00:31:38,633 --> 00:31:42,800 Ingels, voice-over: We're living in a time where a lot of technologies 741 00:31:42,800 --> 00:31:45,900 are bringing possibilities to the table 742 00:31:45,900 --> 00:31:48,333 that we have never been even close to before. 743 00:31:48,333 --> 00:31:51,833 And I sense that this innovation that has been 744 00:31:51,833 --> 00:31:56,166 maybe locked in virtual in the last few decades 745 00:31:56,166 --> 00:31:59,166 has finally arrived in physical space. 746 00:31:59,166 --> 00:32:01,200 And maybe give you one example. 747 00:32:01,200 --> 00:32:03,066 The building we designed in Copenhagen called CopenHill 748 00:32:03,066 --> 00:32:06,300 is the cleanest waste-to-energy power plant 749 00:32:06,300 --> 00:32:07,800 in the world. 750 00:32:07,800 --> 00:32:09,400 The steam that comes out of the chimney 751 00:32:09,400 --> 00:32:12,400 is actually cleaner than the air of Copenhagen. 752 00:32:12,400 --> 00:32:15,066 Suddenly, the power plant no longer had to be 753 00:32:15,066 --> 00:32:18,066 some ugly, dirty, polluting eyesore.
754 00:32:18,066 --> 00:32:20,233 It could actually be a welcoming, 755 00:32:20,233 --> 00:32:21,966 inclusive environment. 756 00:32:21,966 --> 00:32:23,900 We could make the facade into 757 00:32:23,900 --> 00:32:25,566 the tallest man-made climbing wall in the world 758 00:32:25,566 --> 00:32:26,733 and we could turn the roof 759 00:32:26,733 --> 00:32:28,400 into an alpine ski slope. 760 00:32:28,400 --> 00:32:30,566 So, it's an idea we call hedonistic sustainability, 761 00:32:30,566 --> 00:32:33,733 that the sustainable building or the sustainable city 762 00:32:33,733 --> 00:32:35,966 is not only better for the environment, 763 00:32:35,966 --> 00:32:38,233 it's also much more enjoyable 764 00:32:38,233 --> 00:32:41,400 for the people that get to inhabit it. 765 00:32:41,400 --> 00:32:44,200 Wallach: A lot of times, when people hear the term sustainability 766 00:32:44,200 --> 00:32:46,066 or even regenerative, they think, 767 00:32:46,066 --> 00:32:47,900 "Something's going to be taken away from me. 768 00:32:47,900 --> 00:32:49,200 "It's not going to be as fun. 769 00:32:49,200 --> 00:32:50,700 "I'm not going to be as happy. 770 00:32:50,700 --> 00:32:52,233 I'm going to lose all these things." 771 00:32:52,233 --> 00:32:53,400 That's not the way you think about it. 772 00:32:53,400 --> 00:32:54,733 23 years ago, we opened 773 00:32:54,733 --> 00:32:56,233 the Copenhagen Harbour Baths, 774 00:32:56,233 --> 00:32:58,233 a simple floating structure that 775 00:32:58,233 --> 00:33:01,200 extends the life of the city into the water around it. 
776 00:33:01,200 --> 00:33:03,533 On opening day, it became so clear 777 00:33:03,533 --> 00:33:07,133 that the clean port is not only nice for the fish, 778 00:33:07,133 --> 00:33:10,033 it's actually amazing for the people that live in that city, 779 00:33:10,033 --> 00:33:12,866 and this idea that the sustainable city 780 00:33:12,866 --> 00:33:15,133 is not only better for the environment, 781 00:33:15,133 --> 00:33:17,933 it's much more enjoyable for the people that live in it. 782 00:33:17,933 --> 00:33:20,933 Like half of the Copenhageners commute by bicycle, 783 00:33:20,933 --> 00:33:23,633 not because it's environmentally friendly 784 00:33:23,633 --> 00:33:26,133 but because it's the most enjoyable and effortless way 785 00:33:26,133 --> 00:33:27,633 to move around the city quickly. 786 00:33:27,633 --> 00:33:29,100 So, in that sense, we just 787 00:33:29,100 --> 00:33:30,966 keep reminding ourselves that 788 00:33:30,966 --> 00:33:32,133 there is a better and more 789 00:33:32,133 --> 00:33:33,300 enjoyable way of doing it. 790 00:33:33,300 --> 00:33:35,300 And I think the benefits 791 00:33:35,300 --> 00:33:38,966 of a sort of environmentally friendly city is that it is 792 00:33:38,966 --> 00:33:41,933 greener and cleaner and more enjoyable. 793 00:33:43,533 --> 00:33:46,466 Wallach: Tell me 794 00:33:43,533 --> 00:33:46,466 about the cities of the future. 795 00:33:46,466 --> 00:33:49,100 Where are we going? 796 00:33:49,100 --> 00:33:51,966 Where do we need to go? What does it look like? 797 00:33:51,966 --> 00:33:55,966 Ingels: If you would return to Manhattan in 10 or 20 798 00:33:55,966 --> 00:34:00,500 or maybe 50 years, you might see streets 799 00:34:00,500 --> 00:34:03,366 that entirely become almost like linear parks 800 00:34:03,366 --> 00:34:07,666 woven together in both directions of Manhattan. 801 00:34:07,666 --> 00:34:09,533 People can walk and play 802 00:34:09,533 --> 00:34:13,200 where you used to have traffic and parked cars.
803 00:34:13,200 --> 00:34:16,500 You might have different kinds of personal mobility 804 00:34:16,500 --> 00:34:19,533 also taking over whole areas. 805 00:34:19,533 --> 00:34:22,833 Our cities will really become greener and more enjoyable, 806 00:34:22,833 --> 00:34:26,533 which will make them more walkable and more bikeable. 807 00:34:26,533 --> 00:34:31,866 So, I think a lot of the dichotomy between the city 808 00:34:31,866 --> 00:34:33,700 and the countryside, 809 00:34:33,700 --> 00:34:37,666 you're going to get 810 00:34:33,700 --> 00:34:37,666 much more interesting blends. 811 00:34:37,666 --> 00:34:42,600 The city isn't the way it is because it has to be. 812 00:34:42,600 --> 00:34:44,266 The city is the way it is 813 00:34:44,266 --> 00:34:47,033 because that's how far we've gotten. 814 00:34:47,033 --> 00:34:50,366 And if we would like to ask more of our city 815 00:34:50,366 --> 00:34:53,866 or if we would like our city to accommodate 816 00:34:53,866 --> 00:34:56,666 another kind of life than what it used to, 817 00:34:56,666 --> 00:34:59,033 we actually not only have the possibility, 818 00:34:59,033 --> 00:35:01,333 we actually have 819 00:34:59,033 --> 00:35:01,333 a responsibility to make sure 820 00:35:01,333 --> 00:35:03,733 that our city fits 821 00:35:01,333 --> 00:35:03,733 with the way we want to live. 822 00:35:03,733 --> 00:35:10,100 ♪ 823 00:35:10,100 --> 00:35:15,433 Wallach, voice-over: In 1945, my mother was born in Oakland, California. 824 00:35:15,433 --> 00:35:18,900 And very early on, she realized that where 825 00:35:18,900 --> 00:35:21,566 she could kind of bring her gifts to bear in the world 826 00:35:21,566 --> 00:35:23,600 was through creativity. 827 00:35:23,600 --> 00:35:28,100 And so, she ended up becoming a professional artist. 828 00:35:28,100 --> 00:35:30,900 Growing up, my mom would often bring paintings 829 00:35:30,900 --> 00:35:34,233 that she was working on kind of back into circulation.
830 00:35:34,233 --> 00:35:35,666 So, most people think of an artist 831 00:35:35,666 --> 00:35:37,166 as someone who paints something. 832 00:35:37,166 --> 00:35:39,000 They're done. They put it up on the wall. 833 00:35:39,000 --> 00:35:40,600 My mom had paintings hanging in her garage 834 00:35:40,600 --> 00:35:42,600 or around the house that she had painted 835 00:35:42,600 --> 00:35:45,433 in the 1960s and '70s. 836 00:35:45,433 --> 00:35:48,066 And every once in a while, I'd see one of those on the easel, 837 00:35:48,066 --> 00:35:49,766 and she would kind of add to it. 838 00:35:49,766 --> 00:35:52,333 And I'd say, well, "I thought that painting was complete." 839 00:35:52,333 --> 00:35:55,100 She said, "A work of art is never necessarily complete. 840 00:35:55,100 --> 00:35:56,933 It's up to the artist." 841 00:35:56,933 --> 00:35:58,500 And so, what I learned is 842 00:35:58,500 --> 00:36:00,500 even when you're crafting something, 843 00:36:00,500 --> 00:36:02,300 that you can always come back to it. 844 00:36:02,300 --> 00:36:04,066 You can always make changes 845 00:36:04,066 --> 00:36:06,300 because you've learned new information. 846 00:36:06,300 --> 00:36:09,500 You can take from the past. You can augment it. 847 00:36:09,500 --> 00:36:12,133 It really means that things are fungible and changeable 848 00:36:12,133 --> 00:36:14,466 as long as you're trying to make them better. 849 00:36:14,466 --> 00:36:25,333 ♪ 850 00:36:25,333 --> 00:36:28,300 Sometimes, the way to make things better 851 00:36:28,300 --> 00:36:30,000 is to look at who needs help. 852 00:36:30,000 --> 00:36:31,900 Where is there a need? 853 00:36:31,900 --> 00:36:33,566 And how can we improve 854 00:36:33,566 --> 00:36:35,666 on the way things are currently being done? 855 00:36:35,666 --> 00:36:38,333 We have powerful new tools and technology 856 00:36:38,333 --> 00:36:40,133 available to us right now. 
857 00:36:40,133 --> 00:36:42,000 And rather than just using them to entertain 858 00:36:42,000 --> 00:36:43,966 or sell us more stuff, 859 00:36:43,966 --> 00:36:46,466 we can meet actual human needs, 860 00:36:46,466 --> 00:36:50,133 altering and improving the experience of being alive. 861 00:36:52,066 --> 00:36:54,400 Woman: My name is Veena Somareddy, 862 00:36:54,400 --> 00:36:57,900 and I'm the co-founder and CEO of Neuro Rehab VR. 863 00:36:57,900 --> 00:37:00,533 We create virtual reality therapy applications 864 00:37:00,533 --> 00:37:02,566 for physical therapy, occupational therapy, 865 00:37:02,566 --> 00:37:04,900 for patients who might have gone through a stroke 866 00:37:04,900 --> 00:37:08,466 or a spinal cord injury or Parkinson's or MS, 867 00:37:08,466 --> 00:37:10,966 and we help them get back their limb function 868 00:37:10,966 --> 00:37:12,533 as best as we can. 869 00:37:12,533 --> 00:37:13,900 Physical therapy hasn't really 870 00:37:13,900 --> 00:37:14,966 changed since the sixties. 871 00:37:14,966 --> 00:37:16,400 It's very manual. 872 00:37:16,400 --> 00:37:18,400 It's very tedious for the patient 873 00:37:18,400 --> 00:37:20,533 as well as the physical therapist. 874 00:37:20,533 --> 00:37:22,233 And sometimes, you need one or two therapists 875 00:37:22,233 --> 00:37:24,066 working on one patient. 876 00:37:24,066 --> 00:37:26,366 So, it's just not possible in a modern world 877 00:37:26,366 --> 00:37:29,133 when there is a shortage of clinicians 878 00:37:29,133 --> 00:37:31,233 and also access to care. 879 00:37:31,233 --> 00:37:35,300 What's the big opportunity for humanity 880 00:37:35,300 --> 00:37:40,133 in terms of these new kind of digital reality realms 881 00:37:40,133 --> 00:37:42,733 that are being built, and that we're kind of, 882 00:37:42,733 --> 00:37:45,033 in many ways, kind of living into?
883 00:37:45,033 --> 00:37:47,233 For me, I think on the health care side 884 00:37:47,233 --> 00:37:50,900 would be access to care, access to care for anybody, 885 00:37:50,900 --> 00:37:53,800 not--you know, in any socioeconomic status 886 00:37:53,800 --> 00:37:57,200 that they're in, which has been a huge problem in health care, 887 00:37:57,200 --> 00:38:00,133 being able to send our systems 888 00:38:00,133 --> 00:38:02,633 to somebody who doesn't have access to therapy 889 00:38:02,633 --> 00:38:05,266 so they can do it on their own in their own time 890 00:38:05,266 --> 00:38:07,866 and get back that function that they might have lost, 891 00:38:07,866 --> 00:38:09,600 and come back into society. 892 00:38:09,600 --> 00:38:11,433 ♪ 893 00:38:11,433 --> 00:38:13,266 Wallach, voice-over: Veena 894 00:38:11,433 --> 00:38:13,266 didn't just see what was broken 895 00:38:13,266 --> 00:38:15,133 in the field of physical therapy. 896 00:38:15,133 --> 00:38:18,133 She saw what was needed, what could be, 897 00:38:18,133 --> 00:38:20,266 and created something new. 898 00:38:20,266 --> 00:38:22,300 While we were at the clinic, 899 00:38:22,300 --> 00:38:25,300 she let me experience the work for myself. 900 00:38:25,300 --> 00:38:26,966 Somareddy: So, you'll have a green ball coming at you, 901 00:38:26,966 --> 00:38:29,433 and you have to dodge it. 902 00:38:29,433 --> 00:38:32,133 You can move to your left or you can move to your right. 903 00:38:32,133 --> 00:38:34,100 - Now I see a picnic table. - There you go. 904 00:38:34,100 --> 00:38:35,600 You'll see the ball coming at you. 905 00:38:35,600 --> 00:38:36,800 There you go. 906 00:38:36,800 --> 00:38:37,966 Do you feel like Neo from "Matrix"? 907 00:38:37,966 --> 00:38:39,600 - Yes. - There you go. 908 00:38:39,600 --> 00:38:41,300 I mean, obviously, this is helping with balance. 909 00:38:41,300 --> 00:38:43,133 But what other things is this helping with? 
910 00:38:43,133 --> 00:38:45,466 Weight shifting is a huge thing with stroke patients. 911 00:38:45,466 --> 00:38:47,300 - OK. - Because usually when they're affected, 912 00:38:47,300 --> 00:38:49,966 when they have a stroke, they're paralyzed on one side. 913 00:38:49,966 --> 00:38:51,300 And they're very afraid about 914 00:38:51,300 --> 00:38:53,300 putting their weight on that affected side. 915 00:38:53,300 --> 00:38:54,966 Is it that you can't do it 916 00:38:54,966 --> 00:38:56,300 or it's like the fear of 917 00:38:56,300 --> 00:38:57,933 doing it and what might happen? 918 00:38:57,933 --> 00:38:59,300 It is mostly the fear, 919 00:38:59,300 --> 00:39:01,033 especially with chronic patients. 920 00:39:01,033 --> 00:39:03,500 They're used to what they cannot do, they know, 921 00:39:03,500 --> 00:39:05,366 "This is what I cannot do, and these are my limitations." 922 00:39:05,366 --> 00:39:07,100 And they're stuck with it. - Yeah. 923 00:39:07,100 --> 00:39:08,666 But once you put them in an immersive environment 924 00:39:08,666 --> 00:39:11,333 where they don't see the bias of their diagnosis, 925 00:39:11,333 --> 00:39:13,333 that you don't see your body right now. 926 00:39:13,333 --> 00:39:16,100 All you're concentrated on is on dodging that cannonball. 927 00:39:16,100 --> 00:39:17,366 - I am. - Right? 928 00:39:17,366 --> 00:39:19,200 So, that takes them out of that fear 929 00:39:19,200 --> 00:39:21,700 of not being able to do something. 930 00:39:21,700 --> 00:39:23,200 Now, what should I do with the chicken? 931 00:39:23,200 --> 00:39:24,766 Should I eat it? You can eat it. 932 00:39:24,766 --> 00:39:26,533 Yep. Exactly. [Chuckles] 933 00:39:26,533 --> 00:39:28,366 I was working with a stroke patient. 934 00:39:28,366 --> 00:39:31,200 And her goal for therapy was being able to 935 00:39:31,200 --> 00:39:33,433 go grocery shopping with her grandkids again. 936 00:39:33,433 --> 00:39:35,200 And so, we were like, let's create that. 
937 00:39:35,200 --> 00:39:36,700 So, you can practice everything that you'll 938 00:39:36,700 --> 00:39:39,200 have to do in real life right here in VR, 939 00:39:39,200 --> 00:39:41,000 so, it feels like you've done this before, 940 00:39:41,000 --> 00:39:42,333 and you're not afraid. 941 00:39:44,266 --> 00:39:45,866 So, you've got eggs. 942 00:39:45,866 --> 00:39:48,366 So, what you're doing right here is pattern-matching, 943 00:39:48,366 --> 00:39:49,666 being able to pattern match 944 00:39:49,666 --> 00:39:51,500 from the item that's on the shopping list 945 00:39:51,500 --> 00:39:53,533 to the item that's on the shelf, 946 00:39:53,533 --> 00:39:56,533 which is something a lot of people can forget 947 00:39:56,533 --> 00:40:00,200 or lose that after a stroke incident. 948 00:40:00,200 --> 00:40:02,200 So, right here, we're able to simulate everything 949 00:40:02,200 --> 00:40:05,266 from the touch, the feel, the visual aspects, 950 00:40:05,266 --> 00:40:07,333 and the ambience, too, 951 00:40:07,333 --> 00:40:09,700 so that they can get used to all of that 952 00:40:09,700 --> 00:40:11,666 before they actually go into a grocery store. 953 00:40:12,833 --> 00:40:14,666 Wallach, voice-over: Obviously, the VR that 954 00:40:14,666 --> 00:40:16,433 you're mostly working on right now are people 955 00:40:16,433 --> 00:40:18,533 who have suffered either from a stroke 956 00:40:18,533 --> 00:40:20,500 or neurodegenerative diseases. Somareddy, voice-over: Right. 957 00:40:20,500 --> 00:40:21,866 Wallach: But I would imagine it also can start 958 00:40:21,866 --> 00:40:23,500 working for other traumas. 959 00:40:23,500 --> 00:40:25,366 Somareddy: So, this is something that works 960 00:40:25,366 --> 00:40:28,366 on also PTSD for veterans. 
961 00:40:28,366 --> 00:40:31,700 Exposure therapy has been shown to desensitize them 962 00:40:31,700 --> 00:40:34,700 for the fear that they might have experienced, 963 00:40:34,700 --> 00:40:37,200 the trauma that they might have experienced. 964 00:40:37,200 --> 00:40:40,266 Just the sounds of being in a battlefield 965 00:40:40,266 --> 00:40:43,533 can help them decrease that anxiety. 966 00:40:43,533 --> 00:40:45,833 A fear of spiders, a fear of heights, 967 00:40:45,833 --> 00:40:48,700 you can work on all of this in the virtual world. 968 00:40:48,700 --> 00:40:51,866 And you know that you're not going to get hurt. 969 00:40:51,866 --> 00:40:54,200 And then maybe go back to the real world 970 00:40:54,200 --> 00:40:55,866 and be able to experience that 971 00:40:55,866 --> 00:40:59,200 without the amount of fear that you might have had. 972 00:40:59,200 --> 00:41:02,066 Wallach: The overlap between digital and lived realities 973 00:41:02,066 --> 00:41:03,900 is growing every day. 974 00:41:03,900 --> 00:41:07,100 And as new AI tools continue to expand what's possible, 975 00:41:07,100 --> 00:41:09,833 Veena believes this work is only the beginning. 976 00:41:09,833 --> 00:41:15,266 ♪ 977 00:41:15,266 --> 00:41:17,500 [Upbeat pipe organ music] 978 00:41:32,600 --> 00:41:35,100 AI has been around since the early 20th century 979 00:41:35,100 --> 00:41:36,933 as a concept. 980 00:41:36,933 --> 00:41:40,400 Voice: 1, 2, 3. 981 00:41:40,400 --> 00:41:41,766 Woman: Hello, Kismet. 982 00:41:41,766 --> 00:41:44,433 - Peek-a-boo! - I love you, doll. 983 00:41:44,433 --> 00:41:46,733 In the nineties, we got computers 984 00:41:46,733 --> 00:41:49,600 that could process vector graphics 985 00:41:49,600 --> 00:41:52,733 for video games and that type of thing. 986 00:41:52,733 --> 00:41:55,266 And that enabled a revolution to happen 987 00:41:55,266 --> 00:41:57,433 in neural computation. 
988 00:41:57,433 --> 00:42:00,666 So, we could start to stack up layers of neurons, 989 00:42:00,666 --> 00:42:03,000 and that's what's called deep learning. 990 00:42:03,000 --> 00:42:06,400 And that's what's really advanced the field so much. 991 00:42:07,800 --> 00:42:10,266 We could create a world where AI is just driving us 992 00:42:10,266 --> 00:42:12,333 towards more consumption 993 00:42:12,333 --> 00:42:14,733 and more recommendations of products. 994 00:42:14,733 --> 00:42:17,633 Or we can create a world where AI is allowing us 995 00:42:17,633 --> 00:42:19,466 to express different things, 996 00:42:19,466 --> 00:42:21,500 to understand ourselves in different ways. 997 00:42:21,500 --> 00:42:23,566 Whichever of those outcomes is more likely to happen 998 00:42:23,566 --> 00:42:25,233 has a lot to do with who's making the AI 999 00:42:25,233 --> 00:42:26,633 and why they're making it. 1000 00:42:28,566 --> 00:42:31,300 Wallach, voice-over: The conversation around artificial intelligence 1001 00:42:31,300 --> 00:42:33,233 is thrilling and complex. 1002 00:42:33,233 --> 00:42:35,000 And at the current speed of innovation, 1003 00:42:35,000 --> 00:42:37,133 it's hard to keep up with how fast 1004 00:42:37,133 --> 00:42:41,000 these tools are developing and to what end they will be used. 1005 00:42:41,000 --> 00:42:43,500 Greg Cross and his team at Soul Machines 1006 00:42:43,500 --> 00:42:46,666 have been working on these technologies for years. 1007 00:42:46,666 --> 00:42:49,633 What is surprising you most about the field 1008 00:42:49,633 --> 00:42:51,633 and/or the state of AI today? 
1009 00:42:51,633 --> 00:42:53,666 We are living in a moment of time 1010 00:42:53,666 --> 00:42:57,633 where sort of AI has crossed the threshold from something 1011 00:42:57,633 --> 00:43:00,566 that the techies and the geeks talked about all the time 1012 00:43:00,566 --> 00:43:03,866 to it's now on the lips of, you know, 1013 00:43:03,866 --> 00:43:06,366 just about every human being on this planet. 1014 00:43:06,366 --> 00:43:08,800 Artificial intelligence--will it be the savior of humanity 1015 00:43:08,800 --> 00:43:10,800 or lead to our ultimate demise? 1016 00:43:10,800 --> 00:43:13,233 Woman, voice-over: Many people are going to ask, 1017 00:43:13,233 --> 00:43:16,966 "Why on Earth did you create this technology?" 1018 00:43:16,966 --> 00:43:19,233 Cross, voice-over: The speed at which things are moving now 1019 00:43:19,233 --> 00:43:21,066 is just, you know, astonishing. 1020 00:43:21,066 --> 00:43:22,400 You know, stuff that I used to think about, well, 1021 00:43:22,400 --> 00:43:23,966 that's 3 to 5 years away, 1022 00:43:23,966 --> 00:43:25,900 that's, like, now 12 months away now. 1023 00:43:25,900 --> 00:43:27,200 Wallach, voice-over: So, what are you and your team 1024 00:43:27,200 --> 00:43:28,700 working on right now? 1025 00:43:28,700 --> 00:43:32,566 So, Soul Machines sits at this intersection 1026 00:43:32,566 --> 00:43:35,300 of technology and entertainment. 1027 00:43:36,533 --> 00:43:40,366 We create avatars, you know, so, CGI characters. 1028 00:43:40,366 --> 00:43:44,300 And we bring them to life using some very, 1029 00:43:44,300 --> 00:43:47,733 very specific different fields of artificial intelligence. 1030 00:43:47,733 --> 00:43:49,866 So, our digital characters are alive. 1031 00:43:49,866 --> 00:43:52,066 - Mm. - They are digitally alive. 
1032 00:43:52,066 --> 00:43:53,633 A lot of people feel we are very much 1033 00:43:53,633 --> 00:43:55,566 at a crossroads moment for humanity, 1034 00:43:55,566 --> 00:43:57,233 for our species on this planet. 1035 00:43:57,233 --> 00:43:59,966 But you're generally very kind of optimistic. 1036 00:43:59,966 --> 00:44:02,466 But what's driving most of that hope right now? 1037 00:44:02,466 --> 00:44:04,366 One of the really, really cool things about 1038 00:44:04,366 --> 00:44:05,766 artificial intelligence is 1039 00:44:05,766 --> 00:44:08,200 you're creating a learning system, 1040 00:44:08,200 --> 00:44:12,633 so, the way in which we simulate human behavior, 1041 00:44:12,633 --> 00:44:15,433 the subtlety which we can simulate human behavior. 1042 00:44:15,433 --> 00:44:17,100 I mean, this is about making, yeah, 1043 00:44:17,100 --> 00:44:21,466 it sounds like a corny phrase, but making AI your friend. 1044 00:44:21,466 --> 00:44:23,433 Woman on screen: It's great to meet you. 1045 00:44:23,433 --> 00:44:25,800 Patrice, what do you think the future will be like? 1046 00:44:25,800 --> 00:44:28,133 Patrice: I'm very optimistic about the future of AI 1047 00:44:28,133 --> 00:44:29,766 and how it will shape our lives. 1048 00:44:29,766 --> 00:44:31,866 The possibilities are endless. 1049 00:44:31,866 --> 00:44:34,366 Can you tell me about yourself, Patrice? 1050 00:44:34,366 --> 00:44:36,300 Patrice: Absolutely. I consider myself 1051 00:44:36,300 --> 00:44:38,766 a vibrant and professional personality, 1052 00:44:38,766 --> 00:44:41,633 and I bring energy and enthusiasm to everything I do. 1053 00:44:41,633 --> 00:44:46,433 Patrice, how will AI change what it means to be human? 1054 00:44:46,433 --> 00:44:48,033 Patrice: On one hand, AI can help us 1055 00:44:48,033 --> 00:44:50,433 achieve things that were once impossible. 
1056 00:44:50,433 --> 00:44:52,633 At the same time, we need to remember 1057 00:44:52,633 --> 00:44:55,466 that AI is still a tool and cannot replace humans 1058 00:44:55,466 --> 00:44:57,466 when it comes to making decisions. 1059 00:44:57,466 --> 00:44:59,633 Tell me more about the work that goes into 1060 00:44:59,633 --> 00:45:01,866 creating something like this. 1061 00:45:01,866 --> 00:45:05,833 Yeah. So, you know, Patrice is what we call 1062 00:45:05,833 --> 00:45:07,333 a synthetic digital person. 1063 00:45:07,333 --> 00:45:08,833 So, she doesn't exist in real life. 1064 00:45:08,833 --> 00:45:10,700 She's not a clone of a real person. 1065 00:45:10,700 --> 00:45:13,100 So, she's entirely made up. 1066 00:45:13,100 --> 00:45:18,333 So, to create Patrice, we've built a creation suite 1067 00:45:18,333 --> 00:45:21,200 we call Digital DNA Studio. 1068 00:45:21,200 --> 00:45:24,366 Wallach: So, right now, there's a lot of fear and concern 1069 00:45:24,366 --> 00:45:28,533 about what AI could do to us, do to humans. 1070 00:45:28,533 --> 00:45:30,000 How do you see this? 1071 00:45:30,000 --> 00:45:32,866 Well, I mean, at the end of the day, 1072 00:45:32,866 --> 00:45:36,033 I think the debate is really, really the most positive thing 1073 00:45:36,033 --> 00:45:40,600 that can happen at the moment in terms of people 1074 00:45:40,600 --> 00:45:42,766 talking about what it means 1075 00:45:42,766 --> 00:45:44,766 for the businesses that they work in, 1076 00:45:44,766 --> 00:45:46,366 the industries that they compete in, 1077 00:45:46,366 --> 00:45:49,200 the communities they live in, the type of 1078 00:45:49,200 --> 00:45:52,366 regulatory environment they would like to see. 1079 00:45:52,366 --> 00:45:53,800 But the debate is absolutely critical. 1080 00:45:55,000 --> 00:45:56,833 Allado-McDowell, voice-over: You have to look at the holistic picture 1081 00:45:56,833 --> 00:45:58,933 of everything that's happening right now. 
1082 00:45:58,933 --> 00:46:00,833 AI is not happening in a vacuum. 1083 00:46:00,833 --> 00:46:05,433 It's a really profound technological shift. 1084 00:46:05,433 --> 00:46:07,733 But it's also happening alongside 1085 00:46:07,733 --> 00:46:11,066 mass extinction, climate change, 1086 00:46:11,066 --> 00:46:14,933 the greatest economic inequality that we've had. 1087 00:46:14,933 --> 00:46:17,733 We're becoming aware that our actions have global effects. 1088 00:46:17,733 --> 00:46:19,833 ♪ 1089 00:46:19,833 --> 00:46:22,900 Bremmer, voice-over: When we start talking about artificial intelligence, 1090 00:46:22,900 --> 00:46:27,233 that's the first thing in our history 1091 00:46:27,233 --> 00:46:31,266 that has the potential to either 1092 00:46:31,266 --> 00:46:36,266 change us as human beings into a future form 1093 00:46:36,266 --> 00:46:37,766 or extinguish us. 1094 00:46:37,766 --> 00:46:39,233 There's never been such a thing. 1095 00:46:39,233 --> 00:46:41,100 ♪ 1096 00:46:41,100 --> 00:46:42,900 Allado-McDowell, voice-over: 1097 00:46:41,100 --> 00:46:42,900 This is part of a profound shift 1098 00:46:42,900 --> 00:46:45,600 in how we see the role of the human. 1099 00:46:45,600 --> 00:46:49,933 And that's a little scary but also potentially very hopeful. 1100 00:46:49,933 --> 00:46:51,900 Because I think a lot of the reason 1101 00:46:51,900 --> 00:46:55,566 we're having these perceptions is because of problems. 1102 00:46:55,566 --> 00:46:57,266 It's because of things that 1103 00:46:57,266 --> 00:47:03,466 human-centricity created, shortsightedness. 1104 00:47:03,466 --> 00:47:06,633 If we think about AI in the long term, 1105 00:47:06,633 --> 00:47:08,166 it really does matter what we do now 1106 00:47:08,166 --> 00:47:10,000 because it will affect future generations 1107 00:47:10,000 --> 00:47:13,333 just like it does with everything else that we do. 1108 00:47:13,333 --> 00:47:15,966 We cannot be narcissistic as a species. 
1109 00:47:15,966 --> 00:47:19,166 ♪ 1110 00:47:19,166 --> 00:47:21,833 Wallach: So what should we be optimizing for? 1111 00:47:21,833 --> 00:47:24,733 As leaders, if you want to galvanize 1112 00:47:24,733 --> 00:47:27,666 and bring people together, you have to have a vision. 1113 00:47:27,666 --> 00:47:29,666 You want to co-create that vision. 1114 00:47:29,666 --> 00:47:32,000 But you need to-- you know, there's a telos. 1115 00:47:32,000 --> 00:47:33,566 There's an ultimate aim. 1116 00:47:33,566 --> 00:47:35,233 There's a goal if we want to move forward. 1117 00:47:35,233 --> 00:47:38,500 And I will put on the table that we've lost that. 1118 00:47:38,500 --> 00:47:40,833 So, take me there. 1119 00:47:40,833 --> 00:47:43,300 I think we are at an inflection point. 1120 00:47:43,300 --> 00:47:48,833 Like across society, we have this access to amazing technology. 1121 00:47:48,833 --> 00:47:50,566 We have people that are thinking differently 1122 00:47:50,566 --> 00:47:52,500 about what is their purpose? 1123 00:47:52,500 --> 00:47:55,900 What is the world going to be in 2040? 1124 00:47:55,900 --> 00:47:58,666 So, that gives us a chance to make change 1125 00:47:58,666 --> 00:48:01,400 and make decisions that can be beneficial to all. 1126 00:48:01,400 --> 00:48:05,233 However, we can very much have all these conversations 1127 00:48:05,233 --> 00:48:06,900 and not make any decisions. 1128 00:48:06,900 --> 00:48:08,866 So, nothing actually does change. 1129 00:48:08,866 --> 00:48:12,200 So, my fear is that while we have this opportunity 1130 00:48:12,200 --> 00:48:14,233 that we don't take advantage of it, 1131 00:48:14,233 --> 00:48:19,066 and then we're continuing to live through the same things 1132 00:48:19,066 --> 00:48:20,566 years from now. 1133 00:48:20,566 --> 00:48:23,133 Are we providing basic human rights, basic human needs? 
1134 00:48:23,133 --> 00:48:25,400 And then beyond that, like, how do we incentivize 1135 00:48:25,400 --> 00:48:28,900 that innovation which pushes GDP, that solves cancer, 1136 00:48:28,900 --> 00:48:32,200 that brings AI into the new age 1137 00:48:32,200 --> 00:48:37,066 and allows us to just sit and make music all day 1138 00:48:37,066 --> 00:48:39,466 instead of having to worry about, you know, 1139 00:48:39,466 --> 00:48:41,066 building another slide deck. 1140 00:48:41,066 --> 00:48:42,533 Woman: I would like to challenge you on GDP. 1141 00:48:42,533 --> 00:48:44,466 Man: OK. Woman: Because I feel like GDP 1142 00:48:44,466 --> 00:48:46,133 is what we're all chasing now. 1143 00:48:46,133 --> 00:48:48,900 The question is, like, up until what point? 1144 00:48:48,900 --> 00:48:50,400 Some people or some corporations 1145 00:48:50,400 --> 00:48:51,900 are super successful, 1146 00:48:51,900 --> 00:48:54,033 they drive innovation, they drive the GDP. 1147 00:48:54,033 --> 00:48:56,066 But actually, the rights you're chasing after 1148 00:48:56,066 --> 00:49:00,300 or the equality or equity isn't achieved because of that. 1149 00:49:00,300 --> 00:49:02,133 And I feel like GDP doesn't account for 1150 00:49:02,133 --> 00:49:03,766 all of those things we care about, 1151 00:49:03,766 --> 00:49:05,933 and I feel like that's not always money-related. 1152 00:49:05,933 --> 00:49:07,300 It sounds like to answer your question, 1153 00:49:07,300 --> 00:49:08,800 we need a social contract. 1154 00:49:08,800 --> 00:49:10,933 That's the thing I'm tying together 1155 00:49:10,933 --> 00:49:14,433 from everyone's answers is there's a need 1156 00:49:14,433 --> 00:49:17,800 for accountability from people 1157 00:49:17,800 --> 00:49:19,933 that we're going to call leaders. 
1158 00:49:19,933 --> 00:49:22,700 It sounds like we need to hold each other accountable 1159 00:49:22,700 --> 00:49:27,466 in some contractual way that emphasizes the need for rights 1160 00:49:27,466 --> 00:49:28,866 and no harm to others. 1161 00:49:28,866 --> 00:49:33,766 ♪ 1162 00:49:33,766 --> 00:49:36,466 Wallach, voice-over: When I look around and I see people 1163 00:49:36,466 --> 00:49:38,933 working on projects and ideas, 1164 00:49:38,933 --> 00:49:41,933 they're not all just thinking, how do I get as much as I can? 1165 00:49:41,933 --> 00:49:44,466 Like, what's my little pot of gold? 1166 00:49:44,466 --> 00:49:47,866 I see the folks who are building and making 1167 00:49:47,866 --> 00:49:50,366 better tomorrows for themselves, 1168 00:49:50,366 --> 00:49:53,633 for their kids, my kids, future generations. 1169 00:49:53,633 --> 00:49:57,200 I mean, I see things that are being worked on right now 1170 00:49:57,200 --> 00:49:59,933 that are going to have downrange impact 1171 00:49:59,933 --> 00:50:03,033 for the better for hundreds of years. 1172 00:50:03,033 --> 00:50:05,966 Courtois, voice-over: We're in a world of turmoil. 1173 00:50:05,966 --> 00:50:09,433 The human nature in turmoil is to kind of cocoon 1174 00:50:09,433 --> 00:50:12,300 and to become more conservative. 1175 00:50:12,300 --> 00:50:15,800 If you're insular, then you don't learn. 1176 00:50:15,800 --> 00:50:17,633 We could do better. 1177 00:50:17,633 --> 00:50:19,600 We can imagine a better future 1178 00:50:19,600 --> 00:50:21,733 and work on it together. 1179 00:50:23,100 --> 00:50:25,300 Ingels, voice-over: My son was born right around the time 1180 00:50:25,300 --> 00:50:28,133 we finished the CopenHill Power Plant. 1181 00:50:28,133 --> 00:50:30,100 So, he doesn't know... Wallach: Mm. 1182 00:50:30,100 --> 00:50:31,700 that there was a time when 1183 00:50:31,700 --> 00:50:34,366 you didn't ski on the power plant. 
1184 00:50:34,366 --> 00:50:36,200 For him, that's just like a natural part 1185 00:50:36,200 --> 00:50:40,133 of the landscape of Copenhagen. 1186 00:50:40,133 --> 00:50:43,200 So, if that's the normal for him 1187 00:50:43,200 --> 00:50:45,433 and his generation of Danish kids, 1188 00:50:45,433 --> 00:50:48,100 imagine when they have to start coming up 1189 00:50:48,100 --> 00:50:50,300 with what-if scenarios for their future. 1190 00:50:50,300 --> 00:50:52,700 They're going to come up with some pretty wild stuff. 1191 00:50:54,266 --> 00:50:56,966 Bayer, voice-over: Innovation occurs at the intersection of things. 1192 00:50:56,966 --> 00:50:59,533 The greatest opportunities are in chaos. 1193 00:50:59,533 --> 00:51:02,500 In times of, like, change, 1194 00:51:02,500 --> 00:51:06,366 there's a maximum opportunity to change everything. 1195 00:51:06,366 --> 00:51:08,266 The future is all about integrating 1196 00:51:08,266 --> 00:51:11,766 different interdisciplinary areas into one. 1197 00:51:11,766 --> 00:51:14,100 So, anybody who is younger, 1198 00:51:14,100 --> 00:51:16,333 go beyond just the computer science. 1199 00:51:16,333 --> 00:51:18,033 Look at the arts. 1200 00:51:18,033 --> 00:51:19,866 Look at what's happening in marine biology 1201 00:51:19,866 --> 00:51:21,666 and then neuroscience. 1202 00:51:21,666 --> 00:51:23,866 And you'll be able to bring all of those ideas into one 1203 00:51:23,866 --> 00:51:25,200 and create something that 1204 00:51:25,200 --> 00:51:26,766 nobody might have thought about. 1205 00:51:27,866 --> 00:51:29,533 Slat, voice-over: Progress is not inevitable, 1206 00:51:29,533 --> 00:51:33,666 and it really requires conscious effort. 1207 00:51:33,666 --> 00:51:35,766 Sometimes, people ask me, why you? 1208 00:51:35,766 --> 00:51:37,200 Why did you decide to work on this? 1209 00:51:37,200 --> 00:51:40,366 And I think it's a strange question. 
1210 00:51:40,366 --> 00:51:43,166 A much more interesting question to me 1211 00:51:43,166 --> 00:51:46,500 is why isn't everyone doing this? 1212 00:51:46,500 --> 00:51:49,200 If there is something that's bothering you, 1213 00:51:49,200 --> 00:51:52,866 I think it would be strange to just wait 1214 00:51:52,866 --> 00:51:54,500 for somebody else to solve it. 1215 00:51:54,500 --> 00:51:58,100 ♪ 1216 00:51:58,100 --> 00:52:00,433 Wallach, voice-over: Hundreds of years from now, 1217 00:52:00,433 --> 00:52:01,933 they are going to look back 1218 00:52:01,933 --> 00:52:04,266 on our moment that we're in right now 1219 00:52:04,266 --> 00:52:08,766 as potentially the most pivotal in human history. 1220 00:52:08,766 --> 00:52:11,733 The decisions that we make around our technologies 1221 00:52:11,733 --> 00:52:14,433 and how we're going to live on this planet 1222 00:52:14,433 --> 00:52:18,933 will actually dictate who and what we are to become. 1223 00:52:18,933 --> 00:52:23,600 We have to ask ourselves, 1224 00:52:18,933 --> 00:52:23,600 in this moment of complexity, 1225 00:52:23,600 --> 00:52:26,600 what is it that we want to see happen? 1226 00:52:26,600 --> 00:52:28,066 Where do we want to go? 1227 00:52:28,066 --> 00:52:32,766 ♪ 1228 00:52:33,466 --> 00:53:02,033 ♪
