1
00:00:00,420 --> 00:00:07,320
Well, Allison should know. What are you saying? I don't know what the Internet is. That massive computer,
2
00:00:07,380 --> 00:00:10,870
the one that's becoming really big now?
3
00:00:11,010 --> 00:00:11,400
What do you mean?
4
00:00:11,440 --> 00:00:16,350
That's just it, you know. What do you write to it, like mail? I don't know, a lot of people use it and communicate.
5
00:00:16,410 --> 00:00:18,960
I guess they can communicate with NBC writers and producers.
6
00:00:18,960 --> 00:00:21,180
Allison, can you explain what the Internet is?
7
00:00:29,140 --> 00:00:30,590
How amazing is that?
8
00:00:30,670 --> 00:00:34,470
Just over 20 years ago people didn't even know what the internet was.
9
00:00:34,510 --> 00:00:37,260
And today we can't even imagine our lives without it.
10
00:00:37,300 --> 00:00:39,560
Welcome to the Deep Learning A-Z course.
11
00:00:39,580 --> 00:00:44,800
My name is Kirill Eremenko, and along with my co-instructor Hadelin de Ponteves, we're super excited to have you
12
00:00:44,800 --> 00:00:45,570
on board.
13
00:00:45,610 --> 00:00:51,670
And today we're going to give you a quick overview of what deep learning is and why it's picking up right
14
00:00:51,670 --> 00:00:52,300
now.
15
00:00:52,300 --> 00:00:53,590
So let's get started.
16
00:00:53,590 --> 00:00:57,640
So why did we have a look at that clip, and what is this photo over here?
17
00:00:57,670 --> 00:01:00,310
Well that clip was from 1994.
18
00:01:00,310 --> 00:01:03,110
This is a photo of a computer from 1980.
19
00:01:03,220 --> 00:01:09,490
And the reason why we're delving into history a little bit is because neural networks, along with
20
00:01:09,490 --> 00:01:15,670
deep learning, have been around for quite some time, but they've only started picking up and impacting
21
00:01:15,670 --> 00:01:16,640
the world right now.
22
00:01:16,750 --> 00:01:22,510
But if you look back at the 80s you'll see that even though they were invented in the 60s and 70s they
23
00:01:22,510 --> 00:01:30,760
really caught on and became a trend in the 80s, so people were talking about them a lot.
24
00:01:30,760 --> 00:01:35,890
There was a lot of research in that area and everybody thought that deep learning or neural networks
25
00:01:35,890 --> 00:01:41,830
were this new thing that was going to impact the world, going to change everything, going to solve
26
00:01:41,860 --> 00:01:46,060
all the world's problems. And then they kind of slowly died off over the next decade.
27
00:01:46,080 --> 00:01:51,210
And so what happened? Why did the neural networks not survive and not change the world? What
28
00:01:51,200 --> 00:01:51,750
was it?
29
00:01:51,940 --> 00:01:57,220
Was the reason that they were just not good enough, that they're not that good at predicting things,
30
00:01:57,270 --> 00:02:02,320
not that good at modeling, and simply not a good invention?
31
00:02:02,350 --> 00:02:03,400
Or is there another reason?
32
00:02:03,400 --> 00:02:08,410
Well, actually there is another reason, and the reason is in front of us. It's the fact that technology
33
00:02:08,410 --> 00:02:15,640
back then was not up to the standard required to facilitate neural networks. In order for neural networks
34
00:02:15,700 --> 00:02:17,130
and deep learning to work properly.
35
00:02:17,140 --> 00:02:21,880
you need two things: you need data, a lot of data, and you need processing power, you need
36
00:02:21,880 --> 00:02:25,960
strong computers to process that data and facilitate the neural networks.
37
00:02:25,990 --> 00:02:32,900
So let's have a look at how data, or the storage of data, has evolved over the years, and then we'll look
38
00:02:32,900 --> 00:02:34,880
at how technology has evolved.
39
00:02:34,880 --> 00:02:39,350
So here we go, we've got three years: 1956, 1980, 2017.
40
00:02:39,830 --> 00:02:43,220
What did storage look like back in 1956?
41
00:02:43,280 --> 00:02:47,600
Well, there's a hard drive, and that hard drive is only a five,
42
00:02:47,750 --> 00:02:54,980
wait for it, megabyte hard drive. That's five megabytes right there on the forklift, the size of a small
43
00:02:54,980 --> 00:03:01,040
room. That's a hard drive being transported to another location on a plane.
44
00:03:01,370 --> 00:03:04,670
And that is what storage looked like
45
00:03:04,700 --> 00:03:11,360
in 1956. A company had to pay two and a half thousand dollars, in those days' dollars, to
46
00:03:11,390 --> 00:03:16,400
rent that hard drive. To rent it, not buy it: rent it for one month.
47
00:03:16,400 --> 00:03:18,760
In 1980 the situation improved a little bit.
48
00:03:18,800 --> 00:03:24,290
So here we've got a 10-megabyte hard drive for three and a half thousand dollars. It's still very expensive,
49
00:03:24,290 --> 00:03:27,260
and only 10 megabytes, so that's like one photo these days.
50
00:03:27,260 --> 00:03:36,840
And today, in 2017, we've got a 256-gigabyte SD card for $150, which can fit on your finger.
51
00:03:37,100 --> 00:03:43,790
And if you're watching this video a year later, or in 2019 or 2025, you're probably laughing at us all,
52
00:03:43,790 --> 00:03:47,240
because by then you'll have even greater storage capacity.
53
00:03:47,240 --> 00:03:52,910
But nevertheless the point stands: if we compare these across the board, and we're not even taking price and
54
00:03:52,910 --> 00:03:58,370
size into consideration, just the capacity of whatever was trending at the time.
55
00:03:58,370 --> 00:04:04,090
So from 1956 to 1980, capacity roughly doubled.
56
00:04:04,250 --> 00:04:12,380
And then it increased about twenty-five thousand six hundred times. And note, the length of the period
57
00:04:12,380 --> 00:04:20,210
is not that different: from 1956 to 1980 is 24 years, from 1980 to 2017 is 37 years. So not that much
58
00:04:20,210 --> 00:04:24,610
of a difference in time, but a huge jump in technological progress.
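As an aside, the capacity jumps quoted above are easy to verify with a quick back-of-the-envelope calculation. This is a sketch, not code from the course, and it uses decimal megabytes to match the "twenty-five thousand six hundred" figure quoted:

```python
# Back-of-the-envelope check of the storage-capacity jumps mentioned above.
# Capacities in decimal megabytes: 1956 (5 MB), 1980 (10 MB), 2017 (256 GB).
mb_1956 = 5
mb_1980 = 10
mb_2017 = 256 * 1000  # 256 GB expressed in decimal megabytes

print(mb_1980 / mb_1956)         # 2.0 -> capacity roughly doubled
print(mb_2017 / mb_1980)         # 25600.0 -> "about twenty-five thousand six hundred times"
print(1980 - 1956, 2017 - 1980)  # 24 37 -> the two periods are comparable in length
```

So a roughly 2x jump over 24 years versus a 25,600x jump over 37 years, which is the exponential (not linear) growth discussed next.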
59
00:04:24,890 --> 00:04:28,220
And that goes to show that this is not a linear trend.
60
00:04:28,220 --> 00:04:33,830
This is exponential growth in technology, and if we also take into account price and size, you
61
00:04:33,910 --> 00:04:37,090
will be looking at an increase in the millions of times.
62
00:04:37,280 --> 00:04:40,540
And here we actually have a chart on a logarithmic scale.
63
00:04:40,640 --> 00:04:46,290
So if we plot the hard drive cost per gigabyte, you'll see that it looks something like this.
64
00:04:46,430 --> 00:04:49,880
We're very quickly approaching zero.
65
00:04:49,910 --> 00:04:54,200
Right now you can get storage on Dropbox and Google Drive which doesn't cost you anything.
66
00:04:54,200 --> 00:05:01,280
Cloud storage. And that's going to continue; in fact, over the years this is going to go even further.
67
00:05:01,280 --> 00:05:05,860
Right now scientists are looking into using DNA for storage.
68
00:05:05,960 --> 00:05:13,720
And right now it's quite expensive: it costs $7,000 to synthesize two megabytes of data, and then another
69
00:05:13,730 --> 00:05:15,210
$2,000 to read it.
70
00:05:15,230 --> 00:05:19,670
But that kind of reminds you of this whole situation with the hard drive and the plane. You know that this
71
00:05:19,670 --> 00:05:24,590
is going to be mitigated very, very quickly with this exponential curve. Ten years from now, twenty years
72
00:05:24,590 --> 00:05:28,590
from now, everybody's going to be using DNA storage if we go down this direction.
73
00:05:28,660 --> 00:05:33,770
And here are some stats on that so you can explore it further. Maybe pause the video if you
74
00:05:33,770 --> 00:05:35,360
want to read a bit more about this.
75
00:05:35,360 --> 00:05:36,890
This is from nature.com.
76
00:05:37,040 --> 00:05:44,990
And basically you can store all of the world's data in just one kilogram of DNA storage, or
77
00:05:44,990 --> 00:05:49,350
you can store about 1 billion terabytes of data in one gram of DNA storage.
78
00:05:49,350 --> 00:05:56,150
So that's just something to show how quickly we're progressing, and this is why deep learning
79
00:05:56,150 --> 00:06:02,840
is picking up now: we are finally at the stage where we have enough data to train super cool, super
80
00:06:02,840 --> 00:06:04,280
sophisticated models.
81
00:06:04,280 --> 00:06:08,350
Back then, in the 80s, when it was all first invented, that just wasn't the case.
82
00:06:08,690 --> 00:06:15,740
And the second thing we talked about is processing capacity. So here we've got an exponential curve again,
83
00:06:16,460 --> 00:06:17,850
on a log scale.
84
00:06:17,850 --> 00:06:23,840
It's portrayed as a straight line here because it's a log scale, and this is how computers have
85
00:06:23,840 --> 00:06:24,410
been evolving.
86
00:06:24,410 --> 00:06:30,530
So again, feel free to pause the slide. This is called Moore's Law, you've probably heard of it: how quickly
87
00:06:30,530 --> 00:06:34,120
the processing capacity of computers has been evolving.
88
00:06:34,310 --> 00:06:39,920
Right now we're somewhere over here, where an average computer you can buy for a thousand bucks thinks at
89
00:06:39,920 --> 00:06:47,960
the speed of the brain of a rat, and between 2023 and 2025 it will be at the speed of a human, and then
90
00:06:48,470 --> 00:06:54,740
by 2045 or 2050 it will surpass all of the humans combined.
91
00:06:54,740 --> 00:07:01,610
So basically we're entering the era of computers that are extremely powerful that can process things
92
00:07:01,610 --> 00:07:05,670
way faster than we can imagine.
93
00:07:05,720 --> 00:07:08,490
And that is what is facilitating deep learning.
94
00:07:08,600 --> 00:07:14,510
So all of this brings us to the question: what is deep learning? What is this whole neural network
95
00:07:14,510 --> 00:07:18,390
situation? What is going on? What are we even talking about here?
96
00:07:18,410 --> 00:07:20,490
And you've probably seen a picture of something like this.
97
00:07:20,570 --> 00:07:21,590
So let's dive into it.
98
00:07:21,590 --> 00:07:29,140
What is deep learning? This gentleman over here, Geoffrey Hinton, is known as the godfather of deep learning.
99
00:07:29,350 --> 00:07:37,030
And he did research on deep learning in the 80s, and he's done lots and lots of work, lots of research
100
00:07:37,030 --> 00:07:41,370
papers he's published on deep learning. Right now,
101
00:07:41,370 --> 00:07:42,920
he works at Google.
102
00:07:43,030 --> 00:07:47,770
So a lot of the things that we're going to be talking about actually come from Geoffrey Hinton, and you
103
00:07:47,770 --> 00:07:48,260
can check out a lot of his work.
104
00:07:48,300 --> 00:07:49,840
He's got quite a few YouTube videos.
105
00:07:49,840 --> 00:07:53,930
He explains things really well so I highly recommend checking them out.
106
00:07:54,130 --> 00:07:59,140
And so the idea behind deep learning is to look at the human brain.
107
00:07:59,140 --> 00:08:01,710
And there's going to be quite a bit of neuroscience coming up
108
00:08:01,750 --> 00:08:09,370
in these tutorials. What we're trying to do here is to mimic how the human brain operates.
109
00:08:09,400 --> 00:08:13,300
And you know, we don't know that much, we don't know everything about the human brain, but the little
110
00:08:13,300 --> 00:08:16,730
that we do know, we want to mimic it and recreate it.
111
00:08:16,730 --> 00:08:17,230
And why is that?
112
00:08:17,230 --> 00:08:21,970
Well, because the human brain seems to be one of the most powerful tools on this planet for learning,
113
00:08:22,300 --> 00:08:28,690
for learning and adapting skills and then applying them, and if computers could copy that, then we could just
114
00:08:28,990 --> 00:08:32,980
leverage what natural selection has already decided for us.
115
00:08:32,980 --> 00:08:37,840
All of those algorithms that it has decided are the best, we're going to leverage.
116
00:08:37,840 --> 00:08:39,750
Why reinvent the bicycle, right?
117
00:08:39,880 --> 00:08:41,700
So let's see how this works.
118
00:08:41,710 --> 00:08:50,260
Here we've got some neurons. These neurons have been smeared onto glass and then been looked
119
00:08:50,260 --> 00:08:52,210
at under a microscope with some coloring.
120
00:08:52,270 --> 00:08:57,310
And here you can see what they look like: they have a body, they have these branches, and they
121
00:08:57,310 --> 00:09:02,700
have these tails, and you can see they have a nucleus inside, in the middle, and that's
122
00:09:02,710 --> 00:09:06,940
basically what a neuron looks like in the human brain.
123
00:09:06,970 --> 00:09:12,310
There are approximately 100 billion neurons altogether. These are individual neurons; these are actually
124
00:09:12,310 --> 00:09:17,380
motor neurons, because they're bigger and easier to see, but nevertheless there are a hundred billion
125
00:09:17,470 --> 00:09:19,740
neurons in the human brain.
126
00:09:20,000 --> 00:09:23,920
And each one is connected to as many as about a thousand of its neighbors.
127
00:09:23,920 --> 00:09:26,590
So to give you a picture this is what it looks like.
128
00:09:26,590 --> 00:09:32,140
This is an actual dissection of the human brain.
129
00:09:32,140 --> 00:09:38,950
And this is the cerebellum which is this part of your brain at the back.
130
00:09:38,950 --> 00:09:47,140
It is responsible for things like motor skills, for keeping balance, and some language capabilities
131
00:09:47,140 --> 00:09:47,680
and things like that.
132
00:09:47,680 --> 00:09:57,460
So this is just to show how vast it is, how many neurons there are: billions and billions and billions
133
00:09:57,460 --> 00:10:02,770
of neurons all connecting. We're not talking about five, or five hundred, or a thousand, or millions:
134
00:10:02,770 --> 00:10:04,780
there are billions of neurons in there.
135
00:10:04,910 --> 00:10:08,350
And so that's what we're going to be trying to recreate.
136
00:10:08,350 --> 00:10:11,700
So how do we recreate this in a computer?
137
00:10:11,890 --> 00:10:20,200
Well, we create an artificial structure called an artificial neural network, where we have nodes, or neurons,
138
00:10:20,590 --> 00:10:26,500
and we're going to have some neurons for input values. So these are values that you know about
139
00:10:26,500 --> 00:10:27,340
a certain situation.
140
00:10:27,340 --> 00:10:32,080
So for instance, you're modeling something, you want to predict something: you always have to have some input,
141
00:10:32,080 --> 00:10:33,310
something to start
142
00:10:33,310 --> 00:10:36,740
your prediction off. And that's called the input layer.
143
00:10:36,820 --> 00:10:38,100
Then you have the output.
144
00:10:38,140 --> 00:10:43,780
So that's the value that you want to predict. For instance, whether somebody is going to leave
145
00:10:43,930 --> 00:10:45,980
the bank or stay with the bank.
146
00:10:46,260 --> 00:10:50,530
Is this a fraudulent transaction or a real transaction? And so on.
147
00:10:50,890 --> 00:10:52,420
So that's going to be the output layer.
148
00:10:52,540 --> 00:10:55,330
And in between we're going to have a hidden layer.
149
00:10:55,330 --> 00:11:01,600
So as you saw, in your brain you have so many neurons, and some information is coming in through
150
00:11:01,600 --> 00:11:04,990
your eyes, ears, nose: basically, your senses.
151
00:11:05,140 --> 00:11:10,210
And then it's not just going right away to the output where you have the result. It's going through
152
00:11:10,240 --> 00:11:15,130
all of these billions and billions and billions of neurons before it gets to the output, and this is the whole
153
00:11:15,130 --> 00:11:17,050
concept behind it: we're going to model the brain.
154
00:11:17,050 --> 00:11:23,170
So we need these hidden layers that sit there before the output. So the input layer's neurons are connected
155
00:11:23,170 --> 00:11:26,580
to the hidden layer's neurons, and those neurons are connected to the output.
156
00:11:26,880 --> 00:11:29,280
And so this is pretty cool.
157
00:11:29,290 --> 00:11:30,550
But what is this all about?
158
00:11:30,550 --> 00:11:34,040
Where is the deep learning here? Why is it called deep? There's nothing deep in here.
159
00:11:34,090 --> 00:11:41,090
Well, this is kind of an option, which one might call shallow learning, where there isn't much depth
160
00:11:41,110 --> 00:11:41,890
going on.
161
00:11:41,890 --> 00:11:43,400
But why is it called deep learning?
162
00:11:43,420 --> 00:11:50,020
Well, because then we take this to the next level: we take it even further, and we have not just one
163
00:11:50,020 --> 00:11:58,180
hidden layer, we have lots and lots and lots of hidden layers, and then we connect everything, just like
164
00:11:58,180 --> 00:12:01,930
in the human brain: connect everything, interconnect everything.
165
00:12:01,930 --> 00:12:08,290
And that's how the input values are processed through all these hidden layers just like in the human
166
00:12:08,290 --> 00:12:08,660
brain.
167
00:12:08,770 --> 00:12:12,380
Then we have an output value, and now we're talking deep learning.
168
00:12:12,460 --> 00:12:15,910
So that's what deep learning is all about, on a very abstract level.
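To make the layered structure described above concrete, here is a minimal sketch in Python. This is illustrative only, not code from the course: the layer sizes, the random weights, and the yes/no bank-churn framing of the output are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # A common activation function: passes positive values, zeroes out negatives.
    return np.maximum(0.0, x)

# Arbitrary illustration sizes: 3 input neurons, two hidden layers
# of 4 neurons each, and 1 output neuron.
sizes = [3, 4, 4, 1]
weights = [rng.standard_normal((m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    # Each hidden layer takes a weighted sum of the previous layer's
    # outputs, adds a bias, and applies the activation.
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)
    # Sigmoid output for a yes/no question, e.g. "will this customer
    # leave the bank?", giving a value between 0 and 1.
    z = x @ weights[-1] + biases[-1]
    return 1.0 / (1.0 + np.exp(-z))

print(forward(np.array([0.5, -1.2, 3.0])))  # a probability between 0 and 1
```

Note that the weights here are random, so the output is meaningless; in practice they are learned from data during training, which is exactly why the large datasets discussed earlier matter.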
169
00:12:15,940 --> 00:12:21,180
And in further tutorials we're going to dissect and dive deep into deep learning, and by the end of
170
00:12:21,180 --> 00:12:26,480
it you will know what deep learning is all about, and you will know how to apply it in your projects.
171
00:12:26,500 --> 00:12:31,910
We're super excited about this, can't wait to get started, and I look forward to seeing you in the next tutorial.
172
00:12:31,950 --> 00:12:33,700
Until then, enjoy deep learning.