064 How do Neural Networks work (transcript)

A very exciting tutorial ahead. Welcome back to the course on deep learning. Today we're talking about how neural networks work. Now, we've laid a lot of groundwork: we've talked about how neural networks are structured, what elements they consist of, and even their functionality. Today we're going to look at a real example of how a neural network can be applied, and we're actually going to work step by step through the process of its application so we know what is going on. So let's have a look at what example we're going to be talking about. We're going to be looking at property valuation: a neural network that takes in some parameters of a property and values it. There's a small caveat for today's tutorial, and that is that we're not actually going to train the network. A very important part of neural networks is training them up, and we're going to look at that in the next tutorials in this section.
For now we're going to focus on the actual application: we're going to work with a neural network that we're going to pretend is already trained up. That will allow us to focus on the application side of things and not get bogged down in the training aspect, and then we'll cover the training once we already know the end goal we're working towards. Sounds good? All right, let's jump straight into it. So let's say we have some input parameters, say four parameters about the property: the area in square feet, the number of bedrooms, the distance to the city in miles (New York City, say), and the age of the property. All four of those are going to comprise our input layer. Now, of course, there are probably way more parameters that define the price of a property, but for simplicity's sake we're going to look at just these for now. In its very basic form, a neural network has only an input layer and an output layer, no hidden layers, and our output layer is the price that we're predicting. In this form, the input variables would just be weighted up by the synapses and then the output would be calculated: basically, the price would be calculated and we would get a price. For instance, the price could be calculated as simply as the weighted sum of all of the inputs.
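The weighted-sum setup just described can be sketched in a few lines. The weight values below are invented purely for illustration; a real network would learn them during training.

```python
# Price as a plain weighted sum of the four inputs: no hidden layer,
# no activation function. The weights are hypothetical, not learned.

def predict_price(area_sqft, bedrooms, distance_miles, age_years):
    weights = [350.0, 15000.0, -2000.0, -500.0]  # made-up "learned" weights
    inputs = [area_sqft, bedrooms, distance_miles, age_years]
    return sum(w * x for w, x in zip(weights, inputs))

print(predict_price(2000, 3, 10, 20))  # -> 715000.0
```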
And again, here you could use pretty much any function. We could use any of the activation functions we covered previously, you could use logistic regression, you could use a squared function; you can do pretty much anything here, but the point is that you get a certain output. Moreover, most of the machine learning algorithms that exist can be represented in this form: this is basically a diagrammatic representation of how you deal with the variables, and by changing the way it's formalised you can express quite a lot of the machine learning algorithms that we've talked about before in this same form. That just goes to show how powerful neural networks are: even without the hidden layers, we already have a representation that works for most other machine learning algorithms. But what neural networks do have is an advantage that gives us lots of flexibility and power, and that is where the increase in accuracy comes from. That power is the hidden layers. And there we go, that's our hidden layer; we've added it in, and now we're going to understand how that hidden layer gives us the extra power.
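As a rough sketch of the point about representing other algorithms, the same weighted-sum diagram turns into different classic models depending only on the output function chosen. The function names and values here are illustrative, not the course's notation:

```python
import math

def weighted_sum(weights, inputs, bias=0.0):
    return bias + sum(w * x for w, x in zip(weights, inputs))

def linear_model(weights, inputs):
    # identity output function: ordinary linear regression
    return weighted_sum(weights, inputs)

def logistic_model(weights, inputs):
    # sigmoid output function: logistic regression
    return 1.0 / (1.0 + math.exp(-weighted_sum(weights, inputs)))

print(linear_model([2.0, 3.0], [1.0, 1.0]))  # -> 5.0
print(logistic_model([1.0], [0.0]))          # -> 0.5
```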
And in fact, to do that, we're going to walk through an example. As we agreed, this neural network has already been trained up, and now we're going to imagine that we're plugging in a property, and we're going to walk step by step through how the neural network deals with the input variables, calculates the hidden layer, and then calculates the output. So let's go through this; it's going to be exciting. All right. We've got all four variables on the left, and we're going to start with the top neuron in the hidden layer. Now, as we previously saw, all of the neurons in the input layer have synapses connecting each one of them to the top neuron in the hidden layer, and those synapses have weights. Let's agree that some weights will have a non-zero value and some weights will have a zero value, because not all inputs will be important for every single neuron; sometimes an input will not be important. Here we can see an example where x1 and x3, the area and the distance to the city in miles, are important for that neuron, whereas bedrooms and age are not. Let's think about this for a second: why would that be the case? Why would a neuron be linked to just the area and the distance, and what could that mean?
Well, it could mean this: normally, the further away you get from the city, the cheaper real estate becomes, and therefore properties get larger in square feet. For the same price you can get a larger property the further away you go from the city; that's normal, right? That makes sense. And probably what this neuron is doing, like a sniper, is looking specifically for properties which are not so far from the city but still have a large area: properties that, for their distance from the city, have an unusually high square footage. Something abnormal, higher than average: quite close to the city yet still large compared to the other properties at the same distance. So that neuron (again, we're speculating here) might be picking out those specific properties, and it will activate, hence the activation function; it will fire up only when that certain criterion is met. It takes the distance to the city and the area of the property, performs calculations inside itself, combines those two, and as soon as the condition is met it fires up, and that contributes to the price at the output. And therefore this neuron doesn't really care about the bedrooms and the age of the property, because it's focused on that specific thing.
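That "sniper" behaviour can be sketched as a single hidden neuron with zero weights on the inputs it ignores and a rectifier that only fires past a threshold. Every number below is an assumption made up for illustration:

```python
# One hypothetical hidden neuron: it only looks at area and distance
# (the bedroom and age weights are zero) and fires through a rectifier
# only for properties that are large relative to their distance.

def relu(z):
    return max(0.0, z)

def area_distance_neuron(area_sqft, bedrooms, distance_miles, age_years):
    w = [1.0, 0.0, -100.0, 0.0]  # zero weights: bedrooms and age ignored
    bias = -500.0
    z = (w[0] * area_sqft + w[1] * bedrooms
         + w[2] * distance_miles + w[3] * age_years + bias)
    return relu(z)  # stays at zero unless the criterion is met

print(area_distance_neuron(2000, 3, 10, 50))  # large and close-in: fires (500.0)
print(area_distance_neuron(1200, 3, 10, 50))  # too small for its distance: 0.0
```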
That's where the power of the neural network comes from, because you have many of these neurons, and we'll see just now how the other ones work. But what I want to agree here is this: let's not even draw the lines for the synapses that are not in place, so that we don't clutter up our image; that's the only reason we're not going to draw them. So let's just get rid of those, and that way we will know exactly: OK, this neuron is focused on area and distance to the city. All right, so as long as we agree on that, let's move on to the next one. Let's take the one in the middle. Here we've got three parameters feeding into this neuron: the area, the bedrooms and the age of the property. So what could be the reason here? Again, let's try to understand the intuition and the thinking of this neuron. How is this neuron thinking? Why is it picking these three parameters? What could it have found in the data? We've already established this network was trained up; the training happened some time ago, maybe a day ago, or somebody else ran it, and now we're just applying it. And we know that this neuron, through all of the thousands of example properties, has found out that the combination of the area plus the bedrooms plus the age is important.
Why could that be the case? Well, for instance, maybe in the specific city and suburbs that this neural network has been trained on, there are a lot of families with two or more children who are looking for large properties with lots of bedrooms, but which are new, not old, properties. Maybe in that area big properties are usually old, but there are lots of modern families; maybe there has been a socio-demographic shift, or a lot of growth in employment and jobs for the younger population. Maybe the population demographics have changed, and now younger couples and families are looking for properties, but they prefer new ones, so they want the age of the property to be lower. And hence, from the training that this neural network has undergone, it knows that a newer property with a high area and lots of bedrooms, at least three (for the parents, for the first child, for the second child, maybe a guest room), is valuable in that market. So that neuron has picked that up. It knows: OK, this is what I'm going to be looking for.
I don't care about the distance to the city in miles; wherever the property is, as long as it has a high area and lots of bedrooms, as soon as that criterion is met, the neuron fires up. And this combining of parameters, again, is where the power of the neural network comes from, because it combines those parameters into a brand new attribute that helps with the valuation of the property, and therefore it's more precise. So there we go; that's how that works. Let's look at another one, the very bottom one. For instance, this neuron could have picked up just one parameter: it could have picked up just the age and none of the other ones. How could that be the case? Well, this is a classic example of what age could mean. As we all know, an older property is usually less valuable because it's worn out: the building is old, things are falling apart, more maintenance is required, so the price of the real estate drops, whereas a brand new building would be more expensive because it's brand new. Perhaps, though, if a property is over a certain age, that could indicate that it's a historic property.
For instance, if a property is under 100 years old, then the older it is, the less valuable it is. But as soon as it jumps over 100 years old, all of a sudden it becomes a historic property, because this is a property where people lived hundreds of years ago. It tells a story; it's got all this history behind it, and some people like that, some people value that. In fact, quite a lot of people would be proud to live in such a property, especially in the higher socio-economic classes; they would show it off to their friends and things like that. And therefore properties that are over 100 years old could be deemed historic. So this neuron, as soon as it sees a property over 100 years old, will fire up and contribute to the overall price; otherwise, if it's under 100 years old, it won't fire. And this is a good example of the rectifier function being applied: here you've got zero until a certain point, let's say 100 years old, and after 100 years the older the property gets, the higher the contribution of this neuron to the overall price. That's a wonderful, very simple example of the rectifier function in action. So there we go; that could be this neuron.
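A toy version of this "historic property" neuron, assuming a rectifier over the age with an invented 100-year threshold and an invented weight:

```python
# A rectifier (ReLU) neuron over age alone: output is zero until the
# property passes a 100-year threshold, then grows with age.
# Threshold and weight are illustrative assumptions, not lecture values.

def relu(z):
    return max(0.0, z)

def historic_neuron(age_years, weight=1.0, threshold=100.0):
    return weight * relu(age_years - threshold)

print(historic_neuron(50))   # under the threshold: contributes 0.0
print(historic_neuron(150))  # 50 years past the threshold: fires (50.0)
```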
Moreover, the neural network could have picked up things that we wouldn't have thought of ourselves. For instance, bedrooms plus distance to the city: maybe that combination somehow contributes to the price, maybe not as strongly as the other neurons, but it still contributes; or maybe it detracts from the price, that could also be the case; or other things like that. And maybe a neuron picked up a combination of all four of these parameters. As you can see, these neurons, this whole hidden-layer situation, increase the flexibility of your neural network: they allow the network to look for very specific things, and then, in combination, that's where the power comes from. It's like the example of the ants: an ant by itself cannot build an anthill, but when you have a thousand or a hundred thousand ants, they can build an anthill together. That's the situation here. Each one of these neurons by itself cannot predict the price, but together they have superpowers: they predict the price, and they can do quite an accurate job if trained and set up properly. And that's what this whole course is about: understanding how to utilise them. There we go; that is a step-by-step example and walkthrough of how neural networks actually work.
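Putting the pieces together, the walkthrough can be sketched as one forward pass: each hidden neuron is a rectifier over its own weighted sum (with zero weights on the inputs it ignores), and the price is a weighted sum of the hidden activations. All numbers below are invented for illustration:

```python
# A hedged sketch of a full forward pass through one hidden layer.

def relu(z):
    return max(0.0, z)

def forward(inputs, hidden_weights, hidden_biases, output_weights):
    # each hidden neuron: rectifier over its own weighted sum
    hidden = [
        relu(sum(w * x for w, x in zip(ws, inputs)) + b)
        for ws, b in zip(hidden_weights, hidden_biases)
    ]
    # output: weighted sum of hidden activations
    return sum(w * h for w, h in zip(output_weights, hidden))

# Three hypothetical specialised neurons, as in the lecture's story:
# area/distance, area/bedrooms/age, and age-only.
hidden_weights = [
    [1.0, 0.0, -100.0, 0.0],
    [0.5, 20000.0, 0.0, -1000.0],
    [0.0, 0.0, 0.0, 1.0],
]
hidden_biases = [-500.0, 0.0, -100.0]
output_weights = [100.0, 1.0, 500.0]

# inputs: area 2000 sqft, 3 bedrooms, 10 miles out, 5 years old
price = forward([2000, 3, 10, 5], hidden_weights, hidden_biases, output_weights)
print(price)  # -> 106000.0
```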
I hope you enjoyed today's tutorial, and I can't wait to see you next time. Until then, enjoy learning.
