All of our discussion so far should have emphasized to you the importance of the features that you use to train your machine learning models. When we spoke of the basic machine learning workflow in the previous clip, we discussed the importance of the first three steps: selecting and extracting the right features to train your model. These steps are onerous and time-consuming, and together they encompass feature engineering.
What exactly is feature engineering? Well, feature engineering basically involves working with your features: engineering your features so that you get the best out of your machine learning model. And this can be anything. Feature engineering isn't just one technique, or a set of techniques, or a class of techniques. It's block-and-tackle work: you're inevitably working on and improving the features of your model.
22
00:00:50,600 --> 00:00:53,200
tends to be bespoke. It's specific to the
23
00:00:53,200 --> 00:00:55,109
problem that you're working on and the
24
00:00:55,109 --> 00:00:57,259
data that you have to work with. There are
25
00:00:57,259 --> 00:01:00,000
no set of techniques that apply to all
26
00:01:00,000 --> 00:01:02,909
classified files or regresses. You can't
27
00:01:02,909 --> 00:01:04,799
say that this kind of future engineering
28
00:01:04,799 --> 00:01:07,079
works best for noodle networks. Where is
29
00:01:07,079 --> 00:01:09,000
this other kind of future? Engineering
30
00:01:09,000 --> 00:01:11,569
works were traditionally male models. The
The ideas and principles of feature engineering lie along a continuum between art and science. It's not quite art, and it's not quite science. It's more just engineering, where you basically bring your features together in a form that builds robust models. There is no one-size-fits-all.
39
00:01:31,569 --> 00:01:34,219
very broad umbrella with encompasses a
40
00:01:34,219 --> 00:01:35,709
number of different techniques. This
41
00:01:35,709 --> 00:01:37,879
involves feature selection using
42
00:01:37,879 --> 00:01:40,060
techniques to select the most relevant
43
00:01:40,060 --> 00:01:42,500
features for your model. Feature.
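As a minimal sketch, feature selection with scikit-learn's SelectKBest might look like the following; the iris dataset, the ANOVA F-test, and k=2 are illustrative assumptions, not choices made in this clip.

# Score every feature against the target and keep the two most relevant.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)

selector = SelectKBest(score_func=f_classif, k=2)  # ANOVA F-test per feature
X_selected = selector.fit_transform(X, y)

print(X.shape, "->", X_selected.shape)  # (150, 4) -> (150, 2)

The selector simply drops the columns whose scores against the target are lowest, leaving the surviving features intact but fewer in number.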
Feature engineering also encompasses feature learning, where you can use supervised and unsupervised techniques to learn latent features that exist in your data.
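One unsupervised sketch of feature learning, under the assumption that k-means cluster distances count as learned latent features; the dataset and n_clusters=3 are illustrative.

# Learn a latent representation without labels: each sample is re-described
# by its distance to every cluster centre discovered in the data.
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X, _ = load_iris(return_X_y=True)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
latent = kmeans.fit_transform(X)  # shape (150, 3): one learned feature per cluster

print(latent.shape)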
Feature engineering includes feature extraction as well. Feature extraction involves the transformation or re-orientation of your input features into fundamentally transformed, derived features, which are often unrecognizable and cannot be interpreted.
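A small sketch of feature extraction using kernel PCA, where the derived columns no longer correspond to any single original feature; the RBF kernel and two components are assumptions for illustration.

# Transform the inputs into derived features that cannot be read off
# against the original columns.
from sklearn.datasets import load_iris
from sklearn.decomposition import KernelPCA

X, _ = load_iris(return_X_y=True)

extractor = KernelPCA(n_components=2, kernel="rbf")
X_derived = extractor.fit_transform(X)

print(X_derived.shape)  # (150, 2), but the new columns have no direct meaning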
Feature engineering also includes feature combination. You might have raw, granular features in your data. You might combine these features together to get a feature that is more meaningful and has more predictive power.
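As an illustrative sketch of feature combination, assume two raw, granular columns, height and weight; neither predicts as well alone as their combination into body mass index. This example is assumed here, not taken from the clip.

# Combine two granular features into one more meaningful feature.
import pandas as pd

df = pd.DataFrame({
    "height_m": [1.62, 1.75, 1.80],
    "weight_kg": [55.0, 80.0, 95.0],
})

df["bmi"] = df["weight_kg"] / df["height_m"] ** 2  # derived, more predictive feature

print(df)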
And finally, feature engineering includes dimensionality reduction as well. Dimensionality reduction involves reducing the complexity of your input data. It also involves re-orienting your features along new axes which better represent your data.
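A minimal sketch of dimensionality reduction with PCA, which re-orients the features along new axes (the principal components); the 95% variance threshold is an illustrative assumption.

# Re-orient the data along new axes and keep only as many as are needed
# to explain 95% of the variance.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)

print(X.shape, "->", X_reduced.shape)
print(pca.explained_variance_ratio_)  # variance captured by each new axis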