4 | 4 | Machine learning allows computers to find hidden insights without being explicitly programmed where to look or what to look for.
5 | 5 |
6 | 6 | 00:00:06.840 --> 00:00:14.060
7 |   | -Thanks to the work of some dedicated developers, Python has one of the best machine learning platforms out there called Scikit-Learn.
  | 7 | +Thanks to the work of some dedicated developers, Python has one of the best machine learning platforms out there called scikit-learn.
8 | 8 |
9 | 9 | 00:00:14.060 --> 00:00:19.180
10 |    | -In this episode, Alexander Gramfort is here to tell us about Scikit-Learn and machine learning.
   | 10 | +In this episode, Alexander Gramfort is here to tell us about scikit-learn and machine learning.
11 | 11 |
12 | 12 | 00:00:19.180 --> 00:00:25.460
13 | 13 | This is Talk Python To Me, number 31, recorded Friday, September 25, 2015.
@@ -1048,10 +1048,10 @@ Like, what kind of problems would I bring that in for?
1048 | 1048 | I guess it's hard to summarize.
1049 | 1049 |
1050 | 1050 | 00:30:44.060 --> 00:31:02.060
1051 |      | -The hundreds of hundreds of pages that you have in Scikit-Learn in the documentation, I'm trying to give you a big picture without too much technical detail to tell you when these algorithms are useful and what they are useful for, and what are the hypotheses and what kind of output you can hope to get.
     | 1051 | +The hundreds of hundreds of pages that you have in scikit-learn in the documentation, I'm trying to give you a big picture without too much technical detail to tell you when these algorithms are useful and what they are useful for, and what are the hypotheses and what kind of output you can hope to get.
1052 | 1052 |
1053 | 1053 | 00:31:02.800 --> 00:31:05.760
1054 |      | -It's one of the strengths of the Scikit-Learn documentation, by the way.
     | 1054 | +It's one of the strengths of the scikit-learn documentation, by the way.
1055 | 1055 |
1056 | 1056 | 00:31:05.760 --> 00:31:22.280
1057 | 1057 | And so to answer your question, dimensionality reduction, I would say like the 101 way of doing it is the principal component analysis, where you're trying to extract subspace that captures the most variance in the data.
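The principal component analysis described above, extracting the subspace that captures the most variance, can be sketched with scikit-learn's `PCA` from `sklearn.decomposition`. The synthetic data, sample count, and parameters here are illustrative assumptions, not taken from the transcript:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 200 samples in 2-D, strongly correlated, so almost all variance
# lies along a single direction (illustrative toy data).
x = rng.normal(size=200)
X = np.column_stack([x, 2 * x + rng.normal(scale=0.1, size=200)])

pca = PCA(n_components=1)   # keep only the top principal component
Z = pca.fit_transform(X)    # project the data onto that subspace

print(Z.shape)                         # one coordinate per sample
print(pca.explained_variance_ratio_)   # fraction of variance captured
```

For this correlated toy data the single component captures nearly all of the variance, which is exactly the "subspace that captures the most variance" idea the transcript mentions.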
@@ -1171,7 +1171,7 @@ So you mentioned neural networks.
1171 | 1171 | Yes.
1172 | 1172 |
1173 | 1173 | 00:34:21.040 --> 00:34:23.380
1174 |      | -So Scikit-Learn has support for neural networks as well?
     | 1174 | +So scikit-learn has support for neural networks as well?
1175 | 1175 |
1176 | 1176 | 00:34:23.380 --> 00:34:29.560
1177 | 1177 | Well, you have a multilayer perception, which is like the basic neural network.
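The multilayer perceptron mentioned above can be sketched with scikit-learn's `MLPClassifier` from `sklearn.neural_network`. The tiny dataset and hyperparameters below are illustrative assumptions, not something the transcript specifies:

```python
from sklearn.neural_network import MLPClassifier

# Four toy points; the label is determined by the first feature.
X = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
y = [0, 0, 1, 1]

# One hidden layer of 8 units: the "basic neural network" shape.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
pred = clf.predict(X)
print(list(pred))
```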