This intermediate-level course introduces the mathematical foundations needed to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We'll cover basic statistics of data sets, such as mean values and variances; compute distances and angles between vectors using inner products; and derive orthogonal projections of data onto lower-dimensional subspaces. Using these tools, we'll then derive PCA as a method that minimizes the average squared reconstruction error between data points and their reconstructions.
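The reconstruction-error view of PCA described above can be sketched in a few lines of NumPy. This is an illustrative sketch only (the function name and data are hypothetical, not from the course materials): center the data, project onto the top-k eigenvectors of the covariance matrix, reconstruct, and measure the average squared error.

```python
import numpy as np

def pca_reconstruct(X, k):
    """Project X onto its top-k principal subspace and reconstruct it.

    Returns the reconstruction and the average squared reconstruction error.
    """
    # Center the data at the sample mean
    mu = X.mean(axis=0)
    Xc = X - mu
    # Eigendecomposition of the covariance matrix (eigh: ascending eigenvalues)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Basis of the top-k principal components (columns with largest eigenvalues)
    B = eigvecs[:, -k:]
    # Orthogonal projection onto span(B), then shift back by the mean
    X_rec = (Xc @ B) @ B.T + mu
    # Average squared reconstruction error over the data points
    err = np.mean(np.sum((X - X_rec) ** 2, axis=1))
    return X_rec, err
```

As expected from the derivation, the error shrinks as k grows and vanishes when k equals the data dimension, since the projection then becomes the identity.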
- 5 stars: 51.01%
- 4 stars: 22.59%
- 3 stars: 12.83%
- 2 stars: 6.69%
- 1 star: 6.86%
Popular reviews from MATHEMATICS FOR MACHINE LEARNING: PCA
It is a bit difficult and jumpy. You will need to work hard to fill in the gaps in knowledge that are not made explicit in the lectures. Overall, a great experience.
The course content is interesting and well planned. It could be improved by making it simpler for students, as it was more technical than the other two courses of the Specialization.
The course is generally good, but the assignments definitely need to be fixed. Thanks anyway for this course, which covers an important element of machine learning.
This is one hell of an inspiring course that demystified the difficult concepts and math behind PCA. Excellent instructors who impart this knowledge with easy-to-understand illustrations.
What level of programming is required to do this course?
How difficult is this course in comparison to the other two of this specialization?