Chapter 5: Machine Learning Basics
If you're reading the Deep Learning Book, feel free to ask questions, discuss the material, or share resources.
Hi, I have a question about a sentence on page 102, in section 5.1.3. Regarding formula 5.1, the book says: this decomposition means that we can solve the ostensibly unsupervised problem of mo...
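(For anyone reading along without the book handy: if I remember the numbering correctly, equation 5.1 is the chain-rule decomposition of the joint distribution,

p(\mathbf{x}) = \prod_{i=1}^{n} p(x_i \mid x_1, \ldots, x_{i-1}),

which is why the ostensibly unsupervised problem of modeling p(x) can be split into n supervised prediction problems, one per variable given its predecessors.)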
I've always wondered about deep learning's history, and had read in several articles that neural networks and back-propagation used to be popular in the 1980s. I didn't know that there was another deep learning phase in the 1960s!
I am confused by equation 3.28 in section 3.9.5, which is used to define an empirical distribution. According to the text, when dealing with discrete variables, p(x) is simply the frequency of that val...
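(For context, and assuming I'm recalling the numbering right: equation 3.28 defines the empirical distribution over m training points as

\hat{p}(\mathbf{x}) = \frac{1}{m} \sum_{i=1}^{m} \delta(\mathbf{x} - \mathbf{x}^{(i)}),

where δ is the Dirac delta. For discrete variables the delta is replaced by a probability mass, so each value's probability is just its empirical frequency in the training set.)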
Consider what happens when all the x_i are equal to some constant c. Analytically, we can see that all the outputs should be...
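(This is from the softmax discussion in the numerical computation chapter, if I remember right: analytically every output should be 1/n, but a naive implementation can overflow or underflow for large |c|. A minimal sketch of the standard max-subtraction fix, assuming NumPy:)

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax: subtracting max(x) leaves the
    result unchanged analytically but prevents overflow in exp."""
    z = x - np.max(x)          # largest exponent is now exp(0) = 1
    e = np.exp(z)
    return e / e.sum()

# All inputs equal to a constant c: every output should be 1/n.
c = 1000.0                     # large enough to overflow a naive exp(x)
x = np.full(4, c)
print(softmax(x))              # [0.25 0.25 0.25 0.25]
```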
Question on formula 2.68
Hey, I just started with the book and I have a question on formula 2.68. Here the constraint
D^T D = I_l
is stated. But this would seem to mean that D has to be an orthogonal matrix with n = l, and that would miss the whole point of the compression. At the beginning of section 2.12 it is explained that l < n and that the columns of D are orthogonal to each other and have unit norm.
Since D has shape (n, l) and its transpose has shape (l, n), the product D^T D has shape (l, l). So the constraint D^T D = I_l only says that the columns of D are orthonormal; it does not force D to be square, and it does not make D an orthogonal matrix in the n = l sense. That orthonormality constraint is exactly what lets the matrix multiplication simplify in the derivation.
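(A quick numerical sanity check, just a sketch assuming NumPy: build a tall matrix D with orthonormal columns via QR and verify that D^T D = I_l holds while D D^T is not I_n.)

```python
import numpy as np

n, l = 5, 2                       # l < n: D is tall, not square
rng = np.random.default_rng(0)

# Reduced QR factorization gives a (n, l) matrix with orthonormal columns.
D, _ = np.linalg.qr(rng.standard_normal((n, l)))

print(np.allclose(D.T @ D, np.eye(l)))   # True:  D^T D = I_l
print(np.allclose(D @ D.T, np.eye(n)))   # False: D D^T is a rank-l projector, not I_n
```

So the constraint is perfectly compatible with l < n; it is D D^T (the projection onto the column space of D) that loses information, which is the compression.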