Deep Learning Book

3619 members.

A supporting forum / community for the book by Ian Goodfellow, Yoshua Bengio and Aaron Courville available at deeplearningbook.org


comment in this discussion

Is formula 5.4 correct? I expected y to be a scalar, since we sum the error over m examples. Please let me know if I am missing something:

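For context, equation 5.4 in the book defines the test-set mean squared error: y is the vector of m targets, and the MSE itself is the scalar obtained after summing and averaging. A minimal sketch with made-up numbers (the arrays below are illustrative, not from the book):

```python
import numpy as np

# Hypothetical data: m = 4 test examples.
y_test = np.array([1.0, 2.0, 3.0, 4.0])   # true targets -- a vector of length m
y_hat = np.array([1.1, 1.9, 3.2, 3.9])    # model predictions -- also a vector

# MSE_test = (1/m) * sum_i (y_hat_i - y_test_i)^2  -- a single scalar
m = len(y_test)
mse = np.sum((y_hat - y_test) ** 2) / m
print(mse)
```

So y (and y-hat) are vectors; it is the resulting error MSE_test that is a scalar.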

comment in this discussion

Hello everyone!

I am Abdulrehman, working as a graduate researcher at the University of Engineering and Technology, Lahore, Pakistan. I have bee...


comment in this discussion

P(y) = exp(y*z) / SUM[y'=0..1] exp(y'*z) = exp(y*z)/(exp(0*z)+exp(1*z)) = exp(y*z)/(1+exp(z))    (6.22)

P(y) = sigmoid((2*y-1)*z)    (6.23)

where

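The equivalence of equations 6.22 and 6.23 is easy to check numerically: for y = 1 the normalized form gives exp(z)/(1+exp(z)) = sigmoid(z), and for y = 0 it gives 1/(1+exp(z)) = sigmoid(-z). A quick sketch (the function names are mine, not the book's):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def p_via_normalization(y, z):
    # Eq. 6.22: exp(y*z) normalized over y' in {0, 1}
    return np.exp(y * z) / (np.exp(0 * z) + np.exp(1 * z))

def p_via_sigmoid(y, z):
    # Eq. 6.23: sigmoid((2y - 1) * z)
    return sigmoid((2 * y - 1) * z)

# The two forms agree for both labels across a range of logits z.
for z in [-3.0, -0.5, 0.0, 1.2, 4.0]:
    for y in (0, 1):
        assert np.isclose(p_via_normalization(y, z), p_via_sigmoid(y, z))
```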

comment in this discussion

Hi guys! I'm a 30-year-old software engineer living in Japan.

Recently I've been studying machine learning and deep learning for my product and as an investment in myself.

I'm using the textbook by Raschka and the Udacity deep le...


comment in this discussion

Yes, sure. In probability theory, the probabilities of all possible outcomes sum to 1. For example, a coin (fair or not) can only land on two sides, heads and tails. So in a neural network, if there are only two possible states, 0 and 1, their probabilities of occurring add up to 1.

So if prob(1) is, say, X, then prob(0) would be 1 - X.

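Concretely, a sigmoid output unit bakes this constraint in: the network produces a single probability P(y=1) = sigmoid(z), and P(y=0) is defined as its complement, so the two always sum to 1. A tiny sketch (the logit value 0.7 is arbitrary):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

z = 0.7                 # arbitrary logit produced by the network
p1 = sigmoid(z)         # P(y = 1)
p0 = 1.0 - p1           # P(y = 0), the complement, so the pair sums to 1
print(p0, p1)
```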

comment in this discussion

士坤 郭

Hi everyone, I am a Chinese student and not a native English speaker, so it may take you some time to understand my English.

I am a senior student at Jiangsu University, maj...
