
Deep Learning Book

3394 members.

A supporting forum / community for the book by Ian Goodfellow, Yoshua Bengio and Aaron Courville available at deeplearningbook.org


Pius Braun · 22w

P(y) = exp(y*z)/SUM[y'=0..1] exp(y'*z) = exp(y*z)/(exp(0*z)+exp(1*z)) = exp(y*z)/(1+exp(z)) 6.22

P(y) = Sigmoid((2*y-1)*z) 6.23

where

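A quick numerical check confirms that the normalized form in 6.22 agrees with the sigmoid form in 6.23 for both states. This is only a sketch; the value of z below is an arbitrary scalar pre-activation chosen for the demo:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

z = 1.7  # arbitrary pre-activation value, chosen for illustration
for y in (0, 1):
    # eq. 6.22: exp(y*z) normalized over the two states y' = 0 and y' = 1
    softmax_form = math.exp(y * z) / (math.exp(0 * z) + math.exp(1 * z))
    # eq. 6.23: the same probability written with the sigmoid
    sigmoid_form = sigmoid((2 * y - 1) * z)
    assert abs(softmax_form - sigmoid_form) < 1e-12
```

For y = 1 the fraction reduces to exp(z)/(1+exp(z)) = sigmoid(z), and for y = 0 to 1/(1+exp(z)) = sigmoid(-z), which is why the single expression 6.23 covers both cases.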


Hello everyone!

I am Abdulrehman, working as a graduate researcher at the University of Engineering and Technology, Lahore, Pakistan. I have bee...


Yes, sure. In probability theory, the probabilities of all possible outcomes sum to 1. For example, a coin (fair or not) can only land on two sides, heads or tails. So in a neural network, if there are only two possible states, 0 and 1, their probabilities add up to 1.

So if prob(1) is, say, X, then prob(0) would be 1-X.
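This is why a single sigmoid output suffices for a two-state variable: one number gives prob(1), and prob(0) follows by normalization. A minimal sketch, where z is a hypothetical logit value:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

z = 0.8            # hypothetical logit for state 1
p1 = sigmoid(z)    # prob(1) = X
p0 = 1.0 - p1      # prob(0) = 1 - X, by normalization
assert abs((p0 + p1) - 1.0) < 1e-12
assert abs(p0 - sigmoid(-z)) < 1e-12  # equivalently, prob(0) = sigmoid(-z)
```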


Hi guys! I'm a 30-year-old software engineer living in Japan.

Recently, I've been studying machine learning and deep learning for my product and as an investment in myself.

I'm using the textbook by Raschka and the Udacity deep le...
