P(y) = exp(y*z) / SUM[y'=0..1] exp(y'*z) = exp(y*z)/(exp(0*z)+exp(1*z)) = exp(y*z)/(1+exp(z))    (6.22)

P(y) = Sigmoid((2*y-1)*z)    (6.23)

where Sigmoid(x) = 1/(1 + exp(-x)) is the logistic sigmoid.
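A quick numeric check of the two equations above: the softmax over the two states in (6.22) and the sigmoid form in (6.23) define the same distribution. This is a minimal sketch (function names are my own, not from the book):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def p_softmax(y, z):
    # Eq. 6.22: normalize exp(y*z) over the two states y' in {0, 1}
    return math.exp(y * z) / (math.exp(0 * z) + math.exp(1 * z))

def p_sigmoid(y, z):
    # Eq. 6.23: the same probability written with the sigmoid
    return sigmoid((2 * y - 1) * z)

# The two forms agree for both states and any logit z
for z in (-2.0, 0.0, 0.5, 3.0):
    for y in (0, 1):
        assert abs(p_softmax(y, z) - p_sigmoid(y, z)) < 1e-12
```

The equivalence follows because for y = 1, (2y-1)z = z and Sigmoid(z) = exp(z)/(1+exp(z)), while for y = 0, (2y-1)z = -z and Sigmoid(-z) = 1/(1+exp(z)).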

Deep Learning Book


A supporting forum / community for the book by Ian Goodfellow, Yoshua Bengio and Aaron Courville available at deeplearningbook.org



Yes, sure. In probability theory, the probabilities of all possible outcomes sum to 1. For example, a coin (fair or not) can only land on two sides, heads or tails. So in a neural network, if there are only two possible states, 0 and 1, their probabilities add up to 1.

So if P(y = 1) is some value X, then P(y = 0) is 1 - X.
