P(y) = exp(y*z) / SUM[y'=0..1] exp(y'*z) = exp(y*z)/(exp(0*z)+exp(1*z)) = exp(y*z)/(1+exp(z)) 6.22
P(y) = sigmoid((2*y-1)*z) 6.23
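A quick numerical check of the identity: the two-class softmax in 6.22 (logits 0*z and 1*z) should give the same probabilities as the sigmoid form in 6.23 for any z. A minimal sketch (function names are my own):

```python
import math

def softmax_binary(y, z):
    # Eq. 6.22: two-class softmax over the logits 0*z and 1*z
    return math.exp(y * z) / (math.exp(0 * z) + math.exp(1 * z))

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Eq. 6.23: P(y) = sigmoid((2*y - 1) * z), i.e. sigmoid(z) for y=1
# and sigmoid(-z) for y=0
for z in (-2.0, 0.0, 0.5, 3.0):
    for y in (0, 1):
        assert abs(softmax_binary(y, z) - sigmoid((2 * y - 1) * z)) < 1e-12
```

The trick is that multiplying the exponent by (2*y-1) maps y=0 to -z and y=1 to +z, which is exactly what dividing through by exp(z) does to the softmax.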
Yes, sure. In probability theory, the probabilities of all possible outcomes sum to 1. For example, a coin (fair or not) can only land on two sides, heads or tails. So in a neural network, if there are only two possible states, 0 and 1, their probabilities add up to 1.
So if P(1) is some value x, then P(0) is 1-x.
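This complement rule falls straight out of the sigmoid form, since sigmoid(-z) = 1 - sigmoid(z). A small illustration (the value of z is arbitrary):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

z = 1.7            # an arbitrary logit
p1 = sigmoid(z)    # P(y=1), from Eq. 6.23 with y=1
p0 = sigmoid(-z)   # P(y=0), from Eq. 6.23 with y=0

# The two probabilities are complements and sum to 1
assert abs(p0 - (1.0 - p1)) < 1e-12
assert abs(p0 + p1 - 1.0) < 1e-12
```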
Hi guys! I'm a 30-year-old software engineer living in Japan.
Recently I've been studying machine learning and deep learning, both for my product and as an investment in myself.
I'm using the textbook by Raschka and Udacity deep le...