P(y) = exp(y*z)/SUM[y'=0..1] exp(y'*z) = exp(y*z)/(exp(0*z)+exp(1*z)) = exp(y*z)/(1+exp(z))    (6.22)
P(y) = sigmoid((2*y-1)*z)    (6.23)
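The two forms are equivalent: for y = 1, exp(z)/(1+exp(z)) = sigmoid(z), and for y = 0, 1/(1+exp(z)) = sigmoid(-z), which is what (2*y-1)*z produces. A quick numerical check (a minimal sketch; the logit values are arbitrary):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def p(y, z):
    # Eq. 6.22: softmax over the two outcomes y in {0, 1}
    return math.exp(y * z) / (math.exp(0 * z) + math.exp(1 * z))

# Eq. 6.23 should match Eq. 6.22 for any logit z and both outcomes
for z in (-2.0, 0.0, 0.5, 3.0):
    for y in (0, 1):
        assert abs(p(y, z) - sigmoid((2 * y - 1) * z)) < 1e-12
```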
Hi guys! I'm a 30-year-old software engineer living in Japan.
Recently I've been studying machine learning and deep learning for my work and as an investment in myself.
I'm using the textbook by Raschka and the Udacity deep le...
Sure. In probability theory, the probabilities of all possible outcomes sum to 1. For example, a coin (fair or not) can only land on two sides, heads or tails. So in a neural network, if there are only two possible states, 0 and 1, their probabilities add up to 1.
So if prob(1) is, say, x, then prob(0) is 1-x.
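With a sigmoid output, this complement comes for free, since sigmoid(-z) = 1 - sigmoid(z). A minimal sketch (the logit value is arbitrary):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

z = 1.3           # arbitrary logit
p1 = sigmoid(z)   # prob(1) = x
p0 = sigmoid(-z)  # prob(0) = 1 - x, by the sigmoid symmetry

# The two outcome probabilities sum to 1
assert abs((p0 + p1) - 1.0) < 1e-12
```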
Hello, Stefan! Welcome to the world of AI. If you are serious about learning machine learning and deep learning, I recommend you read Michael Nielsen's ebook "Neural Networks and Deep Learning"...
Hi everyone, I am a Chinese student and not a native English speaker, so it may take you some time to understand my English.
I am a senior student at Jiangsu University, maj...