# Neurons

The foundational unit of a neural network is the *neuron*, which is conceptually quite simple.

Each neuron has a set of inputs, each of which is assigned a specific weight. The neuron computes some function of these weighted inputs. A linear neuron simply outputs the weighted sum of its inputs. A sigmoidal neuron feeds the weighted sum of its inputs into the logistic function, which produces a value between 0 and 1.
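As a minimal sketch of the idea (the function and parameter names here are illustrative, not from any particular library), a sigmoidal neuron can be written as:

```python
import math

def sigmoid_neuron(inputs, weights, bias=0.0):
    # Weighted sum of the inputs, plus an optional bias term.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Logistic function: squashes z into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

# A weighted sum of exactly 0 maps to the midpoint of the logistic curve:
sigmoid_neuron([1.0, 2.0], [0.5, -0.25])  # → 0.5
```

A linear neuron would be the same computation with the final line returning `z` directly.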

When the weighted sum is very negative, the return value is very close to 0. When the weighted sum is very large and positive, the return value is very close to 1. The logistic function is important because it introduces a *non-linearity*, which enables the neural network to learn more complex models. In the absence of these non-linear functions (called *activation functions*), the entire neural network would collapse into a single linear function, and stacking layers would add no expressive power.
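The collapse of stacked linear layers can be checked directly. The sketch below (using NumPy, with arbitrary made-up dimensions) shows that two linear layers applied in sequence are equivalent to one linear layer whose weight matrix is the product of the two:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)          # input vector
W1 = rng.normal(size=(4, 3))    # first linear layer
W2 = rng.normal(size=(2, 4))    # second linear layer

# Two stacked linear layers with no activation function in between...
two_layers = W2 @ (W1 @ x)

# ...compute the same thing as a single linear layer with weights W2 @ W1.
one_layer = (W2 @ W1) @ x

assert np.allclose(two_layers, one_layer)
```

Inserting a non-linearity such as the logistic function between the two layers breaks this equivalence, which is exactly what lets depth add modeling power.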