## Input Layer

The input layer contains the inputs we want to feed into our algorithm, also known as a *model*. The algorithm performs some calculations on these inputs and then spits out an answer.

The input layer is usually represented as a *vector* of numbers. For our image example, how do we turn an image into a vector of numbers? Since the image is just a *matrix* of *RGB* values, we can *unroll* or *flatten* that matrix into a vector of numbers to be used in our algorithm.
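As a quick sketch of what flattening looks like in practice, here is a minimal NumPy example using a made-up 4×4 RGB image (the size and values are just for illustration):

```python
import numpy as np

# A hypothetical 4x4 RGB image: a 4x4x3 matrix of pixel values (0-255).
image = np.random.randint(0, 256, size=(4, 4, 3))

# Unroll (flatten) the matrix into a single input vector.
x = image.flatten()

print(x.shape)  # (48,) -- one entry per pixel value: 4 * 4 * 3
```

A real image would be much larger (a 64×64 RGB image flattens to a vector of 12,288 numbers), but the operation is the same.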

The resources below will go over *vectors* and *matrices*, the first major mathematical concepts you should know to understand neural networks.

**Resources**

## Hidden Layer

The hidden layer(s) are where much of the action happens in a neural network. In Figure 1, notice that the layer is split into two portions: the weighted sum calculation and the activation calculation. I won’t go into too much detail about *why* these calculations are done (deep learning courses dive into this topic), but I will outline the mathematical concepts you should practice to understand how they work.

**Weighted Sum**

The weighted sum, also known as a *dot product*, is used to compute the value labeled *z* in Figure 1. It uses three variables: *w*, *x*, and *b*. *w* is a matrix of numbers representing weights, which are initialized before computation. Even though it looks like *w* and *x* are being multiplied, a dot product is actually happening.
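To make the "it looks like multiplication but is really a dot product" point concrete, here is a minimal sketch with made-up weights and inputs:

```python
import numpy as np

w = np.array([0.2, -0.5, 0.1])  # example weights (illustrative values)
x = np.array([1.0, 2.0, 3.0])   # example inputs

# A dot product multiplies elementwise, then sums the results:
# 0.2*1.0 + (-0.5)*2.0 + 0.1*3.0 = -0.5
z = np.dot(w, x)
print(z)  # approximately -0.5
```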

*w* has a superscript *T*, which is not an exponent; it denotes the *transpose* of *w*. *b* is a *bias*, which is just a number that is initialized before computation. *x* is the input vector from the previous layer. You don’t have to worry about how *w* and *b* are initialized, since that will be outlined later in this article. The resources below will go over the dot product, matrix transpose, and matrix multiplication.
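Putting the pieces together, the full calculation from Figure 1 can be sketched in NumPy; the weights, input, and bias below are invented for illustration:

```python
import numpy as np

# w as a column vector (n x 1), so w.T matches the "w transpose" notation.
w = np.array([[0.2], [-0.5], [0.1]])  # weights (illustrative values)
x = np.array([[1.0], [2.0], [3.0]])   # input vector from the previous layer
b = 0.5                               # bias

# w.T has shape (1, 3); w.T @ x performs the dot product, then b is added.
z = w.T @ x + b
print(z.shape)  # (1, 1) -- a single z value
```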

**Resources**

**Activation**

The activation calculation is used to generate the *a* value in Figure 1. It applies a *sigmoid* function to the *z* value that was calculated previously. Not all neural networks use a sigmoid function; it is normally used as a starter for simple neural networks. The sigmoid function is also used as part of a *logistic regression* model. The resources below will go over the sigmoid function and logistic regression.
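The sigmoid itself is a one-line formula, 1 / (1 + e^(-z)); here is a minimal sketch of it, with a few sample inputs to show how it squashes any *z* into the range (0, 1):

```python
import numpy as np

def sigmoid(z):
    """Squash z into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))   # 0.5 -- exactly halfway
print(sigmoid(5.0))   # close to 1
print(sigmoid(-5.0))  # close to 0
```

This squashing behavior is why the output *a* can be read as a probability-like score, which is the same role the sigmoid plays in logistic regression.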

**Resources**