What is Activation Function in Neural Networks?
Activation functions are one of the most important concepts to understand in Neural Networks. It also helps to learn what an Artificial Neural Network is first, so the whole idea is easier to follow; you can find the tutorial for the Artificial Neural Network here.
What is an Activation Function in Neural Networks?
We apply the activation function to the calculated net input. The activation function acts as a mathematical “gate” between a layer's input and its output, and the result of applying it is the output.
What is Neural Network?
A Neural Network typically consists of three layers. If it has many hidden layers, it is considered a Deep Neural Network, which is the basis of Deep Learning.
- Input Layer – This layer takes the input.
- Hidden Layer – This layer processes the given data; all the computation happens here.
- Output Layer – This layer produces the output.
For more information about Neural Network, please Click Here to read the blog post.
What is Pre-Activation function?
Pre-activation takes the weighted sum of all the inputs from the input layer and produces a net input. That net input is then passed to a function, the Activation function, which decides whether the neuron will fire or not. This decision depends entirely on the threshold value.
What is Threshold Value?
The threshold value is a fixed value against which the calculated net input is compared to obtain the final output of the network. We can define an activation function using a threshold value like this:
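A minimal sketch of such a threshold-based activation in Python (the function name and the default threshold value are illustrative, not from the original):

```python
def threshold_activation(x, theta=0.0):
    """Fire (output 1) only when the net input x reaches the threshold theta."""
    return 1 if x >= theta else 0

print(threshold_activation(0.7))   # net input above the threshold: the neuron fires -> 1
print(threshold_activation(-0.2))  # net input below the threshold: no firing -> 0
```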
Here, x is the calculated net input. We pass it through an activation function, and with the help of the threshold value, the activation function decides whether the neuron fires.
For example, suppose I give you a problem and ask you to solve it and tell me the answer, if and only if your answer is greater than or equal to 5. We are adding a criterion to the submission of the answer: only if the result meets that criterion do you produce an output. So, only if your result is greater than or equal to 5 do you tell me the answer. I hope this gives you the idea of the threshold value.
Why do we need Activation Functions?
- We use the Activation Function to achieve non-linearity, so the network can model relationships that a straight line cannot fit.
- A nonlinear function is needed to obtain the advantages of a multilayer network. When input passes through a multilayer network with a linear activation function, the output remains the same as what a single-layer network could produce.
- If we do not apply an activation function, the output is a linear function of the input, which is just a simple Linear Regression model.
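The collapse of stacked linear layers into a single layer can be checked numerically. A small sketch (the weight shapes are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)

# Two "layers" with a linear (identity) activation between them...
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
two_layer = W2 @ (W1 @ x)

# ...are equivalent to one layer whose weight matrix is W2 @ W1.
one_layer = (W2 @ W1) @ x

print(np.allclose(two_layer, one_layer))  # True
```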
This is why the Activation function is so important in Neural Networks.
Types of Activation functions
Broadly, activation functions come in two types:
- Linear Function
- Non-Linear Function
Linear Function
The output remains the same as the input. Most importantly, it is the only Linear Activation Function.
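As a sketch, the linear (identity) activation simply returns its input unchanged (the function name is illustrative):

```python
def linear_activation(x):
    """Identity function: the output equals the net input."""
    return x

print(linear_activation(3.5))  # 3.5
```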
Binary Step Function
Here Θ is the threshold value. This Activation Function converts the input to a binary output (0 or 1).
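A vectorized sketch of the binary step function, assuming the convention that an input exactly at the threshold fires:

```python
import numpy as np

def binary_step(x, theta=0.0):
    """Map net input to binary output: 1 where x >= theta, else 0."""
    return np.where(x >= theta, 1, 0)

print(binary_step(np.array([-1.5, 0.0, 2.3])))  # [0 1 1]
```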
Bipolar Step Function
Here Θ is the threshold value. This Activation Function converts the input to a bipolar output (-1 or +1).
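The bipolar step differs only in the "off" value: -1 instead of 0. A sketch under the same threshold convention as above:

```python
import numpy as np

def bipolar_step(x, theta=0.0):
    """Map net input to bipolar output: +1 where x >= theta, else -1."""
    return np.where(x >= theta, 1, -1)

print(bipolar_step(np.array([-1.5, 0.0, 2.3])))  # [-1  1  1]
```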
Sigmoidal Activation Function
Sigmoidal functions are widely used in back-propagation. There are two types –
- Binary sigmoidal function (Range is between 0 and 1)
- Bipolar sigmoidal function (Range is between -1 and 1)
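Both variants can be sketched in a few lines (function names are illustrative; the bipolar form is just the binary sigmoid rescaled to the range (-1, 1)):

```python
import math

def binary_sigmoid(x):
    """Logistic sigmoid: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x):
    """Rescaled sigmoid: squashes any real input into (-1, 1)."""
    return 2.0 * binary_sigmoid(x) - 1.0

print(binary_sigmoid(0.0))   # 0.5, the midpoint of (0, 1)
print(bipolar_sigmoid(0.0))  # 0.0, the midpoint of (-1, 1)
```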
Hyperbolic Tangent Activation Function – Tanh
The Hyperbolic Tangent (Tanh) Activation Function is closely related to the Bipolar Sigmoidal Function, and in most cases it is better to use Tanh. In fact, tanh(x) is exactly the bipolar sigmoid evaluated at 2x, so one can be implemented from the other.
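The relationship between Tanh and the bipolar sigmoid can be verified numerically; a small sketch:

```python
import math

def bipolar_sigmoid(x):
    """Sigmoid rescaled into (-1, 1)."""
    return 2.0 / (1.0 + math.exp(-x)) - 1.0

# tanh(x) equals the bipolar sigmoid evaluated at 2x.
x = 0.8
print(math.tanh(x))
print(bipolar_sigmoid(2 * x))  # same value
```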
ReLU (Rectified Linear Unit) Activation Function
ReLU is the most commonly used activation function in neural networks, especially in CNNs.
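ReLU can be sketched as a one-line clamp (vectorized here with NumPy for illustration):

```python
import numpy as np

def relu(x):
    """Pass positive inputs through unchanged; clamp negatives to zero."""
    return np.maximum(0, x)

print(relu(np.array([-2.0, 0.0, 3.5])))  # negatives become 0, positives pass through
```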
Learn the Activation Function on YouTube
That’s all I have for now. I hope you’ve learned the basics of Activation Functions and why it is so important to use them in Neural Networks.