What is Bias in Machine Learning?
Bias plays an important role in Machine Learning and in Artificial Neural Networks. The bias included in the network affects how the net input is calculated. It is included by adding a component X0 = 1 to the input vector X, so the input vector becomes –
X = (1, X1, X2, …, Xn) [where X0 = 1 is the bias input.]
How does it really work?
The bias is treated as just another weight, which we can write as W0 = b. So X1, X2, …, Xn are the inputs with weights W1, W2, …, Wn, and from the definition of bias we add one extra input X0 = 1 to the node. But what about its weight? The bias takes that place: the extra input X0 = 1 is paired with the weight W0 = b.
Now we calculate the net input of Y, which is pretty simple: Yin = b + X1·W1 + X2·W2 + … + Xn·Wn.
Once we have the net input of Y, we apply the Activation Function over it to calculate the output: Y = f(Yin).
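The two steps above can be sketched in a few lines of Python. This is a minimal illustration with made-up example values for the inputs, weights, and bias; the step function is just one possible choice of activation function.

```python
import numpy as np

def net_input(x, w, b):
    # Net input: Yin = b + X1*W1 + ... + Xn*Wn
    # (the bias b acts as the weight W0 on the extra input X0 = 1)
    return b + np.dot(x, w)

def step(y_in):
    # A simple binary step activation: Y = f(Yin)
    return 1.0 if y_in >= 0 else 0.0

x = np.array([0.5, -1.0, 2.0])   # example inputs X1..Xn
w = np.array([0.4, 0.3, 0.1])    # example weights W1..Wn
b = 0.35                         # bias b = W0

y_in = net_input(x, w, b)        # 0.35 + 0.2 - 0.3 + 0.2 = 0.45
y = step(y_in)                   # 1.0, since 0.45 >= 0
```

Note that without the bias term, the net input here would be 0.1; the bias of 0.35 pushes it up to 0.45.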
Linear Function in ANN
Here ‘x’ is the input, ‘m’ is the weight, ‘c’ is the bias, and ‘y’ is the output. Applying the net-input formula, the output is y = m*x + c, which is simply the equation of a straight line – the bias c shifts the line up or down.
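The straight-line view can be made concrete with a tiny sketch. The values of m and c below are arbitrary examples, chosen only to show that the bias c is exactly the output when the input x is zero.

```python
def linear(x, m=2.0, c=1.0):
    # y = m*x + c: the weight m scales the input,
    # the bias c shifts the whole line up or down
    return m * x + c

y_at_zero = linear(0.0)   # 1.0 -> with x = 0, the output equals the bias c
y_at_three = linear(3.0)  # 2.0 * 3.0 + 1.0 = 7.0
```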
Types of Bias
There are two types of bias:
- Positive Bias – It helps in increasing the net input of the network.
- Negative Bias – It helps in decreasing the net input of the network.
Why do we need Bias in Neural Network?
A bias value allows us to shift the activation function to the left or to the right, depending on the type of bias: a positive bias increases the net input, while a negative bias decreases it. This is why we need bias.
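This shifting effect is easy to see with a sigmoid activation, one common choice of activation function. In the sketch below (with a made-up weight of 1.0), a positive bias raises the net input, pushing the output above 0.5 at x = 0, while a negative bias lowers it:

```python
import math

def sigmoid(z):
    # Sigmoid activation, squashes any net input into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def neuron(x, w=1.0, b=0.0):
    # Positive b increases the net input (curve shifts left);
    # negative b decreases it (curve shifts right)
    return sigmoid(w * x + b)

no_bias  = neuron(0.0)        # 0.5 exactly
pos_bias = neuron(0.0, b=2.0) # above 0.5: positive bias raised the net input
neg_bias = neuron(0.0, b=-2.0)# below 0.5: negative bias lowered the net input
```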
Some Popular Courses on Neural Networks and Deep Learning
- Coursera – http://bit.ly/tec4tric-neural-1
- Coursera – http://bit.ly/tec4tric-neural-2
- Udemy – http://bit.ly/tec4tric-neural-3