What are Underfitting and Overfitting in Machine Learning?
Underfitting and Overfitting are very common problems in Machine Learning (ML), and many beginners getting into ML run into them. The good news is that they are fairly easy to solve, but first you need to understand them. In this post, I will discuss what they are and how they can affect your model.
What is Machine Learning?
I have already covered the basics of Machine Learning. If you are new to the topic, read this article first – Machine Learning Introduction, Step by Step Guide – because machines are learning, and now it’s your turn.
Both are Not Good!
Neither Underfitting nor Overfitting is good for a Machine Learning model. To build a good model, it should neither overfit nor underfit the data; we will talk about how to achieve that later.
What is Overfitting?
Overfitting happens with the training data. If we train our model for too long on the training data, it will start to overfit: it becomes so good at finding patterns in the training data that it fails to detect patterns in the test data. In other words, it learns patterns in the training data that don’t actually generalize to the test data. Learning how to overcome Overfitting is very important. One common way to prevent it is early stopping, which we will discuss in detail later.
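Here is a minimal sketch of what overfitting looks like in practice, assuming scikit-learn is installed. The dataset, model, and parameters are just illustrative choices: an unconstrained decision tree is flexible enough to memorize a small training set, so its training accuracy is near perfect while its test accuracy lags behind.

```python
# Hypothetical example: an unconstrained decision tree memorizes the
# training data (overfitting), so train and test accuracy diverge.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

tree = DecisionTreeClassifier(random_state=42)  # no depth limit -> can memorize
tree.fit(X_train, y_train)

print("Train accuracy:", tree.score(X_train, y_train))  # typically close to 1.0
print("Test accuracy: ", tree.score(X_test, y_test))    # noticeably lower
```

A large gap between the two scores is the classic symptom of overfitting.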
What is Underfitting?
Underfitting is the opposite of Overfitting. It occurs when the model has not learned enough from the training data, so there is still room for improvement on the test data. To prevent underfitting, it helps to train the model a little longer or to use more complete training data. We will discuss these prevention techniques in later blog posts.
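For contrast, here is a minimal sketch of underfitting, again assuming scikit-learn and NumPy (the data and model are made up for illustration). A straight line fitted to clearly non-linear data is too simple, so it scores poorly on both the training and the test set.

```python
# Hypothetical example: a linear model on non-linear data underfits,
# so both train and test scores are low.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=300)  # non-linear target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

line = LinearRegression().fit(X_train, y_train)
print("Train R^2:", line.score(X_train, y_train))  # low -> underfitting
print("Test R^2: ", line.score(X_test, y_test))    # low as well
```

When both scores are poor, the model is not failing to generalize; it simply has not learned enough in the first place.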

Keep in Mind
You have to find a balance so that the model neither Overfits nor Underfits. In practice, that means knowing how to train a neural network model for an appropriate number of epochs.
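One simple way to strike that balance, sketched below under the assumption that scikit-learn is available, is to let the training loop stop itself: MLPClassifier can hold out a slice of the training data and stop when the validation score stops improving, rather than running for a fixed, arbitrary number of epochs.

```python
# Hypothetical sketch: early stopping on a validation split picks an
# appropriate number of epochs automatically.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

mlp = MLPClassifier(hidden_layer_sizes=(64,),
                    max_iter=500,            # upper bound on epochs
                    early_stopping=True,     # monitor a held-out validation split
                    validation_fraction=0.1,
                    n_iter_no_change=10,     # patience before stopping
                    random_state=1)
mlp.fit(X_train, y_train)

print("Stopped after", mlp.n_iter_, "epochs")
print("Test accuracy:", mlp.score(X_test, y_test))
```

The specific network size and patience values here are placeholders; the point is that the stopping criterion, not a hand-picked epoch count, decides when training ends.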
Next
In the next blog post, we will discuss how to prevent Underfitting and Overfitting in Machine Learning, so don’t forget to check it out. Subscribe to the newsletter and you will be notified whenever the post is released.