What is Overfitting and Underfitting in Machine Learning?


Overfitting in Machine Learning –

Overfitting happens when we apply a complex model to solve a simple problem.

Overfitting happens when the model performs well on the training data but does not generalize well to unseen data. This happens when the model fits the training data too closely and starts capturing the noise in the data.
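
To make that concrete, here is a minimal sketch using NumPy and scikit-learn (the sine-shaped data and the degree-10 polynomial are made-up illustrations, not taken from any particular dataset):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)

# Underlying truth is a simple sine curve; the data adds a little noise.
X_train = rng.uniform(0, 1, size=(12, 1))
y_train = np.sin(2 * np.pi * X_train).ravel() + rng.normal(scale=0.2, size=12)
X_test = rng.uniform(0, 1, size=(200, 1))
y_test = np.sin(2 * np.pi * X_test).ravel() + rng.normal(scale=0.2, size=200)

# A degree-10 polynomial has enough parameters to chase the noise in 12 points.
flexible = make_pipeline(PolynomialFeatures(degree=10), LinearRegression())
flexible.fit(X_train, y_train)

print("train MSE:", mean_squared_error(y_train, flexible.predict(X_train)))
print("test  MSE:", mean_squared_error(y_test, flexible.predict(X_test)))
# The training error is typically near zero while the test error is much larger;
# that gap is the signature of overfitting.
```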

How to overcome overfitting?

1. Simplify the model by selecting one with fewer parameters (e.g. a linear model rather than a high-degree polynomial model).

2. Regularize the model, i.e. put constraints on it (see the sketch after this list).

3. Gather more training data.

4. Remove useless features that do not improve the model.

5. Reduce the noise in the training data (e.g. fix data errors and remove outliers).
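
Here is a minimal sketch of points 1 and 2, again assuming NumPy and scikit-learn (the polynomial degrees and the Ridge alpha value are illustrative choices, not the only right ones). It compares an unconstrained high-degree polynomial with a simpler model and with a regularized one on the same kind of small, noisy data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)
X_train = rng.uniform(0, 1, size=(12, 1))
y_train = np.sin(2 * np.pi * X_train).ravel() + rng.normal(scale=0.2, size=12)
X_test = rng.uniform(0, 1, size=(200, 1))
y_test = np.sin(2 * np.pi * X_test).ravel() + rng.normal(scale=0.2, size=200)

models = {
    "degree-10, unconstrained": make_pipeline(
        PolynomialFeatures(degree=10), StandardScaler(), LinearRegression()),
    "degree-3, simpler model ": make_pipeline(
        PolynomialFeatures(degree=3), StandardScaler(), LinearRegression()),
    "degree-10 + Ridge(0.1)  ": make_pipeline(
        PolynomialFeatures(degree=10), StandardScaler(), Ridge(alpha=0.1)),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "| test MSE:",
          round(mean_squared_error(y_test, model.predict(X_test)), 3))
# The simpler model and the regularized model usually generalize noticeably
# better than the unconstrained degree-10 fit on this small noisy sample.
```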

Underfitting in Machine Learning –

Underfitting happens when we use a simple model to solve a complex problem.

Underfitting is the opposite of overfitting. It occurs when our model is too simple to learn the underlying structure of the data.
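
The symptom looks like this in a minimal sketch (assuming NumPy and scikit-learn; the quadratic target is a made-up illustration). A straight line fitted to curved data has high error on the training set and on unseen data alike:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(1)
X = rng.uniform(-3, 3, size=(300, 1))
y = X.ravel() ** 2 + rng.normal(scale=0.3, size=300)   # a clearly curved target

X_train, y_train = X[:200], y[:200]
X_test, y_test = X[200:], y[200:]

line = LinearRegression().fit(X_train, y_train)
print("train MSE:", mean_squared_error(y_train, line.predict(X_train)))
print("test  MSE:", mean_squared_error(y_test, line.predict(X_test)))
# Both errors are large and roughly similar: the model is too simple for the
# data, which is underfitting rather than a noise problem.
```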

How to overcome underfitting?

1. Select a more powerful model with more parameters (see the sketch after this list).

2. Feed better features to the learning algorithm (feature engineering).

3. Reduce the constraints on the model, i.e. lower the regularization.

4. Note: adding more training examples will not help, because the problem is the model's capacity, not the amount of data.
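
Here is a minimal sketch of points 1 and 2 from the list above, using the same made-up quadratic data: giving the model richer (polynomial) features, rather than more rows, is what brings the error down:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(1)
X = rng.uniform(-3, 3, size=(300, 1))
y = X.ravel() ** 2 + rng.normal(scale=0.3, size=300)

X_train, y_train = X[:200], y[:200]
X_test, y_test = X[200:], y[200:]

# Same data, but the model now gets squared features to work with.
richer = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
richer.fit(X_train, y_train)

print("test MSE with degree-2 features:",
      round(mean_squared_error(y_test, richer.predict(X_test)), 3))
# The error drops sharply compared with the straight-line fit shown earlier,
# even though the number of training examples has not changed.
```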

Related Posts –

  1. What is the Bias/Variance Trade-off in Machine Learning?
