Overfitting and underfitting in machine learning

By aaron

In #MachineLearning, overfitting is a phenomenon in which a model fits its training data so closely, noise and all, that it performs poorly when given real world data.

Training data is useful to help the algorithm start to learn, but fitting it too closely can cause the model to learn only how to predict the training examples, leaving it unable to generalize to real world predictions.

I'm not sure if this is a good analogy, but it sounds a little like the machine learning equivalent of "teaching to the test". If you study for a test by only reading practice questions, all you do is memorize the answers to those questions without fully understanding the underlying concepts. That's what overfitting is. The machine learning model becomes so tightly bound to the training data that it doesn't know what to do when it gets something else.
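To make the memorization idea concrete, here's a toy sketch of my own (made-up data, not from any real project) using NumPy polynomials. A degree-9 polynomial can pass through all 10 noisy training points, so its training error collapses toward zero, while its error on held-out points between them gets worse than a simple straight-line fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: a noisy straight line. Test points sit between the
# training points, standing in for "real world" inputs.
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.2, size=10)
x_test = (x_train[:-1] + x_train[1:]) / 2
y_test = 2 * x_test + rng.normal(0, 0.2, size=9)

def errors(degree):
    """Fit a polynomial of the given degree to the training data,
    then return (training error, test error) as mean squared error."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

train_lo, test_lo = errors(1)  # simple model: a straight line
train_hi, test_hi = errors(9)  # complex model: interpolates every point
print("degree 1:", train_lo, test_lo)
print("degree 9:", train_hi, test_hi)
```

The degree-9 model "wins" on the training set and loses everywhere else, which is exactly the teaching-to-the-test failure mode.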

But you also have to be careful about underfitting, which is when the model is too simple, or hasn't been trained enough, to capture the pattern in the data at all, so it makes poor predictions even on examples like the ones it practiced on.

The challenge of training a model is to find the sweet spot between underfitting and overfitting.
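One common way to hunt for that sweet spot is to hold out some data as a validation set and compare models of increasing complexity against it. Here's a rough sketch with made-up quadratic data, using polynomial degree as a stand-in for model complexity (the numbers and split are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up data: the true relationship is quadratic, plus noise.
x = np.sort(rng.uniform(-1, 1, 24))
y = 1 - 2 * x + 3 * x**2 + rng.normal(0, 0.4, size=24)

# Hold out every other point as a stand-in for real world data.
x_train, y_train = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

def val_error(degree):
    """Fit on the training half, score on the held-out half."""
    coeffs = np.polyfit(x_train, y_train, degree)
    return np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)

errors = {d: val_error(d) for d in range(10)}
best = min(errors, key=errors.get)
for d in (0, 2, 9):
    print("degree", d, "validation error", round(errors[d], 3))
print("best degree by validation error:", best)
```

Degree 0 underfits (it ignores the pattern entirely), degree 9 overfits (it chases the noise), and the validation error is lowest somewhere in between, near the true complexity of the data.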

#ArtificialIntelligence #programming #statistics



