
Overfitting and underfitting in machine learning

In machine learning, overfitting is a phenomenon in which a model is trained against practice data so closely that it performs poorly when given real-world data. Training data is useful to help the algorithm start to learn, but too much training can cause it to learn only how to predict the training data, leaving it unable to generalize to new examples. It's a bit like the machine learning equivalent of "teaching to the test": if you study for a test by only reading practice questions, all you do is memorize the answers without fully understanding the fundamental concepts. That's what overfitting is. The model becomes so tightly bound to the training data that it doesn't know what to do when it gets something else.

But you also have to be careful about underfitting, which is the opposite problem: the model is too simple, or hasn't been trained enough, to capture the underlying pattern, so it makes poor predictions even on the practice data itself. The challenge of training a model is to find the sweet spot between underfitting and overfitting.
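A small sketch of that sweet spot, using polynomial curve fitting as a stand-in for "a model" (the cubic target function, noise level, and polynomial degrees here are illustrative choices, not anything specific to Lernabit or a particular library): a low-degree polynomial underfits the data, a moderate degree generalizes well, and a high degree chases the noise in the training set.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    # The hidden "real world" process the model is trying to learn
    return x**3 - 2 * x

# A small noisy training set, and a larger test set standing in for real-world data
x_train = rng.uniform(-2, 2, 20)
y_train = true_fn(x_train) + rng.normal(0, 1.0, 20)
x_test = rng.uniform(-2, 2, 200)
y_test = true_fn(x_test) + rng.normal(0, 1.0, 200)

def mse(degree):
    """Fit a polynomial of the given degree; return (train error, test error)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for degree in (1, 3, 9):
    train_err, test_err = mse(degree)
    print(f"degree {degree}: train MSE {train_err:.2f}, test MSE {test_err:.2f}")
```

The pattern to look for in the output: training error only goes down as the model gets more flexible, but test error is high at both extremes, large for degree 1 (underfitting, too simple to capture the curve) and worse than degree 3 for degree 9 (overfitting, memorizing the noise in the practice data).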

Get a free account

Start creating your own notes and remember more of what you learn. Sign up now to get started.
