Forums
What is underfitting in an ML model and how to solve it - Printable Version

+- Forums (https://bdn.bdb.ai)
+-- Forum: BDB Knowledge Base (https://bdn.bdb.ai/forumdisplay.php?fid=13)
+--- Forum: DS Labs (https://bdn.bdb.ai/forumdisplay.php?fid=61)
+---- Forum: DS- Lab Q&A (https://bdn.bdb.ai/forumdisplay.php?fid=63)
+---- Thread: What is underfitting in an ML model and how to solve it (/showthread.php?tid=462)



What is underfitting in an ML model and how to solve it - manjunath - 12-23-2022

Underfitting in machine learning refers to a model that is too simple to capture the underlying structure of the data. Such a model performs poorly on the training data itself and, as a result, also generalizes poorly to new data.
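A minimal sketch of what that looks like in practice (synthetic data; NumPy assumed available): a straight line fitted to clearly quadratic data leaves a large mean-squared error even on the data it was trained on, which is the signature of underfitting.

```python
import numpy as np

# Synthetic data with an obvious quadratic trend.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)
y = x**2 + rng.normal(scale=0.1, size=x.size)

# A degree-1 (straight-line) model is too simple for this data.
coefs = np.polyfit(x, y, deg=1)
train_mse = np.mean((y - np.polyval(coefs, x)) ** 2)

# Large error on the *training* data itself indicates underfitting.
print(f"training MSE: {train_mse:.2f}")
```

Note that the error here is measured on the training set, not a held-out set: underfitting, unlike overfitting, shows up before you ever look at new data.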
 
There are several ways to solve underfitting in a machine learning model:
 
  • Use a more complex model: One option is to use a model with more parameters, which can capture more of the structure in the data. Be aware, however, that a very complex model can swing to the opposite problem, overfitting, where the model fits the training data too closely and does not generalize well to new data.
     
  • Add more features: Another option is to add features to the model, including derived or transformed features such as polynomial or interaction terms. This gives the model more information with which to capture the structure of the data.
     
  • Increase the amount of training data: More (and more varied) training data can help, though on its own it mainly combats overfitting; against underfitting it is most effective in combination with a higher-capacity model.
     
  • Reduce regularization: Regularization adds a penalty based on model complexity to guard against overfitting. If that penalty is too strong, though, it pushes the model into underfitting, so lowering the regularization strength lets the model fit the training data more closely.
     
  • Train longer: Early stopping halts training once performance on a validation set starts to degrade, which protects against overfitting. From the underfitting side the risk is the opposite one: stopping too early, or training for too few iterations, leaves the model underfit, so training longer can help.
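The first two options above can be sketched together (synthetic data; NumPy assumed): raising the polynomial degree both adds parameters and adds derived features, and the training error drops accordingly.

```python
import numpy as np

# Synthetic nonlinear target that a straight line cannot follow.
rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 200)
y = x * np.sin(x) + rng.normal(scale=0.05, size=x.size)

def train_mse(degree):
    # A higher-degree polynomial = more parameters and more derived features.
    coefs = np.polyfit(x, y, degree)
    return np.mean((y - np.polyval(coefs, x)) ** 2)

mse_line = train_mse(1)   # underfits the curve
mse_poly = train_mse(5)   # enough capacity to follow it
print(mse_line > mse_poly)
```

Pushing the degree much higher would eventually start fitting the noise instead, which is exactly the overfitting risk the first bullet warns about.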
 
It is important to keep in mind that underfitting and overfitting are two sides of the same trade-off, governed largely by model complexity and regularization. Finding the right balance between the two is key to achieving good performance both on the training data and on new data.
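The regularization side of that trade-off can be sketched with ridge regression (synthetic data; closed-form solution w = (XᵀX + λI)⁻¹Xᵀy; all names here are made up): a very strong penalty shrinks the weights toward zero and underfits, while relaxing it restores the fit to the training data.

```python
import numpy as np

# Synthetic linear data with known true weights.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

def ridge_train_mse(lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^-1 X^T y
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return np.mean((y - X @ w) ** 2)

mse_strong = ridge_train_mse(1000.0)  # heavy penalty -> weights near 0 -> underfit
mse_weak = ridge_train_mse(0.1)       # light penalty -> close to least squares
print(mse_strong > mse_weak)
```

In practice you would pick the penalty strength by validation rather than by eye; the point here is only that the same knob that cures overfitting causes underfitting when turned too far.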