Posts

MEDIUM

In statistical learning, underfitting and overfitting are among the most important topics: they describe the state of a model based on its performance. The best way to understand these terms is as a trade-off between the bias and the variance of the model.

The term overfitting refers to a model that fits the data it was trained on very well but generalizes poorly: when faced with values other than those seen during training, it predicts them with low accuracy. Read more...
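The idea can be sketched with a small experiment: fitting polynomials of increasing degree to noisy samples of a sine curve and comparing training error against error on unseen points. This is a minimal illustration using scikit-learn (assumed available); the degrees and sample sizes are arbitrary choices, not values from the post.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Noisy training samples of sin(2*pi*x) on [0, 1]
x_train = np.sort(rng.uniform(0, 1, 30))
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 30)
X_train = x_train.reshape(-1, 1)

# Noise-free held-out points to measure generalization
x_test = np.linspace(0.05, 0.95, 100)
y_test = np.sin(2 * np.pi * x_test)
X_test = x_test.reshape(-1, 1)

results = {}
for degree in (1, 4, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    results[degree] = (train_mse, test_mse)
    print(f"degree={degree:2d}  train MSE={train_mse:.4f}  test MSE={test_mse:.4f}")
```

The degree-1 model underfits (high error everywhere, high bias), while the degree-15 model overfits: its training error is tiny because it chases the noise, but its test error blows up (high variance). The moderate degree sits near the sweet spot of the trade-off.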

Personal opinions and brief summary of Confident Data Skills: Master the Fundamentals of Working with Data and Supercharge Your Career (Confident Series) by Kirill Eremenko. Read more...