What is the difference between least squares error and mean squared error?

Least squares error is the method used to find the best-fit line through a set of data points. The idea behind the least squares method is to minimise the sum of the squared errors (residuals) between the actual data points and the fitted line.
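As an illustration, for a simple straight-line model with slope a and intercept b (placeholder names used here only for the example), the least squares objective over n data points (x_i, y_i) is:

```latex
\min_{a,\,b} \; \sum_{i=1}^{n} \bigl( y_i - (a x_i + b) \bigr)^2
```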

Mean squared error, on the other hand, is used once you have fitted the model and want to evaluate it. The mean squared error is the average of the squared differences between the actual and predicted values and is, hence, a good metric for comparing different models on the same data set.
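In symbols, with y_i the actual values and ŷ_i the model's predictions over n data points:

```latex
\mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2
```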

Thus, LSE is a criterion used during model fitting to minimise the sum of squared errors, while MSE is a metric used to evaluate the model after fitting, based on the average of the squared errors.
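A minimal sketch of the two roles using NumPy (the sample data and variable names are purely illustrative): np.polyfit fits the line by least squares, and the MSE is then computed from the fitted predictions.

```python
import numpy as np

# Toy data (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Model fitting: np.polyfit solves the least squares problem,
# i.e. it finds the slope and intercept that minimise the sum of squared errors.
slope, intercept = np.polyfit(x, y, deg=1)

# Model evaluation: mean squared error of the fitted line on the same data.
y_pred = slope * x + intercept
mse = np.mean((y - y_pred) ** 2)

print(f"Fitted line: y = {slope:.3f} * x + {intercept:.3f}")
print(f"MSE: {mse:.4f}")
```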
