The major difference between R-squared and adjusted R-squared is that R-squared does not penalise the model for having a larger number of variables. If you keep adding variables to the model, R-squared will never decrease; it increases, or at most stays the same when the added variable contributes no additional explanatory power. In effect, R-squared assumes that every variable added to the model improves its predictive power.
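For reference, the standard definitions are shown below (the notation is assumed here, not taken from the original text). Because ordinary least squares can only reduce or preserve the residual sum of squares when a variable is added, R-squared can never fall, whereas the adjusted version rescales by the degrees of freedom, where $n$ is the number of observations and $p$ the number of predictors:

$$R^2 = 1 - \frac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y})^2}, \qquad \bar{R}^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - p - 1}$$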
Adjusted R-squared, on the other hand, penalises the model for the number of variables it contains. So, if you add a variable and the adjusted R-squared drops, it is a strong indication that the variable does not add enough explanatory power to justify its inclusion and should be left out. Thus, in the case of multiple linear regression, you should look at the adjusted R-squared value in order to keep redundant variables out of your regression model.
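The short sketch below illustrates this behaviour on synthetic data (the data, variable names, and helper function are illustrative assumptions, not part of the original text): adding a pure-noise predictor never lowers R-squared, but it typically lowers adjusted R-squared.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)          # synthetic data for illustration
n = 100
x1 = rng.normal(size=(n, 1))            # one genuinely informative predictor
y = 3 * x1[:, 0] + rng.normal(scale=0.5, size=n)

def adjusted_r2(r2, n, p):
    # Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Model with the informative predictor only
X_small = x1
r2_small = LinearRegression().fit(X_small, y).score(X_small, y)

# Same model plus a pure-noise predictor
X_big = np.hstack([x1, rng.normal(size=(n, 1))])
r2_big = LinearRegression().fit(X_big, y).score(X_big, y)

print(f"R^2:          {r2_small:.4f} -> {r2_big:.4f}")   # never decreases
print(f"Adjusted R^2: {adjusted_r2(r2_small, n, 1):.4f} -> "
      f"{adjusted_r2(r2_big, n, 2):.4f}")                 # typically decreases
```

Running this, the plain R-squared ticks up slightly after the noise column is added, while the adjusted R-squared falls, which is exactly the signal described above for dropping the redundant variable.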