What is the difference between Multiple R-squared and Adjusted R-squared in a single-variate least squares regression?
Posted by fmark on Stack Overflow
Published on 2010-05-20T02:17:40Z
Could someone explain to the statistically naive what the difference between Multiple R-squared and Adjusted R-squared is? I am doing a single-variate regression analysis as follows:
v.lm <- lm(epm ~ n_days, data=v)
print(summary(v.lm))
Results:
Call:
lm(formula = epm ~ n_days, data = v)
Residuals:
Min 1Q Median 3Q Max
-693.59 -325.79 53.34 302.46 964.95
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 2550.39 92.15 27.677 <2e-16 ***
n_days -13.12 5.39 -2.433 0.0216 *
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 410.1 on 28 degrees of freedom
Multiple R-squared: 0.1746,	Adjusted R-squared: 0.1451
F-statistic: 5.921 on 1 and 28 DF,  p-value: 0.0216
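For reference, the Adjusted R-squared reported here can be reproduced from the Multiple R-squared with the standard small-sample adjustment. A minimal R sketch, assuming n = 30 observations and p = 1 predictor (consistent with the 28 residual degrees of freedom shown above):

r2 <- 0.1746    # Multiple R-squared from the summary output
n  <- 30        # number of observations (assumed from 28 residual df + 2)
p  <- 1         # number of predictors (n_days)
adj_r2 <- 1 - (1 - r2) * (n - 1) / (n - p - 1)
adj_r2          # approximately 0.1451, matching the Adjusted R-squared above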
Apologies for the newbiness of this question.