Error sum of squares (SSE) is the sum of squared differences between each observed value (yᵢ) and the corresponding regressed (predicted) value of y (ŷᵢ):

Error sum of squares (SSE) = Σ(yᵢ − ŷᵢ)²
Figure 14.19 exhibits the measures of variation in simple linear regression. It can be seen easily that total sum of squares (SST) = regression sum of squares (SSR) + error sum of squares (SSE), that is, 138,966.6667 (SST) = 125,197.4582 (SSR) + 13,769.20842 (SSE).
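The decomposition SST = SSR + SSE can be verified numerically. The sketch below uses hypothetical data (the observations of Example 14.1 are not reproduced here) and an ordinary least-squares fit; for any such fit with an intercept, the identity holds exactly up to rounding.

```python
import numpy as np

# Hypothetical data; not the observations of Example 14.1.
x = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
y = np.array([52.0, 61.0, 79.0, 84.0, 98.0, 110.0])

# Fit the simple linear regression y-hat = b0 + b1*x by least squares.
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)  # regression sum of squares
sse = np.sum((y - y_hat) ** 2)         # error sum of squares

# For an OLS fit with an intercept, SST = SSR + SSE (up to rounding).
print(round(sst, 4), round(ssr + sse, 4))
```

The same three quantities appear in the ANOVA output discussed next: SSR on the regression row, SSE on the residual row, and SST on the total row.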
Figure 14.20 is the ANOVA table produced using MS Excel, exhibiting the values of SST, SSR, SSE, and other quantities for Example 14.1. The same ANOVA table as shown in Figure 14.20 can be obtained ...