14.5: r² and the Standard Error of the Estimate of y′


Consider the deviations:

[Figure: scatter plot with the regression line, showing the total deviation \((y_{i} - \overline{y})\) of a data point split into the explained deviation \((y^{\prime}_{i} - \overline{y})\) and the unexplained deviation \((y_{i} - y^{\prime}_{i})\).]

    Looking at the picture we see that

\[\begin{eqnarray*} \textrm{total deviation} & = & \textrm{explained deviation} + \textrm{unexplained deviation}\\ (y_{i} - \overline{y}) & = & (y^{\prime}_{i} - \overline{y}) + (y_{i} - y^{\prime}_{i}) \end{eqnarray*}\]

    Remember that variance is the sum of the squared deviations (divided by degrees of freedom), so squaring the above and summing gives:

\[ \sum_{i=1}^{n} (y_{i} - \overline{y})^{2} = \sum_{i=1}^{n} (y^{\prime}_{i} - \overline{y})^{2} + \sum_{i=1}^{n} (y_{i} - y^{\prime}_{i})^{2} \]

(the cross terms all cancel because \(y^{\prime}\) is the least-squares solution and \(a = \overline{y} - b \overline{x}\); see the derivation at the end of this section for details). This is also a sum of squares statement:

    \[ \mbox{SS}_{T} = \mbox{SS}_{R} + \mbox{SS}_{E} \]

where SS\(_{E} = \sum (y_{i} - y^{\prime}_{i})^{2}\), SS\(_{T} = \sum (y_{i} - \overline{y})^{2}\) and SS\(_{R} = \sum (y^{\prime}_{i} - \overline{y})^{2}\) are the error, total and regression (explained) sums of squares, respectively.
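This decomposition is easy to verify numerically. Below is a minimal Python sketch; the data are invented for illustration, and only the formulas for \(a\) and \(b\) come from the text. It fits a least-squares line and confirms that SS\(_{T}\) = SS\(_{R}\) + SS\(_{E}\):

```python
# Minimal numerical check of SS_T = SS_R + SS_E.
# The data below are made up for illustration.
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n

# Least-squares slope and intercept from the formulas in the text:
# b = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2),  a = ybar - b*xbar
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
     / sum((xi - xbar) ** 2 for xi in x))
a = ybar - b * xbar

yprime = [a + b * xi for xi in x]  # predicted values y'

SS_T = sum((yi - ybar) ** 2 for yi in y)                   # total
SS_R = sum((ypi - ybar) ** 2 for ypi in yprime)            # regression
SS_E = sum((yi - ypi) ** 2 for yi, ypi in zip(y, yprime))  # error

print(SS_T, SS_R + SS_E)  # the two values agree up to rounding
```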

Dividing by the degrees of freedom, which is \(n-2\) in this bivariate situation, we get:

\[\begin{eqnarray*} \frac{\sum (y_{i} - \overline{y})^{2}}{n-2} & = & \frac{\sum (y^{\prime}_{i} - \overline{y})^{2}}{n-2} + \frac{\sum (y_{i} - y^{\prime}_{i})^{2}}{n-2} \\ \mbox{total variance} & = & \mbox{explained variance} + \mbox{unexplained variance} \\ & = & \mbox{signal (or model)} + \mbox{noise} \end{eqnarray*}\]

    It turns out that

    \[ r^{2} = \frac{\mbox{explained variance}}{\mbox{total variance}} = \frac{ \mbox{SS}_{R} }{ \mbox{SS}_{T} } \]

The quantity \(r^{2}\) is called the coefficient of determination and gives the fraction of variance explained by the model (here the model is the equation of a line). The quantity \(r^{2}\) appears with many statistical models. For example, with ANOVA it turns out that the “effect size” eta-squared is the fraction of variance explained by the ANOVA model[1]: \(\eta^{2} = r^{2}\).
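As a quick check of this identity, the sketch below continues the Python example above and computes \(r^{2}\) two ways: as SS\(_{R}\)/SS\(_{T}\) and as the square of the Pearson correlation coefficient:

```python
# Continuing the sketch above: two equivalent routes to r^2.
r2_from_ss = SS_R / SS_T

# Pearson r computed directly from the sums of products, then squared.
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
Sxx = sum((xi - xbar) ** 2 for xi in x)
Syy = sum((yi - ybar) ** 2 for yi in y)
r = Sxy / (Sxx * Syy) ** 0.5

print(r2_from_ss, r ** 2)  # agree up to rounding
```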

    The standard error of the estimate is the standard deviation of the noise (the square root of the unexplained variance) and is given by

    \[ s_{\mbox{est}} = \sqrt{\frac{\sum (y - y^{\prime})^{2} }{n-2} } = \sqrt{\frac{\sum y^{2} - a \sum y - b \sum xy}{n-2} } \]

    Example 14.4: Continuing with the data of Example 14.3, we had

    \[ \sum y = 511 \;\;\; \sum y^{2} = 38993 \;\;\; \sum xy = 3745 \;\;\; a = 102.493 \;\;\; b = -3.622 \;\;\; n=7 \]

    so

    \[\begin{eqnarray*} s_{\mbox{est}} & = & \sqrt{\frac{(38993) - (102.493)(511) - (-3.622)(3745)}{5} } \\ s_{\mbox{est}} & = & \sqrt{\frac{38993 - 52373.923 + 13564.39}{5} } \\ s_{\mbox{est}} & = & 6.06 \end{eqnarray*}\]
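The arithmetic can be checked in a few lines of Python, using only the sums quoted above:

```python
# Checking Example 14.4 with the sums given in the text.
sum_y, sum_y2, sum_xy = 511, 38993, 3745
a, b, n = 102.493, -3.622, 7

s_est = ((sum_y2 - a * sum_y - b * sum_xy) / (n - 2)) ** 0.5
print(round(s_est, 2))  # 6.06
```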

Here is a graphical interpretation of \(s_{\mbox{est}}\):

[Figure: graphical interpretation of \(s_{\mbox{est}}\) as the typical vertical spread of the data points about the regression line.]

The assumption for computing confidence intervals for \(y^{\prime}\) is that \(s_{\mbox{est}}\) is independent of \(x\). This is the assumption of homoscedasticity. You can think of the regression situation as a generalized one-way ANOVA where, instead of having a finite number of discrete populations for the IV, we have an infinite number of (continuous) populations. All the populations have the same variance \(\sigma^{2}\) (and they are assumed to be normal), and \(s_{\mbox{est}}^{2}\) is the pooled estimate of that variance.
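One informal way to examine this assumption (standard practice, though not described in the text) is to look at the residuals \(y_{i} - y^{\prime}_{i}\) across the range of \(x\); under homoscedasticity their spread should be roughly constant. Continuing the Python sketch above:

```python
# Informal homoscedasticity check: residuals should show roughly
# constant spread across x (x, y, yprime as in the first sketch).
residuals = [yi - ypi for yi, ypi in zip(y, yprime)]
for xi, res in zip(x, residuals):
    print(f"x = {xi}:  residual = {res:+.3f}")
# With real data one would plot residuals against x and look for a
# band of roughly constant width centered on zero.
```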

    Squaring both sides of

\[ (y_{i} - \overline{y}) = (y^{\prime}_{i} - \overline{y}) + (y_{i} - y^{\prime}_{i}) \]

    and summing gives

\[ \sum (y_{i} - \overline{y})^{2} = \sum (y^{\prime}_{i} - \overline{y})^{2} + \sum (y_{i} - y^{\prime}_{i})^{2} + \sum 2 (y^{\prime}_{i} - \overline{y})(y_{i} - y^{\prime}_{i}) \]

    Working on that cross term, using \(a = \overline{y} - b \overline{x}\), we get

\[\begin{eqnarray*} \sum 2 (y^{\prime}_{i} - \overline{y})(y_{i} - y^{\prime}_{i}) & = & \sum 2((\overline{y} - b \overline{x} + b x_{i}) - \overline{y})(y_{i} - y^{\prime}_{i}) \\ & = & \sum 2((\overline{y} + b(x_{i} - \overline{x}))-\overline{y})(y_{i} - y^{\prime}_{i}) \\ & = & \sum 2(b(x_{i}- \overline{x}))(y_{i} - y^{\prime}_{i}) \\ & = & \sum 2 b (x_{i} - \overline{x})(y_{i} - (\overline{y} + b(x_{i} - \overline{x}))) \\ & = & \sum 2 b ((y_{i} - \overline{y})(x_{i} - \overline{x}) - b(x_{i} - \overline{x})^{2}) \\ & = & 2 b \left( \sum (y_{i} - \overline{y})(x_{i} - \overline{x}) - b \sum (x_{i} - \overline{x})^{2} \right) = 0 \end{eqnarray*}\]

    where

    \[ b = \frac{\sum (x_{i}-\overline{x})(y_{i}-\overline{y})}{\sum(x_{i}-\overline{x})^{2}} \]

    was used in the last line.
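The cancellation can also be confirmed numerically. Continuing the Python sketch from earlier, the cross term evaluates to zero (up to floating-point rounding) for the least-squares fit:

```python
# Numerical check that the cross term vanishes for the least-squares
# line (y, ybar, yprime as in the first sketch above).
cross = sum(2 * (ypi - ybar) * (yi - ypi)
            for yi, ypi in zip(y, yprime))
print(cross)  # ~0, differing from zero only by floating-point error
```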

1. In ANOVA the “model” is the difference of means between the groups. We will see more about this aspect of ANOVA in Chapter 17.