STAT 350: 97-1
Final Exam, 8 April 1997
Instructor: Richard Lockhart
Instructions: This is an open book test. You may use notes, text, other books and a calculator. Your presentations of statistical analysis will be marked for clarity of explanation. I expect you to explain what assumptions you are making and to comment if those assumptions seem unreasonable. The exam is out of 60.
… where the errors are independent normal variables with mean 0 and variance $\sigma^2$. What is the design matrix for this linear model? [2 marks]
Solution:
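For illustration only (this simple model is an assumption chosen to show the form of the answer, not necessarily the model in the question): if the model were $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$ for $i = 1, \ldots, n$, then writing $Y = X\beta + \varepsilon$ gives the design matrix
\[ X = \begin{pmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{pmatrix}, \]
whose columns correspond to the intercept and the covariate.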
Solution: … and … so that …
Solution: … and …
Solution: … so that … has a multivariate normal distribution with mean 0 and variance-covariance matrix … .
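The general result presumably being used here is the standard one for the linear model: if $Y = X\beta + \varepsilon$ with $\varepsilon \sim N(0, \sigma^2 I)$, then
\[ \hat\beta = (X^{T}X)^{-1}X^{T}Y \qquad \text{and} \qquad \hat\beta - \beta \sim MVN\!\left(0,\; \sigma^2 (X^{T}X)^{-1}\right), \]
so the quantity in question is multivariate normal with mean 0 and the indicated variance-covariance matrix.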
Solution:
[4 marks]
Solution:
The variances of the errors are … and … , so that the weights are … and … . Then … and … , so that … .
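A sketch of the weighted least squares step in general notation, assuming (as is standard) that the weights are proportional to the reciprocals of the error variances: if $\mathrm{Var}(\varepsilon_i) = \sigma^2 / w_i$ and $W = \mathrm{diag}(w_1, \ldots, w_n)$, then
\[ \hat\beta_{W} = (X^{T} W X)^{-1} X^{T} W Y, \qquad \mathrm{Var}(\hat\beta_{W}) = \sigma^2 (X^{T} W X)^{-1}, \]
which is what makes the estimator in the next part normal with the stated mean and variance.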
Solution:
Normal with mean … and variance … .
Solution: We would have 4 degrees of freedom for error rather than 1.
Vars | ESS | Vars | ESS | Vars | ESS | Vars | ESS |
--- | --- | --- | --- | --- | --- | --- | --- |
  | 154 |   | 109 |   | 133 |   | 139 |
  | 156 |   | 144 |   | 175 |   | 132 |
  | 203 |   | 146 |   | 106 | All | 104 |
  | 250 |   | 150 |   | 107 | None | 506 |
Solution: This compares the model with all the variables in to the model with just … and … ; the resulting F statistic is much less than 1, so the added variables are not significant.
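For reference, each of the tests in this question uses the extra sum of squares F statistic in its generic form (standard notation, not tied to any particular pair of models above):
\[ F = \frac{\left[\mathrm{ESS}(\text{reduced}) - \mathrm{ESS}(\text{full})\right] / (\text{number of variables dropped})}{\mathrm{ESS}(\text{full}) / (\text{error degrees of freedom of the full model})}, \]
which is referred to F tables on the corresponding numerator and denominator degrees of freedom.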
Solution: We begin with all the variables. Among the 3 variable models, the model containing only … , … and … has the smallest error sum of squares, so if we delete a variable it must be the one left out of that model. The F statistic is … , so we delete it. Among the two variable models which contain 2 of those three variables, the model containing … and … has the smallest error sum of squares, so we try to delete the third, getting … , which is still far from significant. We delete it and look at 1 variable models which use one or the other of the two remaining variables. The smallest error SS is for … , so we try to delete the other variable, getting … . We compare this to the F tables with 1 numerator and 17 denominator degrees of freedom and see that the statistic is significant, so that both remaining variables will be retained.
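A sketch of the form of this last test, using only the degrees of freedom quoted above (the particular error sums of squares are those of the competing one and two variable models):
\[ F = \frac{\mathrm{ESS}(\text{one-variable model}) - \mathrm{ESS}(\text{two-variable model})}{\text{error mean square with 17 degrees of freedom}}, \]
to be compared with $F_{0.05;\,1,\,17} \approx 4.45$. Since the computed statistic exceeds this, the proposed deletion is rejected and both variables stay in the model.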
Solution: …
Model I: the error sum of squares is 85355 and the estimates are 37.26 and 0.65.
Model II: the error sum of squares is 66115 and the estimates are 14.2424, 67.5325, 48.3918, 49.6033, 68.7786 and 0.5509.
Model III: the error sum of squares is 62433 and the estimates are 52.04954, -68.05918, 62.48453, 46.66416, 112.529, 0.2309892, 1.619385, 0.4313726, 0.5757114 and 0.1818949.
Solution: Testing Model III vs Model II we get … which is not significant. Thus Model II is preferred to Model III. Comparing Model II to Model I we have … which leads to a P-value around 0.03, so that Model II is preferred to Model I.
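The two F statistics can be reconstructed from the figures given above, assuming Models I, II and III have 2, 6 and 10 regression coefficients respectively (the numbers of estimates listed) and n = 50 observations (consistent with the 66115/44 used below and the 50 cases in the diagnostics table), giving error degrees of freedom of 48, 44 and 40:
\[ F_{\text{III vs II}} = \frac{(66115 - 62433)/4}{62433/40} \approx 0.59, \qquad F_{\text{II vs I}} = \frac{(85355 - 66115)/4}{66115/44} \approx 3.2 . \]
The first is referred to F tables with 4 and 40 degrees of freedom and is clearly not significant; the second is referred to F tables with 4 and 44 degrees of freedom, giving the P-value of roughly 0.03 quoted above.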
Solution: I want confidence intervals for the 10 pairwise differences, indexed by $i < j$. To get simultaneous 95% confidence intervals you divide 0.05 by 10 and just work out ordinary 99.5% confidence intervals. The t multiplier is around 2.96. You also need a standard error for each difference, which is the square root of … . You estimate $\sigma^2$ using $66115/44$ and get roughly 1502.6.
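A sketch of the resulting intervals, assuming each pairwise difference has a standard error of the usual form $\sqrt{\widehat\sigma^2\, c}$ for a constant $c$ determined by the design (for instance $c = 1/n_i + 1/n_j$ when two group effects based on $n_i$ and $n_j$ observations are compared):
\[ \widehat\sigma^2 = \frac{66115}{44} \approx 1502.6, \qquad \text{interval: } \widehat{\text{difference}} \;\pm\; t_{0.0025,\,44}\,\sqrt{\widehat\sigma^2\, c} \;\approx\; \widehat{\text{difference}} \;\pm\; 2.96\,\sqrt{1502.6\, c}. \]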
Solution: The plots show clear signs of heteroscedasticity; a transformation might be useful. (In fact taking logs is the thing to do.)
Solution: I just wanted people to compare the various statistics to the guidelines in the text; some common rules of thumb are collected after the table below. For the externally studentized residuals I was looking for some mention of the Bonferroni adjustment. Cases 15 and 44 stand out as worth looking at again.
Diagnostics for Model II for Question 3
Obs # | Leverage | Ext'ly Stud'zed Residual | DFFITS | Cook's D | Obs # | Leverage | Ext'ly Stud'zed Residual | DFFITS | Cook's D |
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
1 | 0.120 | -0.777 | -0.287 | 0.014 | 26 | 0.100 | -0.096 | -0.032 | 0.000 |
2 | 0.108 | 0.407 | 0.142 | 0.003 | 27 | 0.100 | 2.018 | 0.674 | 0.071 |
3 | 0.129 | 0.047 | 0.018 | 0.000 | 28 | 0.158 | 0.768 | 0.333 | 0.019 |
4 | 0.103 | 0.868 | 0.295 | 0.015 | 29 | 0.104 | -0.475 | -0.162 | 0.004 |
5 | 0.101 | 0.141 | 0.047 | 0.000 | 30 | 0.106 | -0.997 | -0.343 | 0.020 |
6 | 0.124 | -0.377 | -0.142 | 0.003 | 31 | 0.102 | -1.133 | -0.383 | 0.024 |
7 | 0.102 | 0.681 | 0.229 | 0.009 | 32 | 0.144 | -0.139 | -0.057 | 0.001 |
8 | 0.150 | -0.578 | -0.243 | 0.010 | 33 | 0.154 | -0.201 | -0.086 | 0.001 |
9 | 0.148 | -0.180 | -0.075 | 0.001 | 34 | 0.103 | 1.186 | 0.401 | 0.027 |
10 | 0.127 | -0.261 | -0.099 | 0.002 | 35 | 0.137 | -0.009 | -0.004 | 0.000 |
11 | 0.121 | 1.073 | 0.398 | 0.026 | 36 | 0.134 | 0.607 | 0.238 | 0.010 |
12 | 0.100 | -1.076 | -0.359 | 0.021 | 37 | 0.114 | 0.184 | 0.066 | 0.001 |
13 | 0.102 | -0.179 | -0.060 | 0.001 | 38 | 0.101 | 0.069 | 0.023 | 0.000 |
14 | 0.130 | 0.329 | 0.127 | 0.003 | 39 | 0.109 | 0.372 | 0.130 | 0.003 |
15 | 0.106 | 3.436 | 1.186 | 0.188 | 40 | 0.101 | -0.934 | -0.312 | 0.016 |
16 | 0.180 | -0.613 | -0.288 | 0.014 | 41 | 0.115 | -2.130 | -0.766 | 0.091 |
17 | 0.104 | -0.306 | -0.104 | 0.002 | 42 | 0.146 | -0.732 | -0.303 | 0.015 |
18 | 0.100 | 0.516 | 0.172 | 0.005 | 43 | 0.126 | 1.295 | 0.491 | 0.040 |
19 | 0.110 | -1.138 | -0.401 | 0.027 | 44 | 0.101 | 3.038 | 1.016 | 0.145 |
20 | 0.110 | -1.742 | -0.611 | 0.059 | 45 | 0.107 | -1.635 | -0.565 | 0.051 |
21 | 0.117 | 0.211 | 0.076 | 0.001 | 46 | 0.148 | 1.019 | 0.425 | 0.030 |
22 | 0.130 | 0.385 | 0.149 | 0.004 | 47 | 0.115 | 0.417 | 0.150 | 0.004 |
23 | 0.152 | -0.699 | -0.296 | 0.015 | 48 | 0.103 | 0.105 | 0.036 | 0.000 |
24 | 0.111 | -0.320 | -0.113 | 0.002 | 49 | 0.143 | -0.911 | -0.372 | 0.023 |
25 | 0.104 | -0.715 | -0.243 | 0.010 | 50 | 0.142 | -0.333 | -0.135 | 0.003 |
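For reference, some common rules of thumb for these diagnostics, evaluated for this fit and taking n = 50 cases and p = 6 coefficients in Model II (both read off the output above): leverages above $2p/n = 0.24$ would be flagged (the largest here is 0.180, so none are); values of $|\mathrm{DFFITS}|$ above $2\sqrt{p/n} \approx 0.69$ would be flagged (cases 15, 41 and 44); and the largest externally studentized residuals are compared with the Bonferroni critical value $t_{1 - 0.05/(2n);\; n-p-1} = t_{0.9995;\,43} \approx 3.5$, which cases 15 (3.436) and 44 (3.038) approach without quite exceeding. All of the Cook's distances are well below 1.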