Q1
It was mentioned in the chapter that a cubic regression spline with one knot at $\xi$ can be obtained using a basis of the form $x, x^2, x^3, (x-\xi)^3_+$, where $(x-\xi)^3_+ = (x-\xi)^3$ if $x > \xi$ and equals 0 otherwise. We will now show that a function of the form
$$f(x) = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 x^3 + \beta_4 (x-\xi)^3_+$$
is indeed a cubic regression spline, regardless of the values of $\beta_0, \beta_1, \beta_2, \beta_3, \beta_4$.
1.a
Find a cubic polynomial $f_1(x) = a_1 + b_1 x + c_1 x^2 + d_1 x^3$ such that $f(x) = f_1(x)$ for all $x \le \xi$. Express $a_1, b_1, c_1, d_1$ in terms of $\beta_0, \beta_1, \beta_2, \beta_3, \beta_4$.
For $x \le \xi$, we have $(x-\xi)^3_+ = 0$, so $f(x) = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 x^3$ and we take $a_1 = \beta_0$, $b_1 = \beta_1$, $c_1 = \beta_2$, $d_1 = \beta_3$.
1.b
Find a cubic polynomial $f_2(x) = a_2 + b_2 x + c_2 x^2 + d_2 x^3$ such that $f(x) = f_2(x)$ for all $x > \xi$. Express $a_2, b_2, c_2, d_2$ in terms of $\beta_0, \beta_1, \beta_2, \beta_3, \beta_4$.
For $x > \xi$, we have $f(x) = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 x^3 + \beta_4 (x-\xi)^3$, so we take $a_2 = \beta_0 - \beta_4 \xi^3$, $b_2 = \beta_1 + 3\beta_4 \xi^2$, $c_2 = \beta_2 - 3\beta_4 \xi$, $d_2 = \beta_3 + \beta_4$.
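To make the coefficient matching explicit, the truncated power term can be expanded:
$$\beta_4 (x-\xi)^3 = \beta_4\left(x^3 - 3\xi x^2 + 3\xi^2 x - \xi^3\right),$$
so that for $x > \xi$,
$$f(x) = (\beta_0 - \beta_4\xi^3) + (\beta_1 + 3\beta_4\xi^2)\,x + (\beta_2 - 3\beta_4\xi)\,x^2 + (\beta_3 + \beta_4)\,x^3.$$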
1.c
Show that $f_1(\xi) = f_2(\xi)$. That is, $f(x)$ is continuous at $\xi$.
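A direct check, substituting $x = \xi$ into the two polynomials from parts (a) and (b):
$$f_1(\xi) = \beta_0 + \beta_1\xi + \beta_2\xi^2 + \beta_3\xi^3,$$
$$f_2(\xi) = (\beta_0 - \beta_4\xi^3) + (\beta_1 + 3\beta_4\xi^2)\xi + (\beta_2 - 3\beta_4\xi)\xi^2 + (\beta_3 + \beta_4)\xi^3 = \beta_0 + \beta_1\xi + \beta_2\xi^2 + \beta_3\xi^3,$$
since the four $\beta_4$ terms, $-\beta_4\xi^3 + 3\beta_4\xi^3 - 3\beta_4\xi^3 + \beta_4\xi^3$, cancel. Hence $f_1(\xi) = f_2(\xi)$.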
1.d
Show that $f_1'(\xi) = f_2'(\xi)$ and $f_1''(\xi) = f_2''(\xi)$. That is, $f'(x)$ and $f''(x)$ are continuous at $\xi$. Therefore, $f(x)$ is indeed a cubic spline.
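Again a direct check, this time on the derivatives:
$$f_1'(x) = \beta_1 + 2\beta_2 x + 3\beta_3 x^2, \qquad f_2'(x) = (\beta_1 + 3\beta_4\xi^2) + 2(\beta_2 - 3\beta_4\xi)\,x + 3(\beta_3 + \beta_4)\,x^2,$$
so $f_2'(\xi) = \beta_1 + 2\beta_2\xi + 3\beta_3\xi^2 + (3 - 6 + 3)\beta_4\xi^2 = f_1'(\xi)$. Likewise
$$f_1''(x) = 2\beta_2 + 6\beta_3 x, \qquad f_2''(x) = 2(\beta_2 - 3\beta_4\xi) + 6(\beta_3 + \beta_4)\,x,$$
so $f_2''(\xi) = 2\beta_2 + 6\beta_3\xi + (-6 + 6)\beta_4\xi = f_1''(\xi)$. With $f$, $f'$, and $f''$ all continuous at $\xi$ and $f$ cubic on either side of $\xi$, $f(x)$ is a cubic spline with a knot at $\xi$.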
Q2
Suppose that a curve $\hat{g}$ is computed to smoothly fit a set of $n$ points using the following formula:
$$\hat{g} = \arg\min_g \left( \sum_{i=1}^n \left(y_i - g(x_i)\right)^2 + \lambda \int \left[ g^{(m)}(x) \right]^2 dx \right),$$
where $g^{(m)}$ represents the $m$th derivative of $g$ (and $g^{(0)} = g$).
Provide example sketches of $\hat{g}$ in each of the following scenarios (a small code sketch covering all five cases is given after part (e) below).
(a) $\lambda = \infty$, $m = 0$
In this case $\hat{g} = 0$ because a large smoothing parameter forces $g(x) = 0$.
(b) $\lambda = \infty$, $m = 1$
In this case $\hat{g}$ is a horizontal line, $\hat{g}(x) = \bar{y}$, because a large smoothing parameter forces $g'(x) = 0$.
(c) $\lambda = \infty$, $m = 2$
In this case $\hat{g}$ is the least squares line, because a large smoothing parameter forces $g''(x) = 0$, i.e. $g$ must be linear.
(d) $\lambda = \infty$, $m = 3$
In this case $\hat{g}$ is the least squares quadratic fit, because a large smoothing parameter forces $g'''(x) = 0$, i.e. $g$ must be a quadratic polynomial.
(e) $\lambda = 0$, $m = 3$
Here the penalty term doesn't play any role, so in this case $\hat{g}$ is the curve that passes through the training points exactly, i.e. the interpolating spline.
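As an illustration, here is a minimal Python sketch of the five limiting fits described above, using made-up data (it assumes numpy, scipy, and matplotlib are available): the zero function, the constant $\bar{y}$, the least squares line, the least squares quadratic, and the interpolating spline.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import CubicSpline

# Made-up data purely for illustration.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 15))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)
xs = np.linspace(x.min(), x.max(), 200)

# The five limiting fits described in (a)-(e).
fits = [
    np.zeros_like(xs),                    # (a) lambda=inf, m=0: g-hat = 0
    np.full_like(xs, y.mean()),           # (b) lambda=inf, m=1: constant y-bar
    np.polyval(np.polyfit(x, y, 1), xs),  # (c) lambda=inf, m=2: least squares line
    np.polyval(np.polyfit(x, y, 2), xs),  # (d) lambda=inf, m=3: least squares quadratic
    CubicSpline(x, y)(xs),                # (e) lambda=0: interpolating spline
]
titles = [r"(a) $\lambda=\infty$, m=0", r"(b) $\lambda=\infty$, m=1",
          r"(c) $\lambda=\infty$, m=2", r"(d) $\lambda=\infty$, m=3",
          r"(e) $\lambda=0$, m=3"]

fig, axes = plt.subplots(1, 5, figsize=(18, 3), sharey=True)
for ax, fit, title in zip(axes, fits, titles):
    ax.scatter(x, y, s=15, color="grey")
    ax.plot(xs, fit)
    ax.set_title(title)
plt.tight_layout()
plt.show()
```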
Q3
Suppose we fit a curve with basis functions $b_1(X) = X$ and $b_2(X) = (X-1)^2\, I(X \ge 1)$. We fit the linear regression model:
$$Y = \beta_0 + \beta_1 b_1(X) + \beta_2 b_2(X) + \epsilon,$$
and obtain coefficient estimates $\hat{\beta}_0 = 1$, $\hat{\beta}_1 = 1$, $\hat{\beta}_2 = -2$. Sketch the estimated curve between $X = -2$ and $X = 2$. Note the intercepts, slopes, and other relevant information.
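For $X < 1$ the second basis function is zero, so the fitted curve is the line $\hat{y} = 1 + X$ (intercept 1, slope 1); for $X \ge 1$ it is the quadratic $\hat{y} = 1 + X - 2(X-1)^2$, which joins the line at $X = 1$ (where $\hat{y} = 2$) and bends downward. A minimal plotting sketch, assuming matplotlib is available (the code simply evaluates the fitted formula on a grid):

```python
import numpy as np
import matplotlib.pyplot as plt

# Fitted curve: y-hat = 1 + 1*b1(X) - 2*b2(X)
#             = 1 + X - 2*(X - 1)^2 * I(X >= 1)
X = np.linspace(-2, 2, 400)
y_hat = 1 + X - 2 * (X - 1) ** 2 * (X >= 1)

plt.plot(X, y_hat)
plt.axvline(1, linestyle="--", color="grey")  # knot at X = 1
plt.xlabel("X")
plt.ylabel(r"$\hat{y}$")
plt.show()
```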
Q5
Suppose two curves, $\hat{g}_1$ and $\hat{g}_2$, are defined by
$$\hat{g}_1 = \arg\min_g \left( \sum_{i=1}^n \left(y_i - g(x_i)\right)^2 + \lambda \int \left[ g^{(3)}(x) \right]^2 dx \right),$$
$$\hat{g}_2 = \arg\min_g \left( \sum_{i=1}^n \left(y_i - g(x_i)\right)^2 + \lambda \int \left[ g^{(4)}(x) \right]^2 dx \right),$$
where $g^{(m)}$ represents the $m$th derivative of $g$.
(a). As $\lambda \to \infty$, will $\hat{g}_1$ or $\hat{g}_2$ have the smaller training RSS?
The smoothing spline $\hat{g}_2$ will probably have the smaller training RSS: as $\lambda \to \infty$ its penalty forces $g^{(4)} = 0$ (a cubic), while the penalty for $\hat{g}_1$ forces $g''' = 0$ (a quadratic). Because of the higher order of its penalty term, $\hat{g}_2$ is a higher-order polynomial and therefore more flexible.
(b). As $\lambda \to \infty$, will $\hat{g}_1$ or $\hat{g}_2$ have the smaller test RSS?
As mentioned above, we expect $\hat{g}_2$ to be more flexible, so it may overfit the data. It will probably be $\hat{g}_1$ that has the smaller test RSS.
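To make this concrete, a small simulation sketch (hypothetical data, using only numpy) compares the two limiting fits as $\lambda \to \infty$: $\hat{g}_1$ becomes the least squares quadratic and $\hat{g}_2$ the least squares cubic. Because the models are nested, the cubic's training RSS can never exceed the quadratic's; whether its extra flexibility helps or hurts the test RSS depends on the data.

```python
import numpy as np

# Hypothetical simulated data; in the lambda -> infinity limit,
# g1-hat tends to the least squares quadratic (g''' = 0) and
# g2-hat tends to the least squares cubic   (g'''' = 0).
rng = np.random.default_rng(1)

def simulate(n):
    x = rng.uniform(-3, 3, n)
    y = x**2 - x + rng.normal(scale=1.0, size=n)  # true signal is quadratic
    return x, y

x_train, y_train = simulate(30)
x_test, y_test = simulate(30)

for name, degree in [("g1-hat limit (quadratic)", 2), ("g2-hat limit (cubic)", 3)]:
    coefs = np.polyfit(x_train, y_train, degree)
    rss_train = np.sum((y_train - np.polyval(coefs, x_train)) ** 2)
    rss_test = np.sum((y_test - np.polyval(coefs, x_test)) ** 2)
    print(f"{name}: training RSS = {rss_train:.2f}, test RSS = {rss_test:.2f}")
```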
(c). For $\lambda = 0$, will $\hat{g}_1$ or $\hat{g}_2$ have the smaller training and test RSS?
If $\lambda = 0$, the penalty terms play no role and we have $\hat{g}_1 = \hat{g}_2$, so they will have the same training and test RSS.