QUESTIONS
7.1. (a) In the regression context, the method of least squares estimates the regression parameters in such a way that the sum of the squared differences between the actual Y values (i.e., the values of the dependent variable) and the estimated Y values is as small as possible.
(b) The estimators of the regression parameters obtained by the method of least squares.
(c) An estimator being a random variable, its variance, like the variance of any random variable, measures the spread of the estimated values around the mean value of the estimator.
(d) The (positive) square root of the variance of an estimator.
(e) Equal variance.
(f) Unequal variance.
(g) Correlation between successive values of a random variable.
(h) In the regression context, TSS is the sum of the squared differences between the individual values and the mean value of the dependent variable Y, namely, $\sum (Y_i - \bar{Y})^2$.
(i) ESS is the part of the TSS that is explained by the explanatory variable(s).
(j) RSS is the part of the TSS that is not explained by the explanatory variable(s), the X variable(s).
(k) It measures the proportion of the total variation in Y explained by the explanatory variables. In short, it is the ratio of ESS to TSS, $r^2 = \text{ESS}/\text{TSS}$ (these quantities are computed numerically in the first sketch following this list).
(l) It is the standard deviation of the Y values about the estimated regression line.
(m) BLUE means best linear unbiased estimator, that is, a linear estimator that is unbiased and has the least variance in the class of all such linear unbiased estimators.
(n) A statistical procedure for testing hypotheses.
(o) A test of significance based on the t distribution.
(p) In a one-tailed test, the alternative hypothesis is one-sided. For example: $H_0\colon \mu = \mu_0$ against $H_1\colon \mu > \mu_0$ or $H_1\colon \mu < \mu_0$, where $\mu$ is the mean value (a numerical illustration of one- and two-tailed tests appears in the second sketch following this list).
(q) In a two-tailed test, the
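As a minimal numerical sketch of the least-squares quantities defined in (a) and (h) through (l), the following Python snippet fits a two-variable regression with NumPy and computes TSS, ESS, RSS, $r^2$, and the standard error of the regression. The data and variable names are made up for illustration and are not taken from the text.

```python
import numpy as np

# Illustrative (made-up) data: Y is the dependent variable, X the explanatory variable.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])

n = len(Y)
x_bar, y_bar = X.mean(), Y.mean()

# Least-squares estimators of the slope (b2) and intercept (b1):
# b2 = sum((Xi - X_bar)(Yi - Y_bar)) / sum((Xi - X_bar)^2), b1 = Y_bar - b2 * X_bar
b2 = np.sum((X - x_bar) * (Y - y_bar)) / np.sum((X - x_bar) ** 2)
b1 = y_bar - b2 * x_bar

Y_hat = b1 + b2 * X           # estimated Y values
residuals = Y - Y_hat         # part of Y not explained by X

TSS = np.sum((Y - y_bar) ** 2)        # total sum of squares
ESS = np.sum((Y_hat - y_bar) ** 2)    # explained sum of squares
RSS = np.sum(residuals ** 2)          # residual sum of squares
r_squared = ESS / TSS                 # coefficient of determination
ser = np.sqrt(RSS / (n - 2))          # standard error of the regression (two parameters estimated)

print(f"b1 = {b1:.4f}, b2 = {b2:.4f}")
print(f"TSS = {TSS:.4f}, ESS = {ESS:.4f}, RSS = {RSS:.4f}")
print(f"r^2 = {r_squared:.4f}, SER = {ser:.4f}")
```

For a regression fitted with an intercept, TSS = ESS + RSS, which is why $r^2$ can equivalently be read as the proportion of the total variation in Y explained by X.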
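Similarly, a minimal sketch of the one-tailed and two-tailed t tests described in (o) through (q), computed by hand with NumPy and SciPy's t distribution; the sample values and the null value mu0 = 5 are assumptions chosen only for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical sample and null value (both are illustrative assumptions).
sample = np.array([5.3, 4.9, 5.8, 6.1, 5.5, 5.0, 5.7, 6.0])
mu0 = 5.0                       # value of the mean under H0: mu = mu0

n = len(sample)
x_bar = sample.mean()
s = sample.std(ddof=1)          # sample standard deviation
t_stat = (x_bar - mu0) / (s / np.sqrt(n))
df = n - 1

# One-tailed test: H1: mu > mu0 -> p-value is the area in the right tail only.
p_one_tailed = stats.t.sf(t_stat, df)

# Two-tailed test: H1: mu != mu0 -> p-value is the area in both tails.
p_two_tailed = 2 * stats.t.sf(abs(t_stat), df)

print(f"t = {t_stat:.4f}, df = {df}")
print(f"one-tailed p-value (H1: mu > mu0): {p_one_tailed:.4f}")
print(f"two-tailed p-value (H1: mu != mu0): {p_two_tailed:.4f}")
```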