Several approaches to data analysis can be used when autocorrelation is present. One approach adds independent variables to the model; another transforms the variables.
• Addition of Independent Variables
Often the reason autocorrelation occurs in regression analyses is that one or more important predictor variables have been left out of the analysis. For example, suppose a researcher develops a regression forecasting model that attempts to predict sales of new homes from sales of used homes over some period of time. Such a model might contain significant autocorrelation. The exclusion of the variable "prime mortgage interest rate" might be a factor driving the autocorrelation between the other two variables. Adding this variable to the regression model might significantly reduce the autocorrelation.
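As a rough sketch of how this might look in practice, the following Python fragment (using pandas and statsmodels; the file name housing.csv and the column names new_home_sales, used_home_sales, and prime_rate are hypothetical) fits the model with and without the extra predictor and compares the Durbin-Watson statistics of the residuals. Values near 2 suggest little first-order autocorrelation.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

# Hypothetical monthly data; the file and column names are illustrative only.
df = pd.read_csv("housing.csv")  # columns: new_home_sales, used_home_sales, prime_rate

y = df["new_home_sales"]

# Model 1: used-home sales alone.
X1 = sm.add_constant(df[["used_home_sales"]])
fit1 = sm.OLS(y, X1).fit()

# Model 2: add the prime mortgage interest rate as a second predictor.
X2 = sm.add_constant(df[["used_home_sales", "prime_rate"]])
fit2 = sm.OLS(y, X2).fit()

# Durbin-Watson near 2 indicates little first-order autocorrelation;
# values well below 2 indicate positive autocorrelation.
print("DW without prime rate:", durbin_watson(fit1.resid))
print("DW with prime rate:   ", durbin_watson(fit2.resid))
```

If the second model's Durbin-Watson statistic moves closer to 2, the added variable has absorbed some of the autocorrelation that the first model left in its residuals.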
• Transforming Variables
When the inclusion of additional variables is not helpful in reducing autocorrelation to an acceptable level, transforming the data in the variables may help to solve the problem. One such method is the first-differences approach, in which each value of X is subtracted from the value of X in the succeeding time period (that is, the transformed value for period t is X_t - X_{t-1}); these "differences" become the new, transformed X variable. The same process is used to transform the Y variable. The regression analysis is then computed on the transformed X and transformed Y variables, yielding a new model that is, ideally, free of significant autocorrelation effects. Another way is to generate new variables from the period-to-period percentage changes and regress these new variables. A third way is to use autoregression models.
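A minimal sketch of the first two transformations, assuming x and y are pandas Series indexed by time period (the numbers below are made up for illustration): diff() implements the first-differences approach and pct_change() the period-to-period percentage changes.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical series; in practice these come from the data set.
x = pd.Series([110, 115, 112, 120, 126, 131, 129, 138])
y = pd.Series([205, 214, 210, 223, 231, 240, 236, 251])

# First-differences approach: X'_t = X_t - X_{t-1}, and the same for Y.
x_diff = x.diff().dropna()
y_diff = y.diff().dropna()
diff_fit = sm.OLS(y_diff, sm.add_constant(x_diff)).fit()

# Percentage-change approach: (X_t - X_{t-1}) / X_{t-1}, and the same for Y.
x_pct = x.pct_change().dropna()
y_pct = y.pct_change().dropna()
pct_fit = sm.OLS(y_pct, sm.add_constant(x_pct)).fit()

print(diff_fit.params)  # coefficients for the differenced model
print(pct_fit.params)   # coefficients for the percentage-change model
```

Note that each transformation loses the first observation, since there is no preceding period to difference against.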
• Autoregression
A forecasting technique that takes advantage of the relationship of values (Y_t) to previous-period values (Y_{t-1}, Y_{t-2}, Y_{t-3}, ...) is called autoregression. Autoregression is a multiple regression technique in which the independent variables are time-lagged versions of the dependent variable; that is, Y_t is predicted from the values of Y in earlier time periods.
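Because autoregression is simply multiple regression on lagged copies of Y, it can be sketched by shifting the series, as in the illustrative second-order model below (again using pandas and statsmodels, with made-up data), which regresses Y_t on Y_{t-1} and Y_{t-2}.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical time series of Y values.
y = pd.Series([52, 55, 53, 58, 61, 60, 64, 67, 66, 71, 74, 73])

# Build the lagged predictors: shift(k) aligns Y_{t-k} with Y_t.
lags = pd.DataFrame({
    "y_lag1": y.shift(1),  # Y_{t-1}
    "y_lag2": y.shift(2),  # Y_{t-2}
}).dropna()

# Regress Y_t on its own lagged values (a second-order autoregression).
fit = sm.OLS(y.loc[lags.index], sm.add_constant(lags)).fit()
print(fit.params)
```

statsmodels also provides a ready-made class for this task (statsmodels.tsa.ar_model.AutoReg), which fits the same kind of model without building the lagged columns by hand.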