Time Series Analysis - ARIMA models - Basic Definitions and Theorems about ARIMA models

First we define some important concepts. A stochastic process (i.e. a probabilistic process) is defined by a T-dimensional distribution function:
F(x_1, x_2, \dots, x_T) = P(X_1 \le x_1, X_2 \le x_2, \dots, X_T \le x_T) \qquad (V.I.1-1)
Before analyzing the structure of a time series model, one must make sure that the time series is stationary with respect to both the variance and the mean. First, we will assume statistical stationarity of all time series (later on, this restriction will be relaxed).
Statistical stationarity of a time series implies that the marginal probability distribution is time-independent, which means that:

• the expected values and variances are constant:

E(X_t) = \mu, \qquad \operatorname{var}(X_t) = \sigma^2, \qquad t = 1, 2, \dots, T \qquad (V.I.1-2)
where T is the number of observations in the time series;
• the autocovariances (and autocorrelations) must be constant:

\operatorname{cov}(X_t, X_{t+k}) = \gamma_k, \qquad \text{for all } t \qquad (V.I.1-3)
where k is an integer time-lag;
• the variable has a joint normal distribution f(X_1, X_2, \dots, X_T) with a normal marginal distribution in each dimension:

X_t \sim N(\mu, \sigma^2), \qquad t = 1, 2, \dots, T \qquad (V.I.1-4)
If all conditions except this last one are satisfied, the time series is said to be weakly stationary.
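As an informal check of the first two stationarity conditions, one can split a series into halves and compare the sample means and variances: for a stationary series, both halves should give similar estimates. A minimal sketch in Python (the seed and parameter values are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# A stationary series: Gaussian noise around a constant mean (mu=5, sigma=2).
x = 5.0 + rng.normal(0.0, 2.0, size=1000)

# Split the series into two halves; for a stationary series the
# estimated mean and variance should be stable across the halves.
first, second = x[:500], x[500:]
print(first.mean(), second.mean())  # both close to 5
print(first.var(), second.var())    # both close to 4
```

This is only a heuristic diagnostic; formal stationarity tests (e.g. unit-root tests) are treated later in the ARIMA framework.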
Now it is possible to define white noise as a statistically stationary stochastic process, defined by the marginal distribution function (V.I.1-1), in which all X_t are independent variables (with zero covariances) with a joint normal distribution f(X_1, X_2, \dots, X_T) and with

E(X_t) = \mu, \qquad \operatorname{var}(X_t) = \sigma^2, \qquad \operatorname{cov}(X_t, X_s) = 0 \ (t \ne s) \qquad (V.I.1-5)
It is obvious from this definition that for any white noise process the probability density function can be written as

f(X_1, X_2, \dots, X_T) = \prod_{t=1}^{T} f(X_t) = (2\pi\sigma^2)^{-T/2} \exp\left( -\frac{1}{2\sigma^2} \sum_{t=1}^{T} (X_t - \mu)^2 \right) \qquad (V.I.1-6)
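The factorization in (V.I.1-6) can be verified numerically: under independence, the product of the T marginal normal densities coincides with the closed-form joint density. A small sketch (the parameter values and seed are arbitrary):

```python
import numpy as np

mu, sigma2 = 0.0, 2.25  # illustrative mean and variance
T = 8
rng = np.random.default_rng(1)
x = rng.normal(mu, np.sqrt(sigma2), size=T)

def normal_pdf(v, mu, sigma2):
    """Univariate normal density."""
    return np.exp(-(v - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

# Left-hand side of (V.I.1-6): product of the T marginal densities.
product = normal_pdf(x, mu, sigma2).prod()

# Right-hand side: (2*pi*sigma^2)^(-T/2) * exp(-sum((x-mu)^2) / (2*sigma^2)).
closed = (2 * np.pi * sigma2) ** (-T / 2) * np.exp(-((x - mu) ** 2).sum() / (2 * sigma2))

print(np.isclose(product, closed))  # True
```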
Define the autocovariance as

\gamma_k = \operatorname{cov}(X_t, X_{t+k}) = E\left[ (X_t - \mu)(X_{t+k} - \mu) \right]
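A sample analogue of this autocovariance replaces \mu with the sample mean and averages the lagged cross-products. A minimal sketch, using the common biased estimator that divides by T (the function name and simulated data are illustrative):

```python
import numpy as np

def autocovariance(x, k):
    """Sample autocovariance at integer lag k (biased estimator, divides by T)."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    xbar = x.mean()
    # Average of (X_t - mean)(X_{t+k} - mean) over the T-k available pairs.
    return ((x[:T - k] - xbar) * (x[k:] - xbar)).sum() / T

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=2000)  # simulated white noise

print(autocovariance(x, 0))  # lag 0 equals the sample variance, close to 1
print(autocovariance(x, 5))  # close to 0 for white noise
```

For white noise, \gamma_0 = \sigma^2 and \gamma_k = 0 for k \ne 0, which the simulated estimates reflect.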