The linear Gaussian white noise process (LGWNP) is an independent and identically distributed (iid) sequence with zero mean, finite variance and normal distribution. Some processes, such as the simple bilinear white noise process (SBWNP), have the same covariance structure as the LGWNP. How can these two processes be distinguished and compared? If X t , t ∈ Z is a realization of the SBWNP X t = β X t − 2 e t − 1 + e t , this paper studies in detail the covariance structure of X t d , d = 1 , 2 , 3 . It is shown from this study that: 1) the covariance structure of X t 2 is non-normal and equivalent to that of the linear ARMA(2, 1) model; 2) the covariance structure of X t 3 is that of an iid process; 3) the variance of X t 3 can be used for comparison of the SBWNP and the LGWNP.
A stochastic process X t , t ∈ Z , where Z = { ⋯ , − 1 , 0 , 1 , ⋯ } , is called a white noise or purely random process if it has finite mean and finite variance and all its autocovariances are zero except at lag zero. In many applications, X t , t ∈ Z is assumed to be normally distributed with mean zero and variance σ 2 < ∞ , and the series is called a linear Gaussian white noise process with the following properties [
E ( X t ) = μ (1.1)
R ( 0 ) = var ( X t ) = E ( X t − μ ) 2 = σ 2 (1.2)
R ( k ) = cov ( X t , X t + k ) = E [ ( X t − μ ) ( X t + k − μ ) ] = { σ 2 , k = 0 0 , otherwise (1.3)
ρ ( k ) = corr ( X t , X t + k ) = R ( k ) / R ( 0 ) = { 1 , k = 0 0 , otherwise (1.4)
ϕ k k = corr ( X t , X t + k | X t + 1 , X t + 2 , ⋯ , X t + k − 1 ) = 0 ∀ k (1.5)
where R ( k ) is the autocovariance function at lag k, ρ ( k ) is the autocorrelation function at lag k and ϕ k k is the partial autocorrelation function at lag k.
In other words, a stochastic process X t , t ∈ Z is called a linear Gaussian white noise if X t , t ∈ Z is a sequence of independent and identically distributed (iid) random variables with finite mean and finite variance. Under the assumption that the sample X 1 , X 2 , ⋯ , X n is an iid sequence, we compute the sample autocorrelations as
ρ ^ X ( k ) = ∑ t = 1 n − k ( X t − X ¯ ) ( X t + k − X ¯ ) / ∑ t = 1 n ( X t − X ¯ ) 2 (1.6)
where
X ¯ = 1 n ∑ t = 1 n X t (1.7)
The iid hypothesis is commonly tested with the Ljung and Box [
Q L B ( m ) = n ( n + 2 ) ∑ k = 1 m ( [ ρ ^ X ( k ) ] 2 n − k ) (1.8)
where Q L B ( m ) is asymptotically a chi-squared random variable with m degrees of freedom.
Several values of m are often used and simulation studies suggest that the choice of m ≈ ln ( n ) provides better power performance [
If the data are iid, the squared data X 1 2 , X 2 2 , ⋯ , X n 2 are also iid [
Q M L ( m ) = n ( n + 2 ) ∑ k = 1 m ( [ ρ ^ X 2 ( k ) ] 2 n − k ) (1.9)
where the sample autocorrelations of the data are replaced by the sample autocorrelations of the squared data, ρ ^ X 2 ( k ) .
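Equations (1.6) through (1.9) translate directly into code. The sketch below (Python; the function names are ours, not from the paper) computes Q_LB on a simulated iid N(0, 1) series and the McLeod-Li variant Q_ML on its squares, using the suggested m ≈ ln(n):

```python
import numpy as np

def sample_acf(x, k):
    # Sample autocorrelation at lag k, Equation (1.6)
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), np.mean(x)
    num = np.sum((x[: n - k] - xbar) * (x[k:] - xbar))
    return num / np.sum((x - xbar) ** 2)

def ljung_box(x, m):
    # Q_LB(m) of Equation (1.8); Q_ML of (1.9) is the same statistic on x**2
    n = len(x)
    return n * (n + 2) * sum(sample_acf(x, k) ** 2 / (n - k) for k in range(1, m + 1))

rng = np.random.default_rng(1)        # illustrative iid N(0,1) sample
x = rng.normal(0.0, 1.0, 500)
m = int(round(np.log(len(x))))        # m ≈ ln(n), the suggested choice
q_lb = ljung_box(x, m)                # portmanteau test on the data
q_ml = ljung_box(x ** 2, m)           # McLeod-Li: same test on the squared data
```

Under the iid hypothesis, both statistics are compared against the chi-square distribution with m degrees of freedom.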
As noted by Iwueze et al. [
Let Y t = X t d , d = 1 , 2 , 3 , ⋯ , where X t , t ∈ Z is the linear Gaussian white noise process. The mean [ E ( Y t ) = E ( X t d ) ] , variance [ var ( Y t ) = var ( X t d ) ] and autocovariances [ R Y ( k ) = cov ( Y t , Y t − k ) = cov ( X t d , X t − k d ) ] were obtained to be [
E ( Y t ) = E ( X t d ) = { σ 2 m ( 2 m − 1 ) ! ! , d = 2 m , m = 1 , 2 , ⋯ 0 , d = 2 m + 1 , m = 0 , 1 , 2 , ⋯ (1.10)
V a r ( Y t ) = V a r ( X t d ) = { σ 4 m [ ∏ k = 1 2 m ( 2 k − 1 ) − ( ∏ k = 1 m ( 2 k − 1 ) ) 2 ] , d = 2 m σ 2 ( 2 m + 1 ) ∏ k = 1 2 m + 1 ( 2 k − 1 ) , d = 2 m + 1 (1.11)
R Y ( k ) = { σ 4 m [ ∏ j = 1 2 m ( 2 j − 1 ) − ( ∏ j = 1 m ( 2 j − 1 ) ) 2 ] , d = 2 m , k = 0 σ 2 ( 2 m + 1 ) ∏ j = 1 2 m + 1 ( 2 j − 1 ) , d = 2 m + 1 , k = 0 0 , k ≠ 0 (1.12)
where
( 2 m − 1 ) ! ! = ∏ k = 1 m ( 2 k − 1 ) (1.13)
It is clear from (1.12) that when X t , t ∈ Z are iid, the powers Y t = X t d , d = 1 , 2 , 3 , ⋯ of X t , t ∈ Z are also iid. Iwueze et al. [ showed that when X t ~ N ( 0 , σ 2 ) , Y t = X t 2 ~ G ( α , β ) with α = 1 2 , β = 2 σ 2 .
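Since a G(α, β) variable has mean αβ and variance αβ², the parameters α = 1/2, β = 2σ² reproduce E(X t ²) = σ² and Var(X t ²) = 2σ⁴. A small check (a sketch assuming σ² = 1; the variable names are ours):

```python
import numpy as np

# Moments of G(alpha, beta): mean = alpha*beta, variance = alpha*beta^2
sigma2 = 1.0
alpha, beta = 0.5, 2.0 * sigma2
gamma_mean = alpha * beta            # = sigma^2
gamma_var = alpha * beta ** 2        # = 2*sigma^4

# Simulation check: Y = X^2 for X ~ N(0, sigma^2)
rng = np.random.default_rng(0)
y = rng.normal(0.0, np.sqrt(sigma2), 100_000) ** 2
sim_mean, sim_var = y.mean(), y.var()
```

The simulated mean and variance of X² should sit close to the gamma values σ² and 2σ⁴.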
Using the coefficients of skewness and kurtosis, Iwueze et al. [
β 1 = μ 3 ( d ) ( μ 2 ( d ) ) 3 / 2 (1.14)
β 2 = μ 4 ( d ) ( μ 2 ( d ) ) 2 (1.15)
where
μ 2 ( d ) = E [ ( X t d − E ( X t d ) ) 2 ] = var ( X t d ) (1.16)
μ 3 ( d ) = E [ ( X t d − E ( X t d ) ) 3 ] (1.17)
μ 4 ( d ) = E [ ( X t d − E ( X t d ) ) 4 ] (1.18)
d | Y t | E ( Y t ) = μ Y | μ 2 ( d ) = var ( Y t ) | μ 3 ( d ) | μ 4 ( d ) | β 1 | β 2
---|---|---|---|---|---|---|---
1 | X t | 0 | σ 2 | 0 | 3 σ 4 | 0 | 3.000
2 | X t 2 | σ 2 | 2 σ 4 | 8 σ 6 | 60 σ 8 | 2.828 | 15.000
3 | X t 3 | 0 | 15 σ 6 | 0 | 10395 σ 12 | 0 | 46.200
4 | X t 4 | 3 σ 4 | 96 σ 8 | 9504 σ 12 | 1907712 σ 16 | 10.104 | 207.00
5 | X t 5 | 0 | 945 σ 10 | 0 | 654729075 σ 20 | 0 | 733.159
6 | X t 6 | 15 σ 6 | 10170 σ 12 | 33998400 σ 18 | 3.142 × 10 11 σ 24 | 33.150 | 3037.836
Source: Iwueze et al. (2017).
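The tabulated moments follow from the even normal moments E ( X 2 m ) = σ 2 m ( 2 m − 1 ) ! ! . The sketch below (σ² = 1; the helper names are ours) recovers μ 2 ( d ) , μ 3 ( d ) , μ 4 ( d ) and the coefficients (1.14)-(1.15) by expanding the central moments in raw moments:

```python
from math import prod

def raw_moment(k):
    # E(X^k) for X ~ N(0, 1): (k-1)!! for even k, 0 for odd k
    return 0 if k % 2 else prod(range(1, k, 2))

def central_moments(d):
    # mu2(d), mu3(d), mu4(d) of Y = X^d via the binomial expansion of (Y - E(Y))^j
    m1, m2, m3, m4 = (raw_moment(j * d) for j in range(1, 5))
    mu2 = m2 - m1 ** 2
    mu3 = m3 - 3 * m2 * m1 + 2 * m1 ** 3
    mu4 = m4 - 4 * m3 * m1 + 6 * m2 * m1 ** 2 - 3 * m1 ** 4
    return mu2, mu3, mu4

def beta_coeffs(d):
    # Skewness beta1, Equation (1.14), and kurtosis beta2, Equation (1.15)
    mu2, mu3, mu4 = central_moments(d)
    return mu3 / mu2 ** 1.5, mu4 / mu2 ** 2
```

For d = 2 this yields (2, 8, 60) and (β1, β2) ≈ (2.828, 15), matching the table.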
Using the standard deviations when σ 2 = 1 and the kurtosis of Y t = X t d , d = 1 , 2 , 3 , ⋯ , Iwueze et al. [
The most commonly used white noise process is the linear Gaussian white noise process. It plays a central role in diagnostic checking, since the residuals of an adequately fitted model should behave as white noise. The linear Gaussian white noise process also serves as a basic building block in the construction of linear and non-linear time series models. However, a major problem is that many non-linear processes exhibit the same covariance structure (Equation (1.1) through Equation (1.5)) as the linear Gaussian white noise process. One such class of non-linear models is the bilinear model.
The study of bilinear models was introduced by Granger and Andersen [
X t = β X t − k e t − j + e t , k > j (1.19)
appear to be second-order white noise, where β is a constant and e t , t ∈ Z is a sequence of independent and identically distributed normal random variables with E ( e t ) = 0 , E ( e t 2 ) = σ 2 < ∞ . Guegan [
X t = β X t − 2 e t − 1 + e t (1.20)
Martins [
X t = ( ∑ j = 1 m β j X t − q − j ) e t − q + e t (1.21)
where e t , t ∈ Z is as defined in (1.19). Iwueze [
1) The series X t , t ∈ Z satisfying (1.21) is strictly stationary, ergodic and unique.
2) The series X t , t ∈ Z satisfying (1.21) is invertible.
3) The series X t , t ∈ Z satisfying (1.21) has the same covariance structure as the linear Gaussian white noise processes.
4) The covariance structure of (1.21) is given by
μ = E ( X t ) = 0 (1.22)
R ( k ) = { σ 2 1 − ∑ j = 1 m σ 2 β j 2 , k = 0 0 , otherwise (1.23)
5) The series satisfying (1.21) is invertible if
2 ∑ j = 1 m β j 2 σ 2 < 1 (1.24)
For the simple bilinear model (1.19), it follows that
R ( k ) = { σ 2 / ( 1 − σ 2 β 2 ) , k = 0 , σ 2 β 2 < 1 0 , otherwise (1.25)
and the invertibility condition is
σ 2 β 2 < 1 2 (1.26)
It is worth noting that the stationarity condition
σ 2 β 2 < 1 (1.27)
is independent of the structure pair ( k , j ) [
The objectives of this study are to:
1) Determine V a r ( X t d ) , d = 2 , 3 for the simple bilinear model (1.20).
2) Determine the covariance structure of X t d , d = 2 , 3 , when X t , t ∈ Z satisfies (1.20).
3) Determine for what values of β the simple bilinear white noise process will be identified as a Linear Gaussian white noise process.
4) Determine for what values of β the simple bilinear model will be normally distributed.
This paper is further divided into four sections in order to establish and achieve these goals. Section 2 derives the covariance structure of Y t = X t d , d = 1 , 2 , 3 when X t = β X t − 2 e t − 1 + e t , e t ~ i i d N ( 0 , σ 2 ) ; Section 3 presents the methodology; Section 4 gives the results and discussion; and Section 5 is the conclusion.
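For the numerical work that follows, model (1.20) is easy to simulate directly from its recursion (the paper used Minitab and Fortran 77; the Python sketch below, with function names of our choosing, discards a burn-in so the zero initial values are forgotten):

```python
import numpy as np

def simulate_sbwnp(beta, n, sigma=1.0, seed=0, burn=200):
    # X_t = beta * X_{t-2} * e_{t-1} + e_t, Equation (1.20)
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(2, n + burn):
        x[t] = beta * x[t - 2] * e[t - 1] + e[t]
    return x[burn:]                          # drop the burn-in segment

def is_stationary(beta, sigma=1.0):
    return sigma ** 2 * beta ** 2 < 1.0      # Equation (1.27)

def is_invertible(beta, sigma=1.0):
    return sigma ** 2 * beta ** 2 < 0.5      # Equation (1.26)
```

For β = 0.3 and σ² = 1 the theoretical variance σ²/(1 − σ²β²) ≈ 1.10, which the sample variance of a long simulated series should approach.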
Theorem 2.1.
Let e t , t ∈ Z be the linear Gaussian white noise process with E ( e t ) = 0 and E ( e t 2 ) = σ 2 < ∞ . Suppose there exists a stationary and invertible process X t , t ∈ Z satisfying X t = β X t − 2 e t − 1 + e t for every t ∈ Z for some constant β , then Y t = X t 2 has the following properties:
E ( Y t ) = μ Y = σ 2 1 − σ 2 β 2 ; σ 2 β 2 < 1 (2.1)
R Y ( k ) = cov ( Y t , Y t − k ) = { 2 σ 4 / [ ( 1 − σ 2 β 2 ) 2 ( 1 − 3 σ 4 β 4 ) ] , σ 2 β 2 < 1 / √ 3 , k = 0 2 σ 6 β 2 / ( 1 − σ 2 β 2 ) 2 , σ 2 β 2 < 1 , k = 1 σ 2 β 2 R Y ( k − 2 ) , k = 2 , 3 , ⋯ (2.2)
ρ Y ( k ) = R Y ( k ) / R Y ( 0 ) = { 1 , k = 0 σ 2 β 2 ( 1 − 3 σ 4 β 4 ) , k = 1 σ 2 β 2 ρ Y ( k − 2 ) , k = 2 , 3 , ⋯ (2.3)
Y t = X t 2 , t ∈ Z has the same covariance structure as the linear ARMA(2, 1) process
X t = λ + ϕ 1 X t − 1 + ϕ 2 X t − 2 + θ 1 a t − 1 + a t , ϕ 1 = 0 (2.4)
where a t is a sequence of independent and identically distributed random variables with E ( a t ) = 0 and V a r ( a t ) = σ 1 2 < ∞ .
Proof:
Let
Y t = X t 2 = ( β X t − 2 e t − 1 + e t ) 2 = β 2 X t − 2 2 e t − 1 2 + e t 2 + 2 β X t − 2 e t − 1 e t
E ( Y t ) = E ( X t 2 ) = β 2 E ( X t − 2 2 ) E ( e t − 1 2 ) + E ( e t 2 ) + 2 β E ( X t − 2 ) E ( e t − 1 ) E ( e t )
E ( Y t ) = E ( X t 2 ) = β 2 E ( X t 2 ) E ( e t 2 ) + E ( e t 2 ) = σ 2 β 2 E ( X t 2 ) + σ 2
( 1 − σ 2 β 2 ) E ( X t 2 ) = σ 2
μ Y = E ( X t 2 ) = σ 2 1 − σ 2 β 2 ; σ 2 β 2 < 1 (2.5)
V a r ( Y t ) = V a r ( X t 2 ) = E ( X t 4 ) − [ E ( X t 2 ) ] 2
X t 4 = β 4 X t − 2 4 e t − 1 4 + 4 β 3 X t − 2 3 e t − 1 3 e t + 6 β 2 X t − 2 2 e t − 1 2 e t 2 + 4 β X t − 2 e t − 1 e t 3 + e t 4
E ( X t 4 ) = 3 σ 4 β 4 E ( X t 4 ) + 6 σ 4 β 2 E ( X t 2 ) + 3 σ 4
( 1 − 3 σ 4 β 4 ) E ( X t 4 ) = 6 σ 6 β 2 1 − σ 2 β 2 + 3 σ 4
⇒ E ( X t 4 ) = 3 σ 4 ( 1 + σ 2 β 2 ) / [ ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ] , σ 4 β 4 < 1 / 3 (2.6)
Now,
V a r ( Y t ) = V a r ( X t 2 ) = E ( X t 4 ) − [ E ( X t 2 ) ] 2 = 3 σ 4 ( 1 + σ 2 β 2 ) ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) − ( σ 2 1 − σ 2 β 2 ) 2 = 3 σ 4 ( 1 + σ 2 β 2 ) ( 1 − σ 2 β 2 ) − σ 4 ( 1 − 3 σ 4 β 4 ) ( 1 − σ 2 β 2 ) 2 ( 1 − 3 σ 4 β 4 ) (2.7)
Hence,
R Y ( 0 ) = V a r ( Y t ) = V a r ( X t 2 ) = 2 σ 4 / [ ( 1 − σ 2 β 2 ) 2 ( 1 − 3 σ 4 β 4 ) ] , σ 2 β 2 < 1 / √ 3 (2.8)
R Y ( k ) = E [ Y t Y t − k ] − μ y 2 = E [ X t 2 X t − k 2 ] − μ y 2 , k = 0 , 1 , 2 , ⋯
Y t Y t − 1 = X t 2 X t − 1 2 = β 2 X t − 2 2 X t − 1 2 e t − 1 2 + 2 β X t − 2 X t − 1 2 e t − 1 e t + X t − 1 2 e t 2
E [ Y t Y t − 1 ] = β 2 E [ X t − 2 2 X t − 1 2 e t − 1 2 ] + σ 2 E ( X t − 1 2 )
E [ Y t Y t − 1 ] = β 2 E [ X t − 1 2 X t 2 e t 2 ] + σ 2 E ( X t 2 )
X t − 1 2 X t 2 e t 2 = X t − 1 2 ( β 2 X t − 2 2 e t − 1 2 + 2 β X t − 2 e t − 1 e t + e t 2 ) e t 2
X t − 1 2 X t 2 e t 2 = β 2 X t − 2 2 X t − 1 2 e t − 1 2 e t 2 + 2 β X t − 2 X t − 1 2 e t − 1 e t 3 + X t − 1 2 e t 4
By the assumption of stationarity,
E [ X t − 1 2 X t 2 e t 2 ] = σ 2 β 2 E [ X t − 1 2 X t 2 e t 2 ] + 3 σ 4 E ( X t 2 )
( 1 − σ 2 β 2 ) E [ X t − 1 2 X t 2 e t 2 ] = 3 σ 4 ( σ 2 1 − σ 2 β 2 )
E [ X t − 1 2 X t 2 e t 2 ] = 3 σ 6 ( 1 − σ 2 β 2 ) 2 , σ 2 β 2 < 1 (2.9)
E [ Y t Y t − 1 ] = β 2 [ 3 σ 6 ( 1 − σ 2 β 2 ) 2 ] + σ 2 ( σ 2 1 − σ 2 β 2 ) = σ 4 ( 1 + 2 σ 2 β 2 ) ( 1 − σ 2 β 2 ) 2 (2.10)
Hence,
R y ( 1 ) = E ( Y t Y t − 1 ) − E 2 ( Y t ) = σ 4 ( 1 + 2 σ 2 β 2 ) / ( 1 − σ 2 β 2 ) 2 − ( σ 2 / ( 1 − σ 2 β 2 ) ) 2 = 2 σ 6 β 2 / ( 1 − σ 2 β 2 ) 2 (2.11)
Y t Y t − 2 = X t 2 X t − 2 2 = ( β 2 X t − 2 2 e t − 1 2 + 2 β X t − 2 e t − 1 e t + e t 2 ) X t − 2 2
Y t Y t − 2 = β 2 X t − 2 4 e t − 1 2 + 2 β X t − 2 3 e t − 1 e t + X t − 2 2 e t 2
E [ Y t Y t − 2 ] = σ 2 β 2 E ( X t − 2 4 ) + σ 2 E ( X t − 2 2 )
E [ Y t Y t − 2 ] = σ 2 β 2 E ( Y t − 2 2 ) + σ 2 E ( Y t )
⇒ E [ Y t Y t − 2 ] = σ 2 β 2 E ( Y t 2 ) + σ 2 μ y
R y ( 2 ) + μ y 2 = σ 2 β 2 [ R y ( 0 ) + μ y 2 ] + σ 2 μ y (2.12)
R y ( 2 ) = σ 2 β 2 R y ( 0 ) + σ 2 β 2 μ y 2 + σ 2 μ y − μ y 2 = σ 2 β 2 R y ( 0 ) + σ 2 μ y − μ y 2 ( 1 − σ 2 β 2 )
Note that
μ Y = E ( Y t ) = E ( X t 2 ) = σ 2 1 − σ 2 β 2
⇒ ( 1 − σ 2 β 2 ) μ Y = σ 2
1 − σ 2 β 2 = σ 2 μ Y (2.13)
Hence
R Y ( 2 ) = σ 2 β 2 R y ( 0 ) + σ 2 μ y − μ y 2 ( σ 2 μ y ) = σ 2 β 2 R y ( 0 ) + σ 2 μ y − σ 2 μ y = σ 2 β 2 R y ( 0 ) (2.14)
We have shown that
σ 2 β 2 μ y 2 + σ 2 μ y − μ y 2 = 0 (2.15)
Similarly;
Y t Y t − 3 = X t 2 X t − 3 2 = ( β 2 X t − 2 2 e t − 1 2 + 2 β X t − 2 e t − 1 e t + e t 2 ) X t − 3 2
Y t Y t − 3 = β 2 X t − 3 2 X t − 2 2 e t − 1 2 + 2 β X t − 3 2 X t − 2 e t − 1 e t + X t − 3 2 e t 2
E [ Y t Y t − 3 ] = σ 2 β 2 E [ X t − 2 2 X t − 1 2 ] + σ 2 E ( X t 2 ) = σ 2 β 2 E [ Y t Y t − 1 ] + σ 2 E ( Y t )
⇒ R y ( 3 ) + μ y 2 = σ 2 β 2 [ R y ( 1 ) + μ y 2 ] + σ 2 μ y ⇒ R y ( 3 ) = σ 2 β 2 R y ( 1 ) + σ 2 β 2 μ y 2 + σ 2 μ y − μ y 2 = σ 2 β 2 R y ( 1 ) (2.16)
Generally;
R Y ( k ) = σ 2 β 2 R Y ( k − 2 ) , k = 2 , 3 , ⋯ (2.17)
Hence,
R Y ( k ) = { 2 σ 4 / [ ( 1 − σ 2 β 2 ) 2 ( 1 − 3 σ 4 β 4 ) ] , σ 2 β 2 < 1 / √ 3 , k = 0 2 σ 6 β 2 / ( 1 − σ 2 β 2 ) 2 , σ 2 β 2 < 1 , k = 1 σ 2 β 2 R Y ( k − 2 ) , k = 2 , 3 , ⋯ (2.18)
and
ρ Y ( k ) = { 1 , k = 0 σ 2 β 2 ( 1 − 3 σ 4 β 4 ) , k = 1 σ 2 β 2 ρ Y ( k − 2 ) , k = 2 , 3 , ⋯ (2.19)
With this result, it is clear that when X t , t ∈ Z is defined by (1.20), Y t = X t 2 has the same covariance structure as the linear ARMA(2, 1) process. Its linear equivalence is
Y t = λ + ϕ 1 Y t − 1 + ϕ 2 Y t − 2 + θ 1 a t − 1 + a t , ϕ 1 = 0 (2.20)
where a t is the purely random process with E ( a t ) = 0 and V a r ( a t ) = σ 1 2 < ∞ .
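To make the ARMA(2, 1) equivalence concrete: ϕ2 = σ²β² follows from the recursion R Y ( k ) = σ 2 β 2 R Y ( k − 2 ) , and θ1 can then be chosen so that the lag-1 autocorrelations agree. A hedged sketch (the quadratic root selection and the function names are ours):

```python
from math import sqrt

def rho_y(k, beta, sigma=1.0):
    # Autocorrelation of Y_t = X_t^2, Equation (2.19); needs sigma^2*beta^2 < 1/sqrt(3)
    c = sigma ** 2 * beta ** 2
    if k == 0:
        return 1.0
    if k == 1:
        return c * (1.0 - 3.0 * c ** 2)
    return c * rho_y(k - 2, beta, sigma)

def arma21_match(beta, sigma=1.0):
    # Set phi2 = sigma^2*beta^2 and solve rho1 = theta1*(1 + phi2)/(1 + theta1^2)
    # for theta1; the roots multiply to 1, so the smaller one is the invertible root
    phi2 = sigma ** 2 * beta ** 2
    rho1 = rho_y(1, beta, sigma)
    if rho1 == 0.0:
        return phi2, 0.0
    disc = (1.0 + phi2) ** 2 - 4.0 * rho1 ** 2
    theta1 = ((1.0 + phi2) - sqrt(disc)) / (2.0 * rho1)
    return phi2, theta1
```

For example, β = 0.5 and σ² = 1 give ϕ2 = 0.25 and ρ Y ( 1 ) = 0.203125, with a θ1 of about 0.17.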
Theorem 2.2.
Let e t , t ∈ Z be the linear Gaussian white noise process with E ( e t ) = 0 and E ( e t 2 ) = σ 2 < ∞ . Suppose there exists a stationary and invertible process X t , t ∈ Z satisfying X t = β X t − 2 e t − 1 + e t for every t ∈ Z and some constant β , then the mean and variance of Y t = X t 3 , t ∈ Z are
E ( Y t ) = μ Y = 0 (2.21)
Properties | Bilinear | Linear ARMA(2, 1)
---|---|---
Structure | X t = β X t − 2 e t − 1 + e t , e t ~ N ( 0 , σ 2 ) , Y t = X t 2 ~ ARMA ( 2 , 1 ) with ϕ 1 = 0 | Y t = λ + ϕ 2 Y t − 2 + θ 1 a t − 1 + a t , E ( a t ) = 0 , V a r ( a t ) = σ 1 2 |
Mean | μ Y = E ( Y t ) = E ( X t 2 ) = σ 2 1 − σ 2 β 2 ; σ 2 β 2 < 1 | μ Y = E ( Y t ) = λ 1 − ϕ 2 , [ λ = ( 1 − ϕ 2 ) μ X ] |
Autocovariance | R Y ( k ) = { 2 σ 4 / [ ( 1 − σ 2 β 2 ) 2 ( 1 − 3 σ 4 β 4 ) ] , σ 2 β 2 < 1 / √ 3 , k = 0 2 σ 6 β 2 / ( 1 − σ 2 β 2 ) 2 , σ 2 β 2 < 1 , k = 1 σ 2 β 2 R Y ( k − 2 ) , k = 2 , 3 , ⋯ | R Y ( k ) = { σ 1 2 ( 1 + θ 1 2 ) / ( 1 − ϕ 2 2 ) , | ϕ 2 | < 1 , k = 0 σ 1 2 θ 1 / ( 1 − ϕ 2 ) , ϕ 2 ≠ 1 , k = 1 ϕ 2 R Y ( k − 2 ) , k = 2 , 3 , ⋯
Autocorrelation | ρ Y ( k ) = { 1 , k = 0 σ 2 β 2 ( 1 − 3 σ 4 β 4 ) , k = 1 σ 2 β 2 ρ Y ( k − 2 ) , k = 2 , 3 , ⋯ | ρ Y ( k ) = { 1 , k = 0 θ 1 ( 1 + ϕ 2 ) 1 + θ 1 2 , k = 1 ϕ 2 ρ Y ( k − 2 ) , k = 2 , 3 , ⋯ |
R Y ( k ) = { 15 σ 6 ( 1 + 2 σ 2 β 2 + 6 σ 4 β 4 + 3 σ 6 β 6 ) / [ ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ( 1 − 15 σ 6 β 6 ) ] , σ 2 β 2 < 1 / ∛ 15 , k = 0 0 , k ≠ 0 (2.22)
ρ Y ( k ) = { 1 , k = 0 0 , k ≠ 0 (2.23)
The covariance structure of Y t = X t 3 , t ∈ Z is that of the linear white noise process.
Proof:
Let Y t = X t 3 = ( β X t − 2 e t − 1 + e t ) 3 = β 3 X t − 2 3 e t − 1 3 + 3 β 2 X t − 2 2 e t − 1 2 e t + 3 β X t − 2 e t − 1 e t 2 + e t 3 (2.24)
E ( Y t ) = E ( X t 3 ) = μ y = β 3 E ( X t − 2 3 e t − 1 3 ) + 3 σ 2 β E ( X t − 2 e t − 1 ) = β 3 E ( X t − 1 3 e t 3 ) + 3 σ 2 β E ( X t − 1 e t ) = 0 (2.25)
Y t 2 = X t 6 = ( β X t − 2 e t − 1 + e t ) 6 = β 6 X t − 2 6 e t − 1 6 + 6 β 5 X t − 2 5 e t − 1 5 e t + 15 β 4 X t − 2 4 e t − 1 4 e t 2 + 20 β 3 X t − 2 3 e t − 1 3 e t 3 + 15 β 2 X t − 2 2 e t − 1 2 e t 4 + 6 β X t − 2 e t − 1 e t 5 + e t 6 (2.26)
E ( Y t 2 ) = β 6 E ( X t − 2 6 e t − 1 6 ) + 6 β 5 E ( X t − 2 5 e t − 1 5 e t ) + 15 β 4 E ( X t − 2 4 e t − 1 4 e t 2 ) + 20 β 3 E ( X t − 2 3 e t − 1 3 e t 3 ) + 15 β 2 E ( X t − 2 2 e t − 1 2 e t 4 ) + 6 β E ( X t − 2 e t − 1 e t 5 ) + E ( e t 6 ) = β 6 E ( X t − 2 6 e t − 1 6 ) + 15 σ 2 β 4 E ( X t − 2 4 e t − 1 4 ) + 45 σ 4 β 2 E ( X t − 2 2 e t − 1 2 ) + 15 σ 6 = 15 σ 6 β 6 E ( X t 6 ) + 45 σ 6 β 4 E ( X t 4 ) + 45 σ 6 β 2 E ( X t 2 ) + 15 σ 6 = 15 σ 6 β 6 E ( Y t 2 ) + 45 σ 6 β 4 [ 3 σ 4 ( 1 + σ 2 β 2 ) ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ] + 45 σ 6 β 2 ( σ 2 1 − σ 2 β 2 ) + 15 σ 6
( 1 − 15 σ 6 β 6 ) E ( Y t 2 ) = 45 σ 6 β 4 [ 3 σ 4 ( 1 + σ 2 β 2 ) ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ] + 45 σ 6 β 2 ( σ 2 1 − σ 2 β 2 ) + 15 σ 6 = 1 ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) [ 45 σ 6 β 4 [ 3 σ 4 ( 1 + σ 2 β 2 ) ] + 45 σ 6 β 2 [ σ 2 ( 1 − 3 σ 4 β 4 ) ] + 15 σ 6 ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ]
= 1 / [ ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ] [ 135 σ 10 β 4 + 135 σ 12 β 6 + 45 σ 8 β 2 − 135 σ 12 β 6 + 15 σ 6 − 45 σ 10 β 4 − 15 σ 8 β 2 + 45 σ 12 β 6 ] = 1 / [ ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ] [ 90 σ 10 β 4 + 30 σ 8 β 2 + 15 σ 6 + 45 σ 12 β 6 ] = 15 σ 6 ( 1 + 2 σ 2 β 2 + 6 σ 4 β 4 + 3 σ 6 β 6 ) / [ ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ] , σ 2 β 2 < 1 / ∛ 15 (2.27)
∴ E ( Y t 2 ) = R y ( 0 ) + μ y 2 (2.28)
⇒ V a r ( Y t ) = V a r ( X t 3 ) = R y ( 0 ) = E ( Y t 2 ) − μ y 2 = 15 σ 6 ( 1 + 2 σ 2 β 2 + 6 σ 4 β 4 + 3 σ 6 β 6 ) / [ ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ( 1 − 15 σ 6 β 6 ) ] , σ 2 β 2 < 1 / ∛ 15 (2.29)
Some Results
E ( X t − 1 X t e t ) = σ 2 E ( X t ) = 0
Proof:
X t − 1 X t e t = X t − 1 [ β X t − 2 e t − 1 + e t ] e t = β X t − 2 X t − 1 e t − 1 e t + X t − 1 e t 2
E ( X t − 1 X t e t ) = σ 2 E ( X t − 1 ) = σ 2 E ( X t ) = 0
E ( X t − 1 X t 2 e t ) = 2 σ 2 β E ( X t − 1 X t e t ) = 0
Proof:
X t − 1 X t 2 e t = X t − 1 [ β 2 X t − 2 2 e t − 1 2 + 2 β X t − 2 e t − 1 e t + e t 2 ] e t = β 2 X t − 2 2 X t − 1 e t − 1 2 e t + 2 β X t − 2 X t − 1 e t − 1 e t 2 + e t 3
E ( X t − 1 X t 2 e t ) = 2 β σ 2 E ( X t − 2 X t − 1 e t − 1 ) = 2 β σ 2 E ( X t − 1 X t e t ) = 0
E ( X t − 1 2 X t e t 2 ) = σ 2 β E ( X t − 1 X t 2 e t ) = 0
Proof:
X t − 1 2 X t e t 2 = X t − 1 2 [ β X t − 2 e t − 1 + e t ] e t 2 = β X t − 2 X t − 1 2 e t − 1 e t 2 + X t − 1 2 e t 3
E ( X t − 1 2 X t e t 2 ) = σ 2 β E ( X t − 2 X t − 1 2 e t − 1 ) = σ 2 β E ( X t − 1 X t 2 e t ) = 0
E ( X t − 1 X t 3 e t ) = 3 σ 2 β 2 E ( X t − 1 2 X t e t 2 ) = 0
Proof:
X t − 1 X t 3 e t = X t − 1 [ β 3 X t − 2 3 e t − 1 3 + 3 β 2 X t − 2 2 e t − 1 2 e t + 3 β X t − 2 e t − 1 e t 2 + e t 3 ] e t = β 3 X t − 2 3 X t − 1 e t − 1 3 e t + 3 β 2 X t − 2 2 X t − 1 e t − 1 2 e t 2 + 3 β X t − 2 X t − 1 e t − 1 e t 3 + X t − 1 e t 4
E ( X t − 1 X t 3 e t ) = 3 σ 2 β 2 E ( X t − 2 2 X t − 1 e t − 1 2 ) = 3 σ 2 β 2 E ( X t − 1 2 X t e t 2 ) = 0
Now
Y t Y t − 1 = X t 3 X t − 1 3 = [ β 3 X t − 2 3 e t − 1 3 + 3 β 2 X t − 2 2 e t − 1 2 e t + 3 β X t − 2 e t − 1 e t 2 + e t 3 ] X t − 1 3 = β 3 X t − 2 3 X t − 1 3 e t − 1 3 + 3 β 2 X t − 2 2 X t − 1 3 e t − 1 2 e t + 3 β X t − 2 X t − 1 3 e t − 1 e t 2 + X t − 1 3 e t 3
E ( Y t Y t − 1 ) = β 3 E ( X t − 2 3 X t − 1 3 e t − 1 3 ) + 3 σ 2 β E ( X t − 2 X t − 1 3 e t − 1 ) = β 3 E ( X t − 1 3 X t 3 e t 3 ) + 3 σ 2 β E ( X t − 1 X t 3 e t ) = β 3 E ( X t − 1 3 X t 3 e t 3 )
X t − 1 3 X t 3 e t 3 = X t − 1 3 [ β 3 X t − 2 3 e t − 1 3 + 3 β 2 X t − 2 2 e t − 1 2 e t + 3 β X t − 2 e t − 1 e t 2 + e t 3 ] e t 3 = β 3 X t − 2 3 X t − 1 3 e t − 1 3 e t 3 + 3 β 2 X t − 2 2 X t − 1 3 e t − 1 2 e t 4 + 3 β X t − 2 X t − 1 3 e t − 1 e t 5 + X t − 1 3 e t 6
E ( X t − 1 3 X t 3 e t 3 ) = 3 β 2 ( 3 σ 4 ) E ( X t − 2 2 X t − 1 3 e t − 1 2 ) = 9 σ 4 β 2 E ( X t − 2 2 X t − 1 3 e t − 1 2 ) = 9 σ 4 β 2 E ( X t − 1 2 X t 3 e t 2 )
Hence,
E ( Y t Y t − 1 ) = β 3 [ 9 σ 4 β 2 E ( X t − 1 2 X t 3 e t 2 ) ] = 9 σ 4 β 5 E ( X t − 1 2 X t 3 e t 2 )
Now
X t − 1 2 X t 3 e t 2 = X t − 1 2 [ β 3 X t − 2 3 e t − 1 3 + 3 β 2 X t − 2 2 e t − 1 2 e t + 3 β X t − 2 e t − 1 e t 2 + e t 3 ] e t 2 = β 3 X t − 2 3 X t − 1 2 e t − 1 3 e t 2 + 3 β 2 X t − 2 2 X t − 1 2 e t − 1 2 e t 3 + 3 β X t − 2 X t − 1 2 e t − 1 e t 4 + X t − 1 2 e t 5
E ( X t − 1 2 X t 3 e t 2 ) = σ 2 β 3 E ( X t − 2 3 X t − 1 2 e t − 1 3 ) + 3 β ( 3 σ 4 ) E ( X t − 2 X t − 1 2 e t − 1 ) = σ 2 β 3 E ( X t − 1 3 X t 2 e t 3 ) + 9 σ 4 β E ( X t − 1 X t 2 e t ) = σ 2 β 3 E ( X t − 1 3 X t 2 e t 3 )
But,
E ( Y t Y t − 1 ) = 9 σ 4 β 5 E ( X t − 1 2 X t 3 e t 2 ) = 9 σ 4 β 5 ( σ 2 β 3 E ( X t − 1 3 X t 2 e t 3 ) ) = 9 σ 6 β 8 E ( X t − 1 3 X t 2 e t 3 )
Now,
X t − 1 3 X t 2 e t 3 = X t − 1 3 [ β 2 X t − 2 2 e t − 1 2 + 2 β X t − 2 e t − 1 e t + e t 2 ] e t 3 = β 2 X t − 2 2 X t − 1 3 e t − 1 2 e t 3 + 2 β X t − 2 X t − 1 3 e t − 1 e t 4 + X t − 1 3 e t 5
E ( X t − 1 3 X t 2 e t 3 ) = 2 β ( 3 σ 4 ) E ( X t − 2 X t − 1 3 e t − 1 ) = 6 σ 4 β E ( X t − 1 X t 3 e t ) = 0
Hence,
E ( Y t Y t − 1 ) = 9 σ 6 β 8 [ 6 σ 4 β E ( X t − 1 X t 3 e t ) ] = 54 σ 10 β 9 E ( X t − 1 X t 3 e t ) = 0
⇒ R Y ( 1 ) = 0 , when Y t = X t 3 .
Y t Y t − 2 = X t 3 X t − 2 3 = [ β 3 X t − 2 3 e t − 1 3 + 3 β 2 X t − 2 2 e t − 1 2 e t + 3 β X t − 2 e t − 1 e t 2 + e t 3 ] X t − 2 3 = β 3 X t − 2 6 e t − 1 3 + 3 β 2 X t − 2 5 e t − 1 2 e t + 3 β X t − 2 4 e t − 1 e t 2 + X t − 2 3 e t 3
E ( Y t Y t − 2 ) = 0
⇒ R Y ( 2 ) = 0 , when Y t = X t 3 .
Generally, R Y ( k ) = 0 ∀ k ≠ 0 , when Y t = X t 3 .
Therefore, given X t = β X t − 2 e t − 1 + e t , e t ~ N ( 0 , σ 2 ) and Y t = X t 3 , the following are true: E ( Y t ) = E ( X t 3 ) = 0 ,
R Y ( k ) = { 15 σ 6 ( 1 + 2 σ 2 β 2 + 6 σ 4 β 4 + 3 σ 6 β 6 ) / [ ( 1 − σ 2 β 2 ) ( 1 − 3 σ 4 β 4 ) ( 1 − 15 σ 6 β 6 ) ] , σ 2 β 2 < 1 / ∛ 15 , k = 0 0 , k ≠ 0
ρ Y ( k ) = { 1 , k = 0 0 , k ≠ 0
The covariance structure of Y t = X t 3 , t ∈ Z identifies the process as linear white noise.
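Equation (2.29) can be evaluated directly; at β = 0 it reduces to the Gaussian value Var ( X t 3 ) = 15 σ 6 from the table of moments, which gives a convenient sanity check. A sketch (the function name is ours):

```python
def var_cube(beta, sigma=1.0):
    # Var(X_t^3) for the SBWNP (1.20), per Equation (2.29);
    # requires sigma^2 * beta^2 < 15^(-1/3) for the sixth moment to exist
    c = sigma ** 2 * beta ** 2
    assert c < 15.0 ** (-1.0 / 3.0), "moment condition violated"
    num = 15.0 * sigma ** 6 * (1.0 + 2.0 * c + 6.0 * c ** 2 + 3.0 * c ** 3)
    den = (1.0 - c) * (1.0 - 3.0 * c ** 2) * (1.0 - 15.0 * c ** 3)
    return num / den
```

For any β ≠ 0 the variance exceeds 15σ⁶, which is what the chi-square comparison in Section 3 exploits.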
The Jarque-Bera (JB) test [
JB = n ( γ ^ 1 2 6 + ( γ ^ 2 − 3 ) 2 24 ) (3.1)
where
γ ^ 1 = 1 n ∑ t = 1 n ( X t − X ¯ ) 3 ( 1 n ∑ t = 1 n ( X t − X ¯ ) 2 ) 3 / 2 (3.2)
γ ^ 2 = 1 n ∑ t = 1 n ( X t − X ¯ ) 4 ( 1 n ∑ t = 1 n ( X t − X ¯ ) 2 ) 2 (3.3)
n is the sample size while, γ ^ 1 and γ ^ 2 are the sample skewness and kurtosis coefficients. The asymptotic null distribution of JB is χ 2 with 2 degrees of freedom.
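A direct transcription of (3.1)-(3.3) in code (a sketch; the sample moments use the 1/n convention exactly as in the definitions above):

```python
import numpy as np

def jarque_bera(x):
    # Jarque-Bera statistic, Equations (3.1)-(3.3)
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - x.mean()
    m2 = np.mean(d ** 2)
    g1 = np.mean(d ** 3) / m2 ** 1.5     # sample skewness, Equation (3.2)
    g2 = np.mean(d ** 4) / m2 ** 2       # sample kurtosis, Equation (3.3)
    return n * (g1 ** 2 / 6.0 + (g2 - 3.0) ** 2 / 24.0)
```

The observed value is then compared against the chi-square distribution with 2 degrees of freedom.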
The modified Ljung-Box test statistic [
Q * ( m ) = n ( n + 2 ) ∑ k = 1 m ( [ ρ ^ X d ( k ) ] 2 n − k ) (3.4)
is used to test the iid hypothesis for X t d , d = 1 , 2 , 3 for the simple bilinear model (1.20). It is important to note from Theorem 2.1 that X t 2 has an ARMA(2, 1) structure while, from Theorem 2.2, X t 3 is iid. This test will look for β values where both X t 2 and X t 3 are jointly iid. That will help determine the values of β for which the simple bilinear model (1.20) is not distinguishable from the linear Gaussian white noise process (LGWNP). Then, the hypothesis of iid data is rejected at level α if the observed Q * ( m ) is larger than the 1 − α / 2 quantile of the χ 2 ( m ) distribution, where m ≈ ln ( n ) [
From Theorem 2.2, the third power of the simple bilinear process is iid. A test is needed to confirm that the simple bilinear process (1.20) is not a linear Gaussian white noise process (LGWNP). For the LGWNP X t , t ∈ Z ; E ( X t ) = μ , var ( X t ) = σ 2 < ∞ and var ( X t 3 ) = 15 σ 6 . To show that the simple bilinear process (1.20) is not a LGWNP, we need to test the hypothesis
H 0 : σ X t 3 2 = 15 σ X t 6 (3.5)
against the alternative hypothesis
H 1 : σ X t 3 2 ≠ 15 σ X t 6 (3.6)
The chi-square test [
χ c a l 2 = ( n − 1 ) S X t 3 2 / ( 15 σ ^ X t 6 ) (3.7)
where S X t 3 2 is the sample variance of X t 3 for X t , t ∈ Z following (1.20), σ ^ X t 2 is an estimate of the true variance of the simple bilinear process (1.20) and n is the number of observations of the series. The null hypothesis is rejected at level α if the observed value of χ c a l 2 is larger than the 1 − α / 2 quantile of the chi-square distribution with n − 1 degrees of freedom. It should be noted that this test works well when the underlying population X t , t ∈ Z is normally distributed.
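The statistic (3.7) in code (a sketch; `sigma2_hat` denotes the user-supplied estimate σ ^ X t 2 of the process variance):

```python
import numpy as np

def chi2_cal(x, sigma2_hat):
    # Equation (3.7): (n-1) * S^2_{X^3} / (15 * sigma_hat^6), where S^2_{X^3}
    # is the sample variance of X_t^3; under H0 (LGWNP) this is approximately
    # chi-square with n-1 degrees of freedom
    y = np.asarray(x, dtype=float) ** 3
    n = len(y)
    s2 = np.var(y, ddof=1)               # sample variance of the cubes
    return (n - 1) * s2 / (15.0 * sigma2_hat ** 3)
```

Large values of the statistic indicate that Var(X t ³) exceeds the Gaussian value 15σ⁶, pointing away from the LGWNP.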
One thousand random deviates e t , t ∈ Z satisfying e t ~ N ( 0 , 1 ) were simulated using Minitab 16 series software. Only one realization, shown in Appendix I, was used for demonstration in the study because of space constraints. The estimates of the descriptive statistics (mean, variance, skewness ( γ 1 ) and kurtosis ( γ 2 )) and other tests (Jarque-Bera (JB) test, modified Ljung-Box test (Q*) and chi-square calculated test statistic) for the powers e t d , d = 1 , 2 , 3 of the simulated series are shown in
The LGWNP was used to simulate the SBWNP X t = β X t − 2 e t − 1 + e t , e t ~ N ( 0 , 1 ) for − 0.60 ≤ β ≤ 0.60 , satisfying the existence of E ( X t 3 ) , using a Fortran 77 program. The estimates of the descriptive statistics and of the test statistics (JB, Q* and the chi-square calculated test statistic) are shown in
We have been able to establish the covariance structure for X t d , d = 1 , 2 , 3 ; t ∈ Z
Statistic | Mean | Median | S 2 | γ 1 | γ 2 | JB value | Q* | ( n − 1 ) S X t 2 2 / 2 σ ^ 0 4 | ( n − 1 ) S X t 3 2 / 15 σ ^ 0 6
---|---|---|---|---|---|---|---|---|---
X t | 0.0000 | 0.1261 | 1.0000 | −0.28 | −0.04 | 1.87 | 3.36 | - | - |
X t 2 | 0.9931 | 0.4763 | 1.9074 | 1.90 | 2.79 | 133.19 | 0.04 | 136.38 | - |
X t 3 | −0.2728 | 0.0020 | 11.5236 | −0.61 | 6.47 | 259.67 | −0.14 | - | 109.86 |
β | Statistic | Mean | Variance | γ 1 | γ 2 | JB value | Q* | Estimate of Test Statistic ( H 0 )
---|---|---|---|---|---|---|---|---
−0.60 | X t | 0.0418 | 1.9037 | 0.27 | 1.20 | 10.28 | 8.44 | - | |
X t 2 | 1.8923 | 11.3331 | 3.09 | 11.85 | 1072.25 | 71.39 | - | ||
X t 3 | 0.9233 | 186.5203 | 3.78 | 26.73 | 4628.20 | 22.66 | 257.74 | ||
−0.59 | X t | 0.0410 | 1.8610 | 0.24 | 1.11 | 8.78 | 8.16 | - |
X t 2 | 1.8490 | 10.5110 | 2.99 | 11.09 | 952.49 | 70.34 | - | ||
X t 3 | 0.8100 | 164.3700 | 3.61 | 25.83 | 4315.90 | 20.32 | 243.12 | ||
−0.58 | X t | 0.0390 | 1.8200 | 0.21 | 1.02 | 7.30 | 7.89 | - |
X t 2 | 1.8090 | 9.7720 | 2.89 | 10.39 | 848.16 | 69.04 | - | ||
X t 3 | 0.7100 | 145.5500 | 3.44 | 24.98 | 4028.01 | 18.02 | 230.17 | ||
−0.57 | X t | 0.0380 | 1.7800 | 0.18 | 0.94 | 6.08 | 7.63 | - |
X t 2 | 1.7700 | 9.1080 | 2.80 | 9.75 | 758.53 | 67.49 | - | ||
X t 3 | 0.6220 | 129.4920 | 3.29 | 24.13 | 3753.32 | 15.84 | 218.89 | ||
−0.56 | X t | 0.0370 | 1.7430 | 0.15 | 0.87 | 5.12 | 7.39 | - |
X t 2 | 1.7320 | 8.5110 | 2.72 | 9.16 | 680.73 | 65.73 | - | ||
X t 3 | 0.5390 | 115.7480 | 3.14 | 23.25 | 3479.84 | 13.84 | 208.38 | ||
−0.55 | X t | 0.0360 | 1.7080 | 0.13 | 0.80 | 4.29 | 7.15 | - |
X t 2 | 1.6970 | 7.9730 | 2.64 | 8.61 | 612.26 | 63.76 | - | ||
X t 3 | 0.4630 | 103.9380 | 2.99 | 22.32 | 3203.09 | 12.04 | 198.86 | ||
−0.54 | X t | 0.0346 | 1.6739 | 0.10 | 0.74 | 3.58 | 6.93 | - | |
X t 2 | 1.6634 | 7.4872 | 2.57 | 8.09 | 550.71 | 61.63 | - | ||
X t 3 | 0.3948 | 93.7500 | 2.84 | 21.33 | 2921.69 | 10.48 | 190.56 | ||
−0.53 | X t | 0.0334 | 1.6416 | 0.08 | 0.69 | 3.00 | 6.73 | - | |
X t 2 | 1.6313 | 7.0486 | 2.50 | 7.59 | 495.95 | 59.36 | - | ||
X t 3 | 0.3325 | 84.9253 | 2.69 | 20.27 | 2637.97 | 9.18 | 183.01 | ||
−0.52 | X t | 0.0322 | 1.6108 | 0.06 | 0.64 | 2.52 | 6.54 | - | |
X t 2 | 1.6006 | 6.6518 | 2.44 | 7.12 | 446.71 | 56.99 | - | ||
X t 3 | 0.2759 | 77.2508 | 2.54 | 19.16 | 2356.47 | 8.12 | 176.21 | ||
−0.51 | X t | 0.0310 | 1.5814 | 0.04 | 0.59 | 2.12 | 6.36 | - | |
X t 2 | 1.5714 | 6.2921 | 2.38 | 6.66 | 402.32 | 54.54 | - | ||
X t 3 | 0.2246 | 70.5498 | 2.39 | 18.01 | 2082.80 | 7.31 | 170.07 | ||
−0.50 | X t | 0.0299 | 1.5534 | 0.02 | 0.55 | 1.79 | 6.20 | - |
X t 2 | 1.5435 | 5.9656 | 2.32 | 6.23 | 362.29 | 52.04 | - | |
X t 3 | 0.1781 | 64.6758 | 2.25 | 16.84 | 1822.60 | 6.72 | 164.49 | |
−0.40 | X t | 0.0188 | 1.3345 | −0.12 | 0.26 | 0.74 | 5.22 | - |
X t 2 | 1.3256 | 3.8995 | 1.92 | 3.05 | 144.03 | 29.34 | - | |
X t 3 | −0.1026 | 32.3961 | 1.06 | 7.97 | 408.18 | 7.16 | 129.95 | |
−0.30 | X t | 0.0096 | 1.1946 | −0.19 | 0.12 | 0.95 | 4.92 | - |
X t 2 | 1.1864 | 2.9329 | 1.77 | 2.17 | 103.47 | 15.82 | - | |
X t 3 | −0.2082 | 20.8415 | 0.56 | 6.08 | 229.18 | 8.31 | 116.55 | |
−0.29 | X t | 0.0088 | 1.1836 | −0.19 | 0.11 | 0.98 | 4.90 | - |
X t 2 | 1.1755 | 2.8659 | 1.77 | 2.16 | 103.03 | 14.92 | - | |
X t 3 | −0.2143 | 20.1484 | 0.53 | 6.07 | 227.89 | 8.32 | 115.84 | |
−0.28 | X t | 0.0080 | 1.1731 | −0.20 | 0.10 | 1.01 | 4.88 | - |
X t 2 | 1.1650 | 2.8026 | 1.77 | 2.16 | 102.93 | 14.09 | - | |
X t 3 | −0.2199 | 19.5079 | 0.50 | 6.08 | 227.54 | 8.31 | 115.20 | |
−0.27 | X t | 0.0073 | 1.1630 | −0.20 | 0.09 | 1.05 | 4.86 | - |
X t 2 | 1.1550 | 2.7430 | 1.77 | 2.17 | 103.12 | 13.32 | - | |
X t 3 | −0.2251 | 18.9148 | 0.48 | 6.09 | 227.87 | 8.29 | 114.63 | |
−0.26 | X t | 0.0065 | 1.1533 | −0.21 | 0.08 | 1.08 | 4.83 | - |
X t 2 | 1.1453 | 2.6866 | 1.77 | 2.18 | 103.55 | 12.61 | - | |
X t 3 | −0.2298 | 18.3644 | 0.45 | 6.11 | 228.69 | 8.26 | 114.13 | |
−0.25 | X t | 0.0059 | 1.1440 | −0.21 | 0.07 | 1.11 | 4.81 | - |
X t 2 | 1.1361 | 2.6333 | 1.77 | 2.20 | 104.19 | 11.95 | - | |
X t 3 | −0.2342 | 17.8527 | 0.42 | 6.13 | 229.83 | 8.22 | 113.68 | |
−0.24 | X t | 0.0052 | 1.1351 | −0.22 | 0.07 | 1.15 | 4.78 | - |
X t 2 | 1.1272 | 2.5828 | 1.77 | 2.22 | 104.99 | 11.35 | - | |
X t 3 | −0.2382 | 17.3761 | 0.40 | 6.16 | 231.17 | 8.17 | 113.26 | |
−0.23 | X t | 0.0046 | 1.1265 | −0.22 | 0.06 | 1.18 | 4.75 | - |
X t 2 | 1.1187 | 2.5350 | 1.78 | 2.24 | 105.93 | 10.79 | - | |
X t 3 | −0.2418 | 16.9312 | 0.37 | 6.18 | 232.59 | 8.11 | 112.91 | |
−0.22 | X t | 0.0040 | 1.1183 | −0.22 | 0.05 | 1.21 | 4.72 | - |
X t 2 | 1.1105 | 2.4898 | 1.78 | 2.27 | 106.98 | 10.28 | - | |
X t 3 | −0.2452 | 16.5153 | 0.34 | 6.21 | 234.04 | 8.05 | 112.58 | |
−0.21 | X t | 0.0034 | 1.1103 | −0.23 | 0.04 | 1.24 | 4.69 | - |
X t 2 | 1.1026 | 2.4468 | 1.79 | 2.29 | 108.11 | 9.81 | - | |
X t 3 | −0.2483 | 16.1256 | 0.31 | 6.23 | 235.45 | 7.98 | 112.32 | |
−0.20 | X t | 0.0029 | 1.1027 | −0.23 | 0.04 | 1.28 | 4.65 | - |
X t 2 | 1.0951 | 2.4061 | 1.79 | 2.32 | 109.31 | 9.39 | - | |
X t 3 | −0.2512 | 15.7600 | 0.28 | 6.26 | 236.78 | 7.91 | 112.05 | |
−0.19 | X t | 0.0024 | 1.0954 | −0.23 | 0.03 | 1.31 | 4.61 | - |
X t 2 | 1.0878 | 2.3675 | 1.80 | 2.34 | 110.55 | 9.00 | - | |
X t 3 | −0.2538 | 15.4164 | 0.25 | 6.28 | 238.03 | 7.83 | 111.82 |
−0.18 | X t | 0.0020 | 1.0884 | −0.24 | 0.03 | 1.34 | 4.57 | - |
X t 2 | 1.0808 | 2.3308 | 1.80 | 2.37 | 111.82 | 8.65 | - | |
X t 3 | −0.2561 | 15.0931 | 0.22 | 6.30 | 239.16 | 7.74 | 111.60 | |
−0.17 | X t | 0.0015 | 1.0816 | −0.24 | 0.02 | 1.37 | 4.52 | - |
X t 2 | 1.0741 | 2.2959 | 1.81 | 2.40 | 113.10 | 8.33 | - | |
X t 3 | −0.2583 | 14.7883 | 0.18 | 6.32 | 240.20 | 7.66 | 111.42 | |
−0.16 | X t | 0.0011 | 1.0752 | −0.24 | 0.01 | 1.40 | 4.48 | - |
X t 2 | 1.0677 | 2.2628 | 1.82 | 2.42 | 114.39 | 8.04 | - | |
X t 3 | −0.2603 | 14.5008 | 0.15 | 6.33 | 241.13 | 7.57 | 111.22 | |
−0.15 | X t | 0.0008 | 1.0689 | −0.24 | 0.01 | 1.44 | 4.43 | - |
| | X_t^2 | 1.0615 | 2.2313 | 1.82 | 2.45 | 115.68 | 7.79 | - |
| | X_t^3 | −0.2621 | 14.2292 | 0.11 | 6.35 | 241.98 | 7.48 | 111.07 |
| −0.14 | X_t | 0.0005 | 1.0629 | −0.25 | 0.00 | 1.47 | 4.37 | - |
| | X_t^2 | 1.0555 | 2.2013 | 1.83 | 2.48 | 116.96 | 7.56 | - |
| | X_t^3 | −0.2638 | 13.9723 | 0.07 | 6.36 | 242.75 | 7.38 | 110.93 |
| −0.13 | X_t | 0.0002 | 1.0571 | −0.25 | −0.00 | 1.50 | 4.31 | - |
| | X_t^2 | 1.0498 | 2.1728 | 1.83 | 2.50 | 118.23 | 7.36 | - |
| | X_t^3 | −0.2652 | 13.7293 | 0.03 | 6.37 | 243.49 | 7.28 | 110.80 |
| −0.12 | X_t | −0.0001 | 1.0516 | −0.25 | −0.00 | 1.53 | 4.25 | - |
| | X_t^2 | 1.0443 | 2.1457 | 1.84 | 2.52 | 119.48 | 7.18 | - |
| | X_t^3 | −0.2666 | 13.4993 | −0.01 | 6.38 | 244.19 | 7.19 | 110.66 |
| −0.11 | X_t | −0.0003 | 1.0463 | −0.25 | −0.01 | 1.56 | 4.19 | - |
| | X_t^2 | 1.0390 | 2.1199 | 1.85 | 2.55 | 120.71 | 7.03 | - |
| | X_t^3 | −0.2677 | 13.2813 | −0.06 | 6.39 | 244.90 | 7.09 | 110.54 |
| −0.10 | X_t | −0.0004 | 1.0411 | −0.26 | −0.01 | 1.59 | 4.13 | - |
| | X_t^2 | 1.0339 | 2.0953 | 1.85 | 2.57 | 121.92 | 6.90 | - |
| | X_t^3 | −0.2688 | 13.0747 | −0.10 | 6.40 | 245.64 | 6.99 | 110.46 |
| 0.10 | X_t | 0.0046 | 0.9745 | −0.30 | −0.04 | 2.18 | 2.51 | - |
| | X_t^2 | 0.9677 | 1.8045 | 1.93 | 2.98 | 142.60 | 6.84 | - |
| | X_t^3 | −0.2698 | 10.6906 | −1.12 | 6.62 | 292.71 | 5.10 | 110.13 |
| 0.20 | X_t | 4.19624 | 0.9627 | −0.33 | −0.01 | 2.61 | 1.77 | - |
| | X_t^2 | 4.19624 | 1.7743 | 1.94 | 3.07 | 146.99 | 7.45 | - |
| | X_t^3 | 4.19624 | 10.4201 | −1.52 | 6.79 | 331.67 | 4.20 | 111.34 |
| 0.21 | X_t | 0.0149 | 0.9623 | −0.33 | −0.01 | 2.67 | 1.71 | - |
| | X_t^2 | 0.9558 | 1.7750 | 1.94 | 3.08 | 147.10 | 7.51 | - |
| | X_t^3 | −0.2654 | 10.4221 | −1.55 | 6.80 | 334.75 | 4.10 | 111.50 |
| 0.22 | X_t | 0.0161 | 0.9620 | −0.34 | −0.00 | 2.72 | 1.66 | - |
| | X_t^2 | 0.9556 | 1.7765 | 1.94 | 3.08 | 147.15 | 7.58 | - |
| | X_t^3 | −0.2651 | 10.4295 | −1.58 | 6.81 | 337.56 | 4.01 | 111.68 |
| 0.23 | X_t | 0.0174 | 0.9618 | −0.34 | 0.00 | 2.78 | 1.61 | - |
| | X_t^2 | 0.9555 | 1.7786 | 1.94 | 3.08 | 147.18 | 7.65 | - |
| | X_t^3 | −0.2648 | 10.4424 | −1.60 | 6.81 | 340.07 | 3.92 | 111.89 |
| 0.24 | X_t | 0.0187 | 0.9618 | −0.34 | 0.01 | 2.85 | 1.56 | - |
| | X_t^2 | 0.9555 | 1.7813 | 1.94 | 3.08 | 147.18 | 7.72 | - |
| | X_t^3 | −0.2645 | 10.4608 | −1.62 | 6.82 | 342.26 | 3.83 | 112.09 |
| 0.25 | X_t | 0.0200 | 0.9620 | −0.35 | 0.01 | 2.91 | 1.52 | - |
| | X_t^2 | 0.9557 | 1.7848 | 1.94 | 3.08 | 147.17 | 7.80 | - |
| | X_t^3 | −0.2643 | 10.4851 | −1.64 | 6.82 | 344.13 | 3.75 | 112.28 |
| 0.26 | X_t | 0.0214 | 0.9623 | −0.35 | 0.02 | 2.98 | 1.49 | - |
| | X_t^2 | 0.9561 | 1.7888 | 1.94 | 3.08 | 147.16 | 7.88 | - |
| | X_t^3 | −0.2641 | 10.5152 | −1.66 | 6.83 | 345.65 | 3.66 | 112.49 |
| 0.27 | X_t | 0.0229 | 0.9628 | −0.36 | 0.02 | 3.05 | 1.46 | - |
| | X_t^2 | 0.9566 | 1.7935 | 1.94 | 3.08 | 147.18 | 7.96 | - |
| | X_t^3 | −0.2638 | 10.5514 | −1.67 | 6.83 | 346.83 | 3.57 | 112.71 |
| 0.28 | X_t | 0.0244 | 0.9634 | −0.36 | 0.03 | 3.12 | 1.43 | - |
| | X_t^2 | 0.9573 | 1.7989 | 1.94 | 3.08 | 147.23 | 8.05 | - |
| | X_t^3 | −0.2636 | 10.5939 | −1.69 | 6.83 | 347.68 | 3.49 | 112.95 |
| 0.29 | X_t | 0.0259 | 0.9641 | −0.36 | 0.03 | 3.19 | 1.41 | - |
| | X_t^2 | 0.9581 | 1.8048 | 1.94 | 3.08 | 147.33 | 8.14 | - |
| | X_t^3 | −0.2633 | 10.6429 | −1.69 | 6.82 | 348.21 | 3.41 | 113.22 |
| 0.30 | X_t | 0.0275 | 0.9651 | −0.37 | 0.04 | 3.27 | 1.40 | - |
| | X_t^2 | 0.9591 | 1.8115 | 1.94 | 3.08 | 147.52 | 8.24 | - |
| | X_t^3 | −0.2630 | 10.6987 | −1.70 | 6.82 | 348.45 | 3.33 | 113.46 |
| 0.31 | X_t | 0.0291 | 0.9662 | −0.37 | 0.05 | 3.34 | 1.40 | - |
| | X_t^2 | 0.9603 | 1.8187 | 1.94 | 3.09 | 147.79 | 8.35 | - |
| | X_t^3 | −0.2626 | 10.7616 | −1.70 | 6.82 | 348.44 | 3.26 | 113.74 |
| 0.32 | X_t | 0.0308 | 0.9675 | −0.38 | 0.05 | 3.42 | 1.40 | - |
| | X_t^2 | 0.9617 | 1.8266 | 1.95 | 3.09 | 148.18 | 8.46 | - |
| | X_t^3 | −0.2622 | 10.8318 | −1.69 | 6.82 | 348.25 | 3.19 | 114.02 |
| 0.33 | X_t | 0.0326 | 0.9689 | −0.38 | 0.06 | 3.50 | 1.41 | - |
| | X_t^2 | 0.9633 | 1.8352 | 1.95 | 3.10 | 148.72 | 8.59 | - |
| | X_t^3 | −0.2617 | 10.9098 | −1.68 | 6.83 | 347.93 | 3.12 | 114.35 |
| 0.34 | X_t | 0.0343 | 0.9706 | −0.38 | 0.06 | 3.58 | 1.43 | - |
| | X_t^2 | 0.9650 | 1.8445 | 1.95 | 3.11 | 149.41 | 8.73 | - |
| | X_t^3 | −0.2611 | 10.9958 | −1.67 | 6.84 | 347.58 | 3.06 | 114.64 |
| 0.35 | X_t | 0.0362 | 0.9724 | −0.39 | 0.07 | 3.66 | 1.45 | - |
| | X_t^2 | 0.9670 | 1.8544 | 1.96 | 3.12 | 150.28 | 8.87 | - |
| | X_t^3 | −0.2603 | 11.0904 | −1.66 | 6.85 | 347.31 | 3.01 | 114.99 |
| 0.36 | X_t | 0.0381 | 0.9744 | −0.39 | 0.08 | 3.74 | 1.49 | - |
| | X_t^2 | 0.9691 | 1.8651 | 1.96 | 3.14 | 151.36 | 9.03 | - |
| | X_t^3 | −0.2594 | 11.1938 | −1.63 | 6.87 | 347.25 | 2.96 | 115.35 |
| 0.37 | X_t | 0.0400 | 0.9767 | −0.40 | 0.08 | 3.81 | 1.53 | - |
| | X_t^2 | 0.9715 | 1.8765 | 1.97 | 3.16 | 152.67 | 9.21 | - |
| | X_t^3 | −0.2583 | 11.3067 | −1.61 | 6.90 | 347.55 | 2.91 | 115.69 |
| 0.38 | X_t | 0.0420 | 0.9791 | −0.40 | 0.09 | 3.88 | 1.59 | - |
| | X_t^2 | 0.9741 | 1.8888 | 1.97 | 3.19 | 154.22 | 9.40 | - |
| | X_t^3 | −0.2569 | 11.4295 | −1.58 | 6.94 | 348.40 | 2.87 | 116.09 |
| 0.39 | X_t | 0.0440 | 0.9818 | −0.40 | 0.09 | 3.95 | 1.65 | - |
| | X_t^2 | 0.9769 | 1.9019 | 1.98 | 3.22 | 156.05 | 9.61 | - |
| | X_t^3 | −0.2553 | 11.5629 | −1.54 | 6.99 | 349.99 | 2.84 | 116.48 |
| 0.40 | X_t | 0.0461 | 0.9847 | −0.41 | 0.10 | 4.02 | 1.73 | - |
| | X_t^2 | 0.9800 | 1.9159 | 1.99 | 3.26 | 158.16 | 9.84 | - |
| | X_t^3 | −0.2534 | 11.7074 | −1.50 | 7.06 | 352.57 | 2.81 | 116.89 |
| 0.41 | X_t | 0.0482 | 0.9879 | −0.41 | 0.11 | 4.08 | 1.82 | - |
| | X_t^2 | 0.9834 | 1.9309 | 1.99 | 3.30 | 160.59 | 10.09 | - |
| | X_t^3 | −0.2511 | 11.8638 | −1.45 | 7.14 | 356.40 | 2.79 | 117.31 |
| 0.42 | X_t | 0.0504 | 0.9913 | −0.41 | 0.11 | 4.13 | 1.92 | - |
| | X_t^2 | 0.9870 | 1.9470 | 2.00 | 3.35 | 163.34 | 10.36 | - |
| | X_t^3 | −0.2484 | 12.0330 | −1.39 | 7.25 | 361.79 | 2.77 | 117.75 |
| 0.43 | X_t | 0.0526 | 0.9950 | −0.41 | 0.12 | 4.18 | 2.03 | - |
| | X_t^2 | 0.9909 | 1.9643 | 2.01 | 3.40 | 166.42 | 10.65 | - |
| | X_t^3 | −0.2452 | 12.2158 | −1.33 | 7.38 | 369.08 | 2.76 | 118.22 |
| 0.44 | X_t | 0.0549 | 0.9990 | −0.41 | 0.12 | 4.21 | 2.15 | - |
| | X_t^2 | 0.9951 | 1.9829 | 2.02 | 3.46 | 169.86 | 10.97 | - |
| | X_t^3 | −0.2416 | 12.4134 | −1.27 | 7.53 | 378.64 | 2.75 | 118.70 |
| 0.45 | X_t | 0.0572 | 1.0033 | −0.41 | 0.13 | 4.24 | 2.29 | - |
| | X_t^2 | 0.9996 | 2.0030 | 2.03 | 3.53 | 173.66 | 11.32 | - |
| | X_t^3 | −0.2373 | 12.6270 | −1.19 | 7.71 | 390.89 | 2.75 | 119.20 |
| 0.46 | X_t | 0.0595 | 1.0079 | −0.42 | 0.14 | 4.25 | 2.44 | - |
| | X_t^2 | 1.0044 | 2.0247 | 2.04 | 3.61 | 177.82 | 11.70 | - |
| | X_t^3 | −0.2323 | 12.8582 | −1.11 | 7.92 | 406.27 | 2.75 | 119.73 |
| 0.47 | X_t | 0.0620 | 1.0128 | −0.41 | 0.14 | 4.25 | 2.60 | - |
| | X_t^2 | 1.0096 | 2.0482 | 2.05 | 3.69 | 182.34 | 12.10 | - |
| | X_t^3 | −0.2267 | 13.1088 | −1.03 | 8.16 | 425.24 | 2.75 | 120.29 |
| 0.48 | X_t | 0.0644 | 1.0181 | −0.41 | 0.15 | 4.24 | 2.78 | - |
| | X_t^2 | 1.0152 | 2.0737 | 2.06 | 3.78 | 187.25 | 12.54 | - |
| | X_t^3 | −0.2202 | 13.3809 | −0.93 | 8.44 | 448.27 | 2.76 | 120.88 |
| 0.49 | X_t | 0.0670 | 1.0238 | −0.41 | 0.16 | 4.21 | 2.97 | - |
| | X_t^2 | 1.0211 | 2.1016 | 2.07 | 3.87 | 192.53 | 13.02 | - |
| | X_t^3 | −0.2127 | 13.6772 | −0.83 | 8.75 | 475.84 | 2.78 | 121.52 |
| 0.50 | X_t | 0.0695 | 1.0298 | −0.41 | 0.16 | 4.16 | 3.18 | - |
| | X_t^2 | 1.0275 | 2.1319 | 2.08 | 3.97 | 198.22 | 13.53 | - |
| | X_t^3 | −0.2043 | 14.0009 | −0.73 | 9.09 | 508.36 | 2.81 | 122.22 |
| 0.60 | X_t | 0.0980 | 1.1188 | −0.32 | 0.28 | 2.89 | 6.02 | - |
| | X_t^2 | 1.1207 | 2.6703 | 2.27 | 5.60 | 312.17 | 20.67 | - |
| | X_t^3 | −0.0402 | 20.1794 | 0.86 | 13.91 | 1178.64 | 5.41 | 137.37 |
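Rows of this kind can be reproduced by Monte Carlo simulation. The sketch below is a minimal illustration, not the paper's code: it assumes, for concreteness, the simple bilinear form X_t = e_t + β e_{t−1} X_{t−2} (the exact specification is model (1.20), given earlier in the paper), and reports the first four sample moments of X_t, X_t², and X_t³ as in the table above.

```python
import numpy as np

def simulate_sbwnp(beta, n=10_000, burn=500, seed=42):
    """Simulate a simple bilinear white noise process.

    Assumed form (for illustration only): X_t = e_t + beta * e_{t-1} * X_{t-2},
    with e_t ~ N(0, 1) iid.  The lag separation keeps E(X_t) = 0 and all
    autocovariances zero, so the series is white noise but non-Gaussian.
    """
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(2, n + burn):
        x[t] = e[t] + beta * e[t - 1] * x[t - 2]
    return x[burn:]  # discard burn-in so the series is near-stationary

def moment_row(series):
    """Mean, variance, skewness, excess kurtosis -- one summary row."""
    m = series.mean()
    c = series - m
    s2 = (c**2).mean()
    return {
        "mean": m,
        "var": series.var(ddof=1),
        "skew": (c**3).mean() / s2**1.5,
        "ex_kurt": (c**4).mean() / s2**2 - 3.0,
    }

x = simulate_sbwnp(beta=0.30)
for label, s in [("X_t", x), ("X_t^2", x**2), ("X_t^3", x**3)]:
    print(label, moment_row(s))
```

For β = 0 the recursion collapses to X_t = e_t, i.e. the LGWNP itself, so the same code serves as the Gaussian benchmark.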
| α% level | Values of β (Q*) | Values of β (H0) |
|---|---|---|
| 5 | [−0.23, 0.44] | [−0.18, 0.22] |
| 10 | [−0.19, 0.37] | [−0.29, 0.38] |
satisfying (1.20). We have also determined the values of β for which the simple bilinear model (1.20) is normally distributed, and hence whether the process can be classified as a LGWNP. We recommend that, for a proper comparison of the SBWNP with the LGWNP, the SBWNP be subjected to a normality test, a white noise test, and a test that the variance of its third moment equals the corresponding theoretical value for the LGWNP.
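The first two recommended checks can be scripted directly. The sketch below is a hedged illustration, not the authors' procedure: it applies a Jarque–Bera normality test and a Ljung–Box portmanteau test (the Q* statistic, computed from the sample autocorrelations of Equation (1.6)) to a candidate series; the lag count of 20 is an arbitrary choice.

```python
import numpy as np
from scipy.stats import chi2, jarque_bera

def ljung_box(x, lags=20):
    """Ljung-Box Q* statistic and its chi-square p-value (H0: white noise)."""
    n = len(x)
    c = x - x.mean()
    denom = (c**2).sum()
    # sample autocorrelations rho_hat(1), ..., rho_hat(lags), as in (1.6)
    acf = np.array([(c[:-k] * c[k:]).sum() / denom for k in range(1, lags + 1)])
    q = n * (n + 2) * np.sum(acf**2 / (n - np.arange(1, lags + 1)))
    return q, chi2.sf(q, df=lags)

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)      # stand-in for the series under test

jb_stat, jb_p = jarque_bera(x)     # H0: the series is normally distributed
q_stat, q_p = ljung_box(x)         # H0: all autocorrelations are zero
print(f"Jarque-Bera p = {jb_p:.3f}, Ljung-Box Q* p = {q_p:.3f}")
```

A series consistent with the LGWNP should fail to reject both null hypotheses; an SBWNP with β well inside the rejection region will typically fail the normality test while passing the white noise test.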
Arimie, C.O., Iwueze, I.S., Ijomah, M.A. and Onyemachi, E. (2018) On the Use of Second and Third Moments for the Comparison of Linear Gaussian and Simple Bilinear White Noise Processes. Open Journal of Statistics, 8, 562-583. https://doi.org/10.4236/ojs.2018.83037
Simulated Random Digits; e_t ~ N(0, 1) (Read Across).