Z. F. GUO ET AL.

Copyright © 2011 SciRes. TEL


The lag selection procedure is consistent if the probability of $\hat{S} = S$ approaches one as $n \to \infty$.

Theorem 1: Under our assumptions and $\delta \in [0,1]$, as $n \to \infty$,

$$\mathrm{FPE}(S_A) > \mathrm{FPE}(S)$$

for any overfitting combination $S_A = \{i_1, i_2, \ldots, i_m\}$.

The overfitting $\mathrm{FPE}(S_A)$ asymptotically becomes larger than the correctly specified $\mathrm{FPE}(S)$ because the penalty term of the former converges at a rate slower than that of the latter as long as $\delta > 0$. It should be noted that the optimal bandwidth $h_{\mathrm{opt}}$ used for $\mathrm{FPE}(S_A)$ differs from the $h_{\mathrm{opt}}$ used for $\mathrm{FPE}(S)$. Unlike the unrestricted FPE, however, the convergence rates of the two bandwidths are the same for additive models even if the dimensions of the regressors are different; that is why $\mathrm{FPE}(S)$ with $h \to 0$ is not desirable in excluding overfitting models. Following the same argument as in Guo and Shintani (2011) [4], we can easily show that the FPE in the underfitting case is larger than that of a correctly fitted model. Then we have:

Theorem 2: Under our assumptions and $\delta \in [0,1]$, as $n \to \infty$,

$$P(\hat{S} = S) \to 1.$$
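The mechanism behind Theorem 1 can be written schematically as follows; here $p_n(\cdot)$ is a stand-in for the criterion's penalty term, whose exact form this excerpt does not reproduce:

```latex
% Both criteria share the same leading goodness-of-fit term, so the
% comparison between the overfitted and correct models is driven by
% the penalties alone:
\mathrm{FPE}(S_A) - \mathrm{FPE}(S)
  = p_n(S_A) - p_n(S) + o_p\!\bigl(p_n(S_A)\bigr),
\qquad \frac{p_n(S)}{p_n(S_A)} \to 0,
% so the difference is eventually dominated by p_n(S_A) > 0 and is
% positive with probability approaching one, excluding the
% overfitted combination S_A.
```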

Remarks

- The consistency of our procedure holds for both local linear and local constant estimators.
- If $\delta > 0$, the probability of selecting the correct model converges to one as the sample size increases.
- If $\delta = 0$, our criterion is asymptotically equivalent to the asymptotic FPE.
- While the FPE-like procedure of Guo and Shintani (2011) [4] and our procedure are both consistent, the latter can be expected to perform better in the finite sample because of the better finite sample properties of the smooth backfitting estimator.
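To illustrate the kind of procedure discussed above, the sketch below selects a lag set by minimizing an FPE-style criterion: a leave-one-out local constant (Nadaraya-Watson) fit multiplied by a penalty indexed by a tuning exponent delta in [0, 1]. The penalty form, bandwidth rule, and all function names here are illustrative assumptions, not the paper's actual criterion.

```python
import itertools
import numpy as np

def loo_rss(X, y, h):
    """Leave-one-out residual mean square of a local constant
    (Nadaraya-Watson) fit with a product Gaussian kernel."""
    n = len(y)
    resid = np.empty(n)
    for t in range(n):
        d = (X - X[t]) / h                        # scaled distances, shape (n, m)
        w = np.exp(-0.5 * np.sum(d * d, axis=1))  # product Gaussian kernel weights
        w[t] = 0.0                                # leave observation t out
        resid[t] = y[t] - w @ y / w.sum()
    return np.mean(resid ** 2)

def fpe_select(y, max_lag=3, delta=0.5):
    """Choose the lag set minimizing an FPE-style criterion:
    leave-one-out fit times (1 + penalty), where the penalty grows
    with the number of lags m and shrinks with n at a rate tuned by
    delta (an illustrative stand-in for the paper's criterion)."""
    T = len(y)
    n = T - max_lag
    Y = y[max_lag:]
    # column j-1 holds y_{t-j}, aligned with Y
    lags = np.column_stack([y[max_lag - j:T - j] for j in range(1, max_lag + 1)])
    best_crit, best_S = np.inf, ()
    for m in range(1, max_lag + 1):
        for S in itertools.combinations(range(1, max_lag + 1), m):
            X = lags[:, [j - 1 for j in S]]
            # rule-of-thumb per-regressor bandwidths
            h = 1.06 * X.std(axis=0) * n ** (-1.0 / (4 + m))
            # penalty shrinks more slowly for larger delta
            penalty = m * n ** (-(1.0 - delta) * 4.0 / 5.0)
            crit = loo_rss(X, Y, h) * (1.0 + penalty)
            if crit < best_crit:
                best_crit, best_S = crit, S
    return best_S
```

Setting delta closer to one slows the decay of the penalty, which is what separates overfitted lag sets from the correct one in the argument above; delta = 0 leaves only the asymptotic FPE-type trade-off.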

4. Conclusions

The better finite sample properties of the backfitting method over marginal integration have often been reported in simulation studies. Guo and Shintani (2011) [4] proposed an FPE-like procedure based on the marginal integration method because of its simplicity. Our paper proposes a more effective lag selection criterion based on the smooth backfitting estimator. The new criterion can be expected to perform better in the finite sample.

5. References

[1] D. Tjøstheim and B. Auestad, “Nonparametric Identification of Nonlinear Time Series: Selecting Significant Lags,” Journal of the American Statistical Association, Vol. 89, No. 428, 1994, pp. 1410-1419. doi:10.2307/2291003

[2] R. Tschernig and L. Yang, “Nonparametric Lag Selection for Time Series,” Journal of Time Series Analysis, Vol. 21, No. 4, 2000, pp. 457-487. doi:10.1111/1467-9892.00193

[3] B. Cheng and H. Tong, “On Consistent Nonparametric Order Determination and Chaos,” Journal of the Royal Statistical Society Series B (Methodological), Vol. 54, No. 2, 1992, pp. 427-449.

[4] Z. F. Guo and M. Shintani, “Nonparametric Lag Selection for Additive Models,” Economics Letters, Vol. 111, No. 2, 2011, pp. 131-134. doi:10.1016/j.econlet.2011.01.014

[5] O. B. Linton and J. P. Nielsen, “A Kernel Method of Estimating Structured Nonparametric Regression Based on Marginal Integration,” Biometrika, Vol. 82, No. 1, 1995, pp. 93-100. doi:10.1093/biomet/82.1.93

[6] S. Sperlich, O. B. Linton and W. Härdle, “Integration and Backfitting Methods in Additive Models - Finite Sample Properties and Comparison,” Test, Vol. 8, No. 2, 1999, pp. 419-458. doi:10.1007/BF02595879

[7] E. Mammen, O. B. Linton and J. P. Nielsen, “The Existence and Asymptotic Properties of a Backfitting Projection Algorithm under Weak Conditions,” Annals of Statistics, Vol. 27, No. 5, 1999, pp. 1443-1490. doi:10.1214/aos/1017939137

[8] J. P. Nielsen and S. Sperlich, “Smooth Backfitting in Practice,” Journal of the Royal Statistical Society Series B, Vol. 67, No. 1, 2005, pp. 43-61. doi:10.1111/j.1467-9868.2005.00487.x