Applied Mathematics, 2011, 2, 1303-1308
doi:10.4236/am.2011.210181 Published Online October 2011 (http://www.SciRP.org/journal/am)
Copyright © 2011 SciRes. AM

Independence of the Residual Quadratic Sums in the Dispersion Equation with Noncentral χ²-Distribution

Nikolay I. Sidnyaev, Kristina S. Andreytseva
Bauman Moscow State Technical University, Moscow, Russia
E-mail: sidnyaev@yandex.ru, 9259988800@mail.ru
Received May 6, 2011; revised July 1, 2011; accepted July 8, 2011

Abstract

A model adequacy test should be carried out on the basis of accurate a priori ideas about a class of adequate models, since in practical problems this class is finite. In this article, the quadratic sums entering the equation of the analysis of variance are considered and their independence is proved. Necessary and sufficient conditions for the existence of adequate models are given. It is shown that the class of adequate models is infinite.

Keywords: Noncentral χ²-Distribution, Dispersion Analysis, Adequate Models, Quadratic Sums

1. Introduction

The analysis of variance (dispersion analysis) is a statistical method intended for estimating the influence of various factors on the result of an experiment, and the range of application of this method keeps growing. An unbiased estimate for the unknown parameters is built from a sum of squares. The main idea of the analysis of variance consists in splitting this sum of squared deviations into several components, each of which corresponds to a presumed cause of the change in the averages. Let us consider the decomposition of the residual sum of squares $Q_0 = Q_1 + Q_2$ and prove the independence of the summands $Q_1$ and $Q_2$. Two theorems and four auxiliary lemmas are needed for the proof.

2. Preliminaries

Lemma 2.1. The rank of the product of two matrices $A$ and $B$ is less than or equal to the minimal rank of $A$ and $B$, i.e. $r(AB) \le \min(r(A), r(B))$.

Proof.
By the rule of matrix multiplication, the columns of the matrix $AB$ are linear combinations of the columns of $A$; therefore the number of linearly independent columns of $AB$ cannot exceed the number of linearly independent columns of $A$, so $r(AB) \le r(A)$. Reasoning similarly for the rows (the rows of $AB$ are linear combinations of the rows of $B$), we obtain $r(AB) \le r(B)$. The lemma is proved.

Consequence of the inertia law for quadratic forms (on the number of invariants): if $Q = x'Ax$ is a quadratic form in $n$ variables $x_1, \ldots, x_n$ and its rank $r(A)$ equals $r$, then there exist $r$ linear combinations $z_1, \ldots, z_r$ of the variables $x_1, \ldots, x_n$ such that $Q = \sum_{i=1}^{r} \varepsilon_i z_i^2$, where each $\varepsilon_i$ equals $+1$ or $-1$.

We will use Cochran's theorem as a simple consequence of the following theorem.

Theorem 2.1. Let $\sum_{i=1}^{N} y_i^2 = Q_1 + \cdots + Q_s$, where $Q_j$, $j = 1, \ldots, s$, are quadratic forms of rank $n_j$ in the variables $y_1, \ldots, y_N$. Then the condition $n_1 + n_2 + \cdots + n_s = N$ is necessary and sufficient for the existence of an orthogonal transformation $z = Ay$ taking the vector $y = (y_1, \ldots, y_N)'$ into a vector $z = (z_1, \ldots, z_N)'$ in such a way that

$$Q_1 = \sum_{i=1}^{n_1} z_i^2, \quad Q_2 = \sum_{i=n_1+1}^{n_1+n_2} z_i^2, \quad \ldots, \quad Q_s = \sum_{i=n_1+\cdots+n_{s-1}+1}^{N} z_i^2. \tag{2.1}$$

Proof. Necessity. If such an orthogonal transformation exists, then $\sum_{i=1}^{N} z_i^2 = \sum_{i=1}^{N} y_i^2 = Q_1 + \cdots + Q_s$. The left-hand side is a quadratic form of rank $N$, and the right-hand side is a quadratic form of rank $n_1 + \cdots + n_s$; since these are one and the same quadratic form, their ranks coincide, i.e. $n_1 + n_2 + \cdots + n_s = N$.

Sufficiency. Since the rank of $Q_j$ equals $n_j$, it follows from the consequence of the inertia law that there exist $n_j$ linear combinations $z_1, \ldots, z_{n_j}$ of the variables $y_1, \ldots, y_N$ such that $Q_j = \sum_i \varepsilon_i z_i^2$, where each $\varepsilon_i$ is $+1$ or $-1$. For $Q_1$ the index $i$ takes the values $1, \ldots, n_1$; for $Q_2$ the values $n_1 + 1, \ldots, n_1 + n_2$; and so on. Now, if $\sum_{j=1}^{s} n_j = N$, then $N$ linear combinations $z_1, \ldots, z_N$ exist, which in matrix notation can be written as $z = Ay$.
zAyUsing a diagonal matrix NND with diagonal ele-ments 1,,N, we receive that 211sNjiijiQzzDzyAD Ayy. On the other hand . As the sym- 211sNjijiQyymetric matrix of the square-law form is unique, it is con-cluded that ADA ID, hence, it is nondegenerate. Now we will prove that . Let I1k . Then under the formula 1yAz we can find the values of 1,,Nyy corresponding to values 0iz at and ik1kz, and for these values 22111NNiiikiiyz , that is impossible. Hence, and DIAAI. Last equality shows that transformation is orthogo-nal. The theorem is proved. zAyRemark. The condition makes the square- 1siinNlaw forms i positive definite, as at orthogonal trans-formation, it turns out that all their characteristic num-bers are equal 0 or 1. QTheorem 2.2 . Let random variables i, y1,iN, are independent and have normal distributions ,1iN accordingly. Let further 211NisiyQQ where , iQ1,i1siinN. If i is the parameter of noncentrality , iQthen the value i2 can be received by replacement jy on , i.e. if iQiiQyAy, then 2iiA, where ,,N1; . Ny1,,yyProof. Necessity. If 1,,sQQ2 are the independent random variables with -distribution with 1,,snn freedom degrees accordingly, then 1sjjQ has noncentral 2-distribu-tion with 1sjjn freedom degrees. As has non-central 2iy1Ni2-distribution with N freedom degrees, and 2y11jQNsiij, hence, . 1sjjnNSufficiency. Let 1sjjnN. Then at orthogonal trans- formation zAy (theorem 2.1), random variables 1,,Nzz will be independent and normally distributed. From parities (2.1) and definitions of noncentral 2- distributions follows that 1,,sQQ have independent noncentral 2-distributions with 1,,snn freedom degrees accordingly. The theorem is proved. 3. Auxiliary Theorems and Lemmas We will assume that the space of values of random vari-ables is split into finite r parts 1,,nss without the general points, and let —probabilities 1,,npp{}iiPPXS, 1iP. 
Let us assume that all $p_i > 0$. Let $\nu_i$ be the number of observed values of the random variable $X$ belonging to the set $S_i$. Consider the vector $\nu = (\nu_1, \ldots, \nu_r)$. As a measure of divergence between the empirical and theoretical distributions we will consider $\sum_{i=1}^{r} c_i(\nu_i - np_i)^2$, where the factors $c_i$ may be chosen arbitrarily. Pearson has shown ([2,3]) that if $c_i = 1/(np_i)$, then the resulting measure

$$\chi_n^2 = \sum_{i=1}^{r} \frac{(\nu_i - np_i)^2}{np_i} = \sum_{i=1}^{r} \frac{\nu_i^2}{np_i} - n \tag{3.1}$$

possesses extremely simple properties.

Theorem 3.1. As $n \to \infty$, the distribution of $\chi_n^2$ tends to the $\chi^2$-distribution with $r - 1$ degrees of freedom.

On the basis of this theorem, for a given significance level $\alpha$ we find the number $\chi_\alpha^2$ from the condition

$$P\{\chi^2 > \chi_\alpha^2\} = \alpha. \tag{3.2}$$

The hypothesis $H_0$ is rejected if $\chi_n^2 > \chi_\alpha^2$. For the proof of the theorem the following lemma is required.

Lemma 3.1. Let $\nu_1, \ldots, \nu_r$ be non-negative integers with $\nu_1 + \cdots + \nu_r = n$. The number of ways in which $n$ elements can be divided into $r$ groups, the first of which contains $\nu_1$ elements, the second $\nu_2$ elements, ..., the $r$-th $\nu_r$ elements, is equal to

$$\frac{n!}{\nu_1! \cdots \nu_r!}.$$

Proof. The first group of $\nu_1$ elements can be chosen in $C_n^{\nu_1}$ ways. After the first group is formed, $n - \nu_1$ elements remain; therefore the second group of $\nu_2$ elements can be chosen in $C_{n-\nu_1}^{\nu_2}$ ways, and so on.
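This successive-choice count can be checked numerically: the product of binomial coefficients $C_n^{\nu_1} C_{n-\nu_1}^{\nu_2} \cdots$ collapses to the closed form $n!/(\nu_1! \cdots \nu_r!)$ (a sketch using Python's `math.comb`; the group sizes are arbitrary examples, not from the paper):

```python
import math

def multinomial_by_choices(nu):
    """Count by choosing the groups one after another:
    C(n, nu_1) * C(n - nu_1, nu_2) * ..."""
    total, remaining = 1, sum(nu)
    for k in nu:
        total *= math.comb(remaining, k)
        remaining -= k
    return total

def multinomial_closed_form(nu):
    """Closed form n! / (nu_1! * ... * nu_r!)."""
    denom = 1
    for k in nu:
        denom *= math.factorial(k)
    return math.factorial(sum(nu)) // denom

nu = [2, 1, 1]  # example group sizes, n = 4
print(multinomial_by_choices(nu))   # 12
print(multinomial_closed_form(nu))  # 12
```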
After the formation of $r - 1$ groups, $n - \nu_1 - \cdots - \nu_{r-1} = \nu_r$ elements remain, and they form the last group. Thus the number of all possible ways in which $n$ elements can be distributed into $r$ groups, of which the first contains $\nu_1$ elements, ..., the $r$-th contains $\nu_r$ elements, is equal to $C_n^{\nu_1} C_{n-\nu_1}^{\nu_2} \cdots C_{\nu_r}^{\nu_r}$. Using the formula $C_n^k = \frac{n!}{k!(n-k)!}$, we obtain the statement of the lemma. The lemma is proved.

The outcome of any single trial belongs to the set $S_i$ with probability $p_i = P\{X \in S_i\}$. Therefore, on the basis of Lemma 3.1, the probability that $\nu_1$ of the observed values belong to $S_1$, ..., $\nu_r$ belong to $S_r$ is equal to

$$P\{\nu_1, \ldots, \nu_r\} = \frac{n!}{\nu_1! \cdots \nu_r!} p_1^{\nu_1} \cdots p_r^{\nu_r}. \tag{3.3}$$

This expression, as is easy to see, is the general term of the expansion of $(p_1 + \cdots + p_r)^n$. The joint distribution of the random vector $\nu = (\nu_1, \ldots, \nu_r)$ is given by expression (3.3) and is the multinomial distribution. We will find its characteristic function:

$$M e^{i(t_1\nu_1 + \cdots + t_r\nu_r)} = \sum \frac{n!}{\nu_1! \cdots \nu_r!} \left(p_1 e^{it_1}\right)^{\nu_1} \cdots \left(p_r e^{it_r}\right)^{\nu_r} = \left(p_1 e^{it_1} + \cdots + p_r e^{it_r}\right)^n.$$

Let us introduce the new quantities $x_i = \dfrac{\nu_i - np_i}{\sqrt{np_i}}$, $i = 1, 2, \ldots, r$. Then, obviously, $\sum_{i=1}^{r} x_i\sqrt{p_i} = 0$ and $\chi_n^2 = \sum_{i=1}^{r} x_i^2$. We will find the characteristic function of the random vector $x = (x_1, \ldots, x_r)$. We have

$$M_x(t_1, \ldots, t_r) = M e^{i\sum_k t_k x_k} = e^{-i\sqrt{n}\sum_k t_k\sqrt{p_k}} \left(\sum_{k=1}^{r} p_k e^{it_k/\sqrt{np_k}}\right)^n. \tag{3.4}$$

Further, for any fixed $t_1, \ldots, t_r$ we obtain

$$\ln M_x(t_1, \ldots, t_r) = -i\sqrt{n}\sum_k t_k\sqrt{p_k} + n\ln\sum_{k=1}^{r} p_k e^{it_k/\sqrt{np_k}}. \tag{3.5}$$

From the expansions $e^x = 1 + x + \frac{x^2}{2!} + O(x^3)$ and $\ln(1+x) = x - \frac{x^2}{2} + R$, $|R| \le |x|^3$, and from (3.5) it follows that

$$\ln M_x(t_1, \ldots, t_r) = -\frac{1}{2}\sum_k t_k^2 + \frac{1}{2}\left(\sum_k t_k\sqrt{p_k}\right)^2 + O\!\left(n^{-1/2}\right). \tag{3.6}$$

So now we obtain

$$\lim_{n\to\infty} M_x(t_1, \ldots, t_r) = e^{-\frac{1}{2}Q(t_1,\ldots,t_r)}, \tag{3.7}$$

where $Q(t_1, \ldots, t_r) = \sum_k t_k^2 - \left(\sum_k t_k\sqrt{p_k}\right)^2$. The quadratic form $Q$ has the matrix $I - pp'$, where $I$ denotes the identity matrix and $p$ is the column vector $(\sqrt{p_1}, \ldots, \sqrt{p_r})'$. Replacing $t_1, \ldots, t_r$ with new variables $u_1, \ldots, u_r$ by means of an orthogonal transformation in which $u_r = \sum_k t_k\sqrt{p_k}$, we obtain

$$Q(t_1, \ldots, t_r) = \sum_{k=1}^{r} u_k^2 - u_r^2 = u_1^2 + \cdots + u_{r-1}^2.$$

So the quadratic form $Q(t_1, \ldots, t_r)$ is non-negative and has rank $r - 1$, i.e.
at $n \to \infty$ the joint characteristic function of the quantities $x_1, \ldots, x_r$ tends to the expression $\exp(-\frac{1}{2}Q)$, which is the characteristic function of a degenerate normal distribution of rank $r - 1$ in which all the mass is concentrated on the hyperplane $\sum_k x_k\sqrt{p_k} = 0$. From the continuity theorem it follows that $x_1, \ldots, x_r$ have in the limit a degenerate normal distribution with zero mean and matrix of second moments $I - pp'$. From here we conclude that the quantity $\chi_n^2 = \sum_{k=1}^{r} x_k^2$ in the limit has the $\chi^2$-distribution with $r - 1$ degrees of freedom.

4. Noncentral χ²-Distribution

Let $y_1, y_2, \ldots, y_n$ be independent random variables with normal distributions with means $\mu_i$ and dispersion 1, i.e. $y_i \sim N(\mu_i, 1)$ ($i = 1, 2, \ldots, n$). Then the distribution of the random variable $u = \sum_{i=1}^{n} y_i^2$ is called the noncentral $\chi^2$-distribution [1-3]. The quantity $\sqrt{u}$ represents the radius of a hypersphere in $n$-dimensional space [1,4]. The distribution of the random variable $u$ depends only on the parameters $n$ and $\lambda = \left(\sum_{i=1}^{n}\mu_i^2\right)^{1/2}$. It is therefore called the noncentral $\chi^2$-distribution with $n$ degrees of freedom and noncentrality parameter $\lambda$ [2,5]. In this case we write $u \sim \chi^2(n; \lambda)$. If $\lambda = 0$, i.e. $\mu_i = 0$ ($i = 1, 2, \ldots, n$), the distribution of the random variable $u$ is called the central $\chi^2$-distribution, or simply the $\chi^2$-distribution, with $n$ degrees of freedom, and we write $u \sim \chi^2(n)$.

Let $P\{\chi^2(n) > \chi_\alpha^2(n)\} = \alpha$. The quantity $\chi_\alpha^2(n)$ is called a threshold, or the $100\alpha$ percent point, of the $\chi^2$-distribution with $n$ degrees of freedom. Its values for various $\alpha$ and $n$ are tabulated.

The mean and variance of a random variable $u \sim \chi^2(n; \lambda)$ are

$$M[\chi^2(n; \lambda)] = n + \lambda^2, \qquad D[\chi^2(n; \lambda)] = 2n + 4\lambda^2.$$

If $u_1 \sim \chi^2(n_1; \lambda_1)$ and $u_2 \sim \chi^2(n_2; \lambda_2)$ are independent random variables, then from the definition of the noncentral $\chi^2$-distribution it follows that their sum $u = u_1 + u_2$ has the noncentral $\chi^2$-distribution with $n = n_1 + n_2$ degrees of freedom and noncentrality parameter $\lambda = \left(\lambda_1^2 + \lambda_2^2\right)^{1/2}$.

5. Main Results

For the proof of the independence of $Q_1$ and $Q_2$ we will need the following auxiliary statements.

Lemma 5.1.
The rank of the sum of quadratic forms does not exceed the sum of their ranks.

Proof. It is enough to show that if $A_1$ and $A_2$ are matrices of the same order and the rank of $A_i$ equals $r_i$, then $r(A_1 + A_2) \le r_1 + r_2$. For the vector space generated by the columns of $A_i$ we choose a basis of $r_i$ vectors. Since the columns of $A_1 + A_2$ are the sums of the corresponding columns of $A_1$ and $A_2$, they are linear combinations of the $r_1 + r_2$ vectors of the two bases; hence the number of linearly independent columns of $A_1 + A_2$ cannot exceed $r_1 + r_2$. Hence $r(A_1 + A_2) \le r_1 + r_2$. The lemma is proved.

Consequence. If $\sum_{i=1}^{N} y_i^2 = Q_1 + \cdots + Q_s$, where the rank of $Q_j$ is less than or equal to $n_j$, $j = 1, \ldots, s$, and if $n_1 + \cdots + n_s = N$, then $r(Q_j) = n_j$, $j = 1, \ldots, s$.

Proof. It follows directly from Lemma 5.1. On the one hand, $\sum_{j=1}^{s} r(Q_j) \le \sum_{j=1}^{s} n_j = N$, and on the other hand, $\sum_{j=1}^{s} r(Q_j) \ge r\left(\sum_{j=1}^{s} Q_j\right) = r\left(\sum_{i=1}^{N} y_i^2\right) = N$. Hence $\sum_{j=1}^{s} r(Q_j) = N$. Under the condition $r(Q_j) \le n_j$, $j = 1, \ldots, s$, this last equality is possible only when $r(Q_j) = n_j$, $j = 1, \ldots, s$, which proves the consequence.

Lemma 5.2. If $Q$ is a quadratic form in the variables $y_1, \ldots, y_N$ and can be expressed as a quadratic form in the variables $z_1, \ldots, z_p$, which are linear combinations of $y_1, \ldots, y_N$, then $r(Q) \le p$.

Proof. Let $Q = y'Ay = z'Bz$ and $z = Cy$, where $A$ is of size $N \times N$, $B$ is of size $p \times p$, $C$ is of size $p \times N$, and $A$ and $B$ are symmetric. From the equality $Q = y'C'BCy$ it follows that $A = C'BC$, and by Lemma 2.1 we obtain $r(A) = r(C'BC) \le r(C)$. Since $C$ is a matrix of size $p \times N$, $r(C) \le p$; hence $r(Q) \le p$. The lemma is proved.

Using the statements above, let us begin the proof of the independence of $Q_1$ and $Q_2$. Since $Q_0 = y'y - y'A_3y$, we have

$$y'y = Q_1 + Q_2 + Q_3, \tag{5.1}$$

where $Q_3 = y'A_3y$ and $A_3 = X_0(X_0'X_0)^{-1}X_0'$. Let us determine the ranks of the quadratic forms $Q_1$, $Q_2$ and $Q_3$. Since $r(A_3) = p_0$, we have $r(Q_3) = n_3 = p_0$. We pass to the analysis of the quadratic form

$$Q_2 = \sum_{l=1}^{n}\sum_{s=1}^{m_l}\left(y_{ls} - \bar{y}_l\right)^2.$$

Let us introduce the variables $z_{ls} = y_{ls} - \bar{y}_l$, $l = 1, \ldots, n$; $s = 1, \ldots, m_l$. It is obvious that $Q_2 = \sum_{l=1}^{n}\sum_{s=1}^{m_l} z_{ls}^2$.
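The defining property of these residuals, used in the next step, is that they sum to zero within each group. A small numerical sketch (the observations are made-up illustrative values, not data from the paper):

```python
# Arbitrary example data: n = 2 groups with m_1 = 3 and m_2 = 4 replicates.
groups = [[1.2, 0.7, 2.1],
          [0.4, 1.9, 1.1, 0.6]]

q2 = 0.0
for obs in groups:
    ybar = sum(obs) / len(obs)          # group mean  \bar{y}_l
    z = [y - ybar for y in obs]         # residuals   z_{ls} = y_{ls} - \bar{y}_l
    assert abs(sum(z)) < 1e-12          # residuals sum to zero within the group
    q2 += sum(v * v for v in z)         # group's contribution to Q_2

print(q2)
```

Because of this zero-sum constraint, the last residual in each group is determined by the others, which is exactly what the dimension count below exploits.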
Since $\bar{y}_l = \frac{1}{m_l}\sum_{s=1}^{m_l} y_{ls}$, we have $\sum_{s=1}^{m_l}\left(y_{ls} - \bar{y}_l\right) = \sum_{s=1}^{m_l} z_{ls} = 0$, and therefore $z_{lm_l} = -\sum_{s=1}^{m_l-1} z_{ls}$. Thus

$$Q_2 = \sum_{l=1}^{n}\sum_{s=1}^{m_l} z_{ls}^2 = \sum_{l=1}^{n}\left[\sum_{s=1}^{m_l-1} z_{ls}^2 + \left(\sum_{s=1}^{m_l-1} z_{ls}\right)^2\right].$$

As is apparent from this expression, $Q_2$ is a quadratic form in the variables $z_{ls}$, $l = 1, \ldots, n$; $s = 1, \ldots, m_l - 1$, whose number is $n_2 = \sum_{l=1}^{n}(m_l - 1) = N - n$. Since the variables $z_{ls}$ are linear combinations of $y_{ls}$, applying Lemma 5.2 we obtain $r(Q_2) \le n_2 = N - n$.

Following a similar scheme for $Q_1$ and applying Lemma 5.2, we find $r(Q_1) \le n_1 = n - p_0$. Indeed, after some transformations the quadratic form $Q_1$ in the variables $y_{ls}$ can be written in the form $Q_1 = z'Tz$, where $z$ is an $n$-dimensional vector and $r(T) = n - p_0$.

On the basis of the consequence of Lemma 5.1, since $n_1 + n_2 + n_3 = N$, we obtain

$$r(Q_1) = n - p_0; \quad r(Q_2) = N - n; \quad r(Q_3) = p_0.$$

Considering that the random variables $y_{ls}$, $l = 1, \ldots, n$, $s = 1, \ldots, m_l$, are independent and normally distributed with means $\mu_{ls} = \mu_l$, the transition from equality (5.1) to the equality

$$\frac{y'y}{\sigma^2} = \frac{Q_1}{\sigma^2} + \frac{Q_2}{\sigma^2} + \frac{Q_3}{\sigma^2}$$

allows us to apply Cochran's theorem. Under this theorem the random variables $Q_1/\sigma^2$, $Q_2/\sigma^2$ and $Q_3/\sigma^2$ are independent and have noncentral $\chi^2$-distributions with $n - p_0$, $N - n$ and $p_0$ degrees of freedom, respectively. Thus the independence of $Q_1$ and $Q_2$ is proved.

Remark. Applying Cochran's theorem to the calculation of the noncentrality parameter $\lambda_2^2$ of the quadratic form $Q_2/\sigma^2$, it is easy to verify that, whether the hypothesis $H_0$ is true or not, $\lambda_2^2 = 0$, i.e. the quantity $u_2 = Q_2/\sigma^2$ has the central $\chi^2$-distribution:

$$\lambda_2^2 = \frac{1}{\sigma^2}\sum_{l=1}^{n}\sum_{s=1}^{m_l}\left(\mu_{ls} - \frac{1}{m_l}\sum_{s=1}^{m_l}\mu_{ls}\right)^2 = \frac{1}{\sigma^2}\sum_{l=1}^{n}\sum_{s=1}^{m_l}\left(\mu_l - \mu_l\right)^2 = 0.$$

6. References

[1] V. S. Asaturyan, "The Theory of Planning an Experiment," Radio i Svyaz, Vol. 73, No. 3, 1983, pp. 35-241.

[2] V. A. Kolemaev, O. V. Staroverov and A. S. Turundaevski, "The Probability Theory and Mathematical Statistics," Vyishaya Shkola, Moscow, 1991, pp. 16-34.

[3] O. I. Teskin, "Statistical Processing and Planning an Experiment," MVTU, Moscow, 1982, pp. 12-26.

[4] N. I. Sidnyaev, V. A. Levin and N. E.
Afonina, "Mathematical Modeling of Intensity of Heat Transmission by Means of the Theory of Planning an Experiment," Inzhenerno Fizicheskii Gurnal (IFG), Vol. 75, No. 2, 2002, pp. 132-138.

[5] N. I. Sidnyaev, "The Theory of Planning Experiment and Analysis of Statistical Data," URight, Moscow, 2011, pp. 95-220.