ASYMPTOTIC DERIVATION OF THE Z* STATISTIC

Abstract: In multivariate analysis, the stability of the correlation matrix is a major issue. The most popular and widely used tests in the literature are the Jennrich test and Box's M test, introduced by Jennrich in 1970 and Box in 1949. The Jennrich test involves the inverse of a matrix, and Box's M test involves its determinant; consequently, the computation of these tests is quite cumbersome when the data are high-dimensional. This motivates us to propose a new statistical test, constructed from the upper-off-diagonal elements of the matrix to overcome this difficulty.

To test the stability of a correlation or covariance matrix, the most widely used tests are [2] and [8]; see, for example, [4], [12], [11], [18], [6], [5]. These tests apply to several independent samples of correlation matrices drawn from a p-variate normal distribution. They are constructed as likelihood ratio tests (LRT), whose distributions under H_0 were derived asymptotically. Moreover, these tests assess the stability of the correlation matrix simultaneously.
However, the number of variables is usually large. In that case the computation of these tests becomes tedious, since they involve the determinant and inverse of the matrix, which lowers computational efficiency [7].
We construct our test using the vec operator and the commutation matrix. The test is based on a linear transformation that maps the matrix to a vector whose elements are the upper-off-diagonal entries only, which ensures the nonsingularity of the resulting matrix.
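The two tools named above can be illustrated concretely. The following minimal sketch (an illustration, not part of the paper's derivation) builds the commutation matrix K_pp and checks its defining identity K vec(A) = vec(A'):

```python
import numpy as np

def commutation_matrix(p):
    """Build K_pp such that K @ vec(A) = vec(A.T) for any p x p matrix A."""
    K = np.zeros((p * p, p * p))
    for i in range(p):
        for j in range(p):
            # vec stacks columns: A[i, j] sits at position j*p + i of vec(A)
            K[i * p + j, j * p + i] = 1.0
    return K

p = 3
A = np.arange(1.0, p * p + 1).reshape(p, p)
K = commutation_matrix(p)
vec = lambda M: M.reshape(-1, order="F")  # column-stacking vec operator

assert np.allclose(K @ vec(A), vec(A.T))
```

Since K merely permutes the entries of vec(A), it is a symmetric permutation matrix and its own inverse.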

Asymptotic Derivation
To construct the statistical test we use the asymptotic distribution of the correlation matrix developed in [3] and [14]. Let X_1, X_2, ..., X_n be a random sample of size n drawn from a p-variate normal distribution with positive definite covariance matrix Σ, and let S_d and Σ_d be the diagonal matrices whose elements are the diagonal elements of S and Σ, respectively. Then the correlation matrices R and Ω are

R = S_d^(-1/2) S S_d^(-1/2) and Ω = Σ_d^(-1/2) Σ Σ_d^(-1/2).
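The standardization of S by its diagonal can be sketched numerically; the sample, seed, and dimensions below are hypothetical, chosen only to verify that dividing S elementwise by the outer product of its diagonal standard deviations reproduces the sample correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 4                      # hypothetical sample size and dimension
X = rng.standard_normal((n, p))    # stand-in for a p-variate normal sample

S = np.cov(X, rowvar=False)        # sample covariance matrix S
d = np.sqrt(np.diag(S))            # square roots of diag(S)
R = S / np.outer(d, d)             # R = S_d^(-1/2) S S_d^(-1/2)

assert np.allclose(np.diag(R), 1.0)                    # unit diagonal
assert np.allclose(R, np.corrcoef(X, rowvar=False))    # matches corrcoef
```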

Asymptotic Distribution of vec(R)
The asymptotic distribution of the sample correlation matrix R will be examined first for p = 2 and then for p > 2. For that purpose we use the results presented in [1], p. 132, Theorem 4.2.3.

Theorem 1. Let U(n) be a sequence of p-component random vectors and b a fixed vector such that √n (U(n) − b) has the limiting distribution N(0, T) as n → ∞. Let f(u) be a vector-valued function such that each component has a nonzero differential at u = b. Then √n (f(U(n)) − f(b)) has the limiting distribution N(0, (∂f(b)/∂b') T (∂f(b)/∂b')').
For p = 2, the correlation matrix has unit diagonal and off-diagonal entries r_12 = r_21; it is transformed into a vector whose single element is r_12, the upper-off-diagonal element of the matrix. This transformation was developed in [16], and we develop it further here. Before we start, however, we need to identify the distribution of the sample correlation matrix R. Using the vec operator and the commutation matrix, it is formulated in the following proposition.
Proposition 1. Let X_1, X_2, ..., X_n be a random sample of size n drawn from a p-variate normal distribution. According to the Central Limit Theorem, we then have the following proposition for the covariance matrix of vec(S).
Proposition 2. As n → ∞, by the Central Limit Theorem, the asymptotic distribution of vec(S) is

√n (vec(S) − vec(Σ)) → N(0, (I + K)(Σ ⊗ Σ)),

where K is the commutation matrix.
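Under normality, the asymptotic covariance of √n vec(S − Σ) is the standard matrix (I + K)(Σ ⊗ Σ) (see, e.g., Schott, 1997), whose entries reduce elementwise to cov(s_ij, s_kl) = σ_ik σ_jl + σ_il σ_jk. A minimal numerical check of that identity, with an arbitrary positive definite Σ chosen for illustration:

```python
import numpy as np

def commutation_matrix(p):
    """K_pp with K @ vec(A) = vec(A.T), vec stacking columns."""
    K = np.zeros((p * p, p * p))
    for i in range(p):
        for j in range(p):
            K[i * p + j, j * p + i] = 1.0
    return K

p = 3
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])   # an arbitrary positive definite Sigma

V = (np.eye(p * p) + commutation_matrix(p)) @ np.kron(Sigma, Sigma)

# vec(S)[j*p + i] corresponds to s_ij; check cov(s_ij, s_kl) entry by entry
for i in range(p):
    for j in range(p):
        for k in range(p):
            for l in range(p):
                expected = Sigma[i, k] * Sigma[j, l] + Sigma[i, l] * Sigma[j, k]
                assert np.isclose(V[j * p + i, l * p + k], expected)
```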

Asymptotic Distribution of v(R U )
The correlation matrix is symmetric, so it contains redundant elements [15]. To eliminate them, we consider only the upper-off-diagonal elements of the matrix, which we denote v(R_U). For this purpose we present a linear transformation matrix T that eliminates the non-random elements in R.

T = (T_1, T_2, ..., T_p), with blocks T_a = (t^a_{i,j}), each of size (k × p), where T_1 is the zero matrix, a = 2, 3, ..., p, and C_2^a is the number of combinations of 2 out of a objects. We use this asymptotic distribution of R to derive the asymptotic distribution of our test.
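A concrete sketch of a selection matrix of this kind is given below. It is my own minimal construction, assuming a column-stacking vec and row-by-row ordering of the upper-off-diagonal pairs; the paper's block partition of T may differ, but the action T vec(R) = v(R_U) is the same:

```python
import numpy as np

def selection_matrix(p):
    """Build T of size (p(p-1)/2, p^2) so that T @ vec(R) = v(R_U),
    the upper-off-diagonal elements of a p x p matrix R."""
    k = p * (p - 1) // 2
    T = np.zeros((k, p * p))
    row = 0
    for i in range(p):
        for j in range(i + 1, p):
            T[row, j * p + i] = 1.0   # vec(R)[j*p + i] = R[i, j]
            row += 1
    return T

p = 4
R = np.eye(p)
R[np.triu_indices(p, 1)] = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
R = R + R.T - np.eye(p)               # symmetric correlation-like matrix

T = selection_matrix(p)
v = T @ R.reshape(-1, order="F")      # v(R_U)

assert np.allclose(v, R[np.triu_indices(p, 1)])
```

Applying T discards both the unit diagonal and the duplicated lower triangle, leaving only the p(p−1)/2 distinct random elements.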
Corollary 1. Let λ be a matrix of size p × p such that vec(λ) = T' T vec(Ω). Then, by using Corollary 1, we have the following. We define D_W as the diagonal matrix whose diagonal elements are the diagonal elements of W (Schott, 1997).

Proposed Test
As mentioned above, the limitation of those tests is that they involve the determinant of the sample matrix, i.e., the generalized variance (GV), as a measure of multivariate dispersion. Because of this measure, the tests are quite cumbersome when the data are high-dimensional. Our proposed test uses the vector variance as a measure of multivariate dispersion, which is equal to the sum of squares of the elements of the sample correlation matrix.
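The vector variance described above is simply the sum of squared elements, i.e. the squared Frobenius norm, which requires no determinant or matrix inverse; a minimal sketch with an example correlation matrix:

```python
import numpy as np

R = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])      # an example correlation matrix

# Vector variance: sum of squares of all elements of R, equal to ||R||_F^2
# and to tr(R R'); an O(p^2) computation with no determinant or inverse.
vv = np.sum(R ** 2)

assert np.isclose(vv, np.trace(R @ R.T))
assert np.isclose(vv, np.linalg.norm(R, "fro") ** 2)
```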
According to [13], using multivariate statistical process control (MSPC), the hypothesis test for the stability of the correlation structure, H_0: Ω_1 = Ω_2 = ... = Ω_m versus H_1: Ω_i ≠ Ω_j for at least one pair (i, j), is equivalent to repeating the test H_0: Ω_i = Ω_0 versus H_1: Ω_i ≠ Ω_0 for i = 1, 2, ..., m, where Ω_0 is the reference sample. The proposed test rejects the null hypothesis H_0 at significance level α when |Z_i| > z_{α/2}, the (1 − α/2)th quantile of the standard normal distribution. However, when Ω is unknown it must be estimated from the independent random samples as the average R̄ of the correlation matrices R_1, R_2, ..., R_m.
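The two-sided rejection rule above (reject H_0 when |Z_i| exceeds z_{α/2}) can be sketched with the standard-library normal quantile; the Z_i values below are hypothetical, for illustration only:

```python
from statistics import NormalDist

def reject(z_i, alpha=0.05):
    """Two-sided rule: reject H0 when |z_i| exceeds the (1 - alpha/2) quantile."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    return abs(z_i) > z_crit

# hypothetical Z_i values for m = 4 repeated tests against the reference Omega_0
print([reject(z) for z in (0.8, -2.3, 1.5, 3.1)])   # → [False, True, False, True]
```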

Conclusion
In this paper, the asymptotic distribution of correlation matrices was derived.
The distribution can be approximated by the standard normal distribution. The upper-off-diagonal elements of the matrix are used to handle the case when p is large.