ST323/ST412 Assignment 1 (2022-23)
Published: 2022-10-29
Q1. Let $\mathcal{S}_p$ be the set of positive semi-definite (symmetric) $p \times p$ matrices, and write $\lambda_j(A)$ for the $j$th largest eigenvalue of $A \in \mathcal{S}_p$, for each $j = 1, \ldots, p$. Recall the definition of the Frobenius norm $\|A\|_F = \bigl( \sum_{i,j=1}^{p} A_{ij}^2 \bigr)^{1/2}$. In this question we will compare $\|\cdot\|_F$ to another norm on the set of positive semi-definite matrices.
(a) Prove that for all $A \in \mathcal{S}_p$ we have $\|A\|_F^2 = \operatorname{Tr}(A^\top A)$.
(b) Use the Spectral Decomposition Theorem and part (a) to prove that $\|A\|_F^2 = \lambda_1(A)^2 + \cdots + \lambda_p(A)^2$.
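As a quick numerical sanity check (not part of the assessed answer), the identities in (a) and (b) can be verified for an arbitrary positive semi-definite example built as $A = BB^\top$ — a sketch using NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5
B = rng.standard_normal((p, p))
A = B @ B.T                          # positive semi-definite by construction

fro_sq = np.sum(A**2)                # ||A||_F^2 as a sum of squared entries
trace_form = np.trace(A.T @ A)       # part (a): Tr(A^T A)
eigvals = np.linalg.eigvalsh(A)      # part (b): eigenvalues of A

assert np.isclose(fro_sq, trace_form)
assert np.isclose(fro_sq, np.sum(eigvals**2))
```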
Define the function $f : \mathcal{S}_p \to \mathbb{R}$ by $f(A) = \sup_{x \in \mathbb{R}^p : \|x\| = 1} x^\top A x$.
(c) Prove that for all $A, B \in \mathcal{S}_p$ we have $f(A) \ge 0$ and $f(A + B) \le f(A) + f(B)$.
(d) Give an expression for $f(A)$ in terms of the spectral decomposition of $A$ and show that $p^{-1/2} \|A\|_F \le f(A) \le \|A\|_F$ for all $A \in \mathcal{S}_p$. [Hint: you may use Theorem 4.2.3 and its proof without repeating the calculations.]
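The claims in (c) and (d) can also be checked numerically. The sketch below assumes (as (d) asks you to show) that $f(A)$ equals the largest eigenvalue $\lambda_1(A)$ when $A$ is positive semi-definite, and checks the equivalent bounds $\lambda_1(A) \le \|A\|_F \le \sqrt{p}\,\lambda_1(A)$:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 6
M1 = rng.standard_normal((p, p))
M2 = rng.standard_normal((p, p))
A, B = M1 @ M1.T, M2 @ M2.T          # two positive semi-definite matrices

def top_eig(S):
    """Largest eigenvalue; equals f(S) for positive semi-definite S."""
    return np.linalg.eigvalsh(S)[-1]

lam1, fro = top_eig(A), np.linalg.norm(A, 'fro')

# (c): f(A) >= 0 and f is subadditive.
assert lam1 >= 0
assert top_eig(A + B) <= top_eig(A) + top_eig(B) + 1e-9

# No unit vector beats the top eigenvalue in the quadratic form x^T A x.
xs = rng.standard_normal((1000, p))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
assert np.all(np.einsum('ij,jk,ik->i', xs, A, xs) <= lam1 + 1e-9)

# Two-sided comparison between f and the Frobenius norm on PSD matrices.
assert lam1 <= fro <= np.sqrt(p) * lam1
```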
Q2. In this question we will show that, when we have an i.i.d. sample, the sample covariance matrix is an unbiased estimator of the population covariance matrix. Recall the definition of the $n \times n$ centring matrix $H = I_n - n^{-1} \mathbf{1}\mathbf{1}^\top$ and the fact that $H$ is an orthogonal projection matrix (shown in Exercise 4.6).
(a) Give the eigenvalues of H and their multiplicities.
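A numerical check of the centring matrix's spectrum (a sketch, not required for the answer — but useful for verifying what (a) asks for):

```python
import numpy as np

n = 7
H = np.eye(n) - np.ones((n, n)) / n   # H = I_n - n^{-1} 1 1^T

assert np.allclose(H, H.T)            # symmetric
assert np.allclose(H @ H, H)          # idempotent: an orthogonal projection

eigvals = np.sort(np.linalg.eigvalsh(H))
assert np.isclose(eigvals[0], 0.0)    # eigenvalue 0 with multiplicity 1
assert np.allclose(eigvals[1:], 1.0)  # eigenvalue 1 with multiplicity n - 1
```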
Let $X$ be an $n \times p$ data matrix, where the rows of $X$ are independent and identically distributed from some distribution on $\mathbb{R}^p$ with covariance matrix $\Sigma$.
(b) If $u \in \mathbb{R}^n$, show that $X^\top u$ has covariance matrix $\|u\|^2 \Sigma$.
(c) Using the Spectral Decomposition Theorem and part (a), show that there exist $u_1, \ldots, u_{n-1}$ satisfying $\|u_\ell\| = 1$ and $u_\ell^\top \mathbf{1} = 0$ for all $\ell = 1, \ldots, n-1$ such that
$$X^\top H X = \sum_{\ell=1}^{n-1} (u_\ell^\top X)^\top (u_\ell^\top X).$$
(d) Use (b) and (c) to show that $\mathbb{E}(X^\top H X) = (n - 1)\Sigma$.
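The decomposition in (c) and the $(n-1)$ scaling in (d) can be illustrated numerically. In this sketch the $u_\ell$ are taken to be the eigenvectors of $H$ with eigenvalue 1, and $X^\top H X$ is compared with $(n-1)$ times the usual sample covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 10, 3
X = rng.standard_normal((n, p))
H = np.eye(n) - np.ones((n, n)) / n

# Eigenvectors of H with eigenvalue 1 are orthogonal to the all-ones vector.
eigvals, eigvecs = np.linalg.eigh(H)
U = eigvecs[:, np.isclose(eigvals, 1.0)]   # n x (n-1); columns u_1, ..., u_{n-1}
assert np.allclose(U.T @ np.ones(n), 0.0)

# X^T H X = sum over l of (u_l^T X)^T (u_l^T X), as in part (c).
lhs = X.T @ H @ X
rhs = sum(np.outer(u @ X, u @ X) for u in U.T)
assert np.allclose(lhs, rhs)

# X^T H X equals (n - 1) times the sample covariance matrix.
assert np.allclose(lhs, (n - 1) * np.cov(X, rowvar=False))
```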
Q3. In this question we will prove part of Theorem 4.3.3 on Canonical Correlation Analysis. Singular
value decompositions and some of the ideas from the proof of Theorem 4.2.3 on Principal Component Analysis may be helpful.
Let $X$ and $Y$ be two random vectors taking values in $\mathbb{R}^p$ and $\mathbb{R}^q$, respectively, where we assume $\operatorname{Cov}(X) = I_p$ and $\operatorname{Cov}(Y) = I_q$. Write $\Sigma$ for their $p \times q$ cross-covariance matrix.
(a) For $a \in \mathbb{R}^p$ and $b \in \mathbb{R}^q$ give, with proof, an expression for $\operatorname{Cov}(a^\top X, b^\top Y)$ in terms of $\Sigma$.
(b) Clearly stating any results you use from the notes, use (a) to maximise $\operatorname{Cov}(a^\top X, b^\top Y)$ over $a \in \mathbb{R}^p$ with $\|a\| = 1$ and $b \in \mathbb{R}^q$ with $\|b\| = 1$. Give the maximising values of $a$ and $b$ and the maximal value of $\operatorname{Cov}(a^\top X, b^\top Y)$.
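In this identity-covariance setting, part (a) gives $\operatorname{Cov}(a^\top X, b^\top Y) = a^\top \Sigma b$, and the maximum of $a^\top \Sigma b$ over unit vectors is the top singular value of $\Sigma$, attained at the leading singular vectors. This purely linear-algebraic claim is easy to check numerically (a sketch; $\Sigma$ here is an arbitrary matrix used only to illustrate the optimisation):

```python
import numpy as np

rng = np.random.default_rng(3)
p, q = 4, 3
Sigma = rng.standard_normal((p, q))     # arbitrary p x q matrix standing in for Sigma

U, s, Vt = np.linalg.svd(Sigma)
a, b = U[:, 0], Vt[0]                   # leading left- and right-singular vectors

assert np.isclose(a @ Sigma @ b, s[0])  # attains the top singular value

# Random unit vectors never exceed the top singular value.
for _ in range(1000):
    u = rng.standard_normal(p); u /= np.linalg.norm(u)
    v = rng.standard_normal(q); v /= np.linalg.norm(v)
    assert u @ Sigma @ v <= s[0] + 1e-9
```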
In the final parts of the question we do not assume that $\operatorname{Cov}(X) = I_p$ and $\operatorname{Cov}(Y) = I_q$. Write $\Sigma_X = \operatorname{Cov}(X)$ and $\Sigma_Y = \operatorname{Cov}(Y)$ and assume that both these matrices are positive definite.
(c) For $a \in \mathbb{R}^p$ and $b \in \mathbb{R}^q$ give, with proof, an expression for the correlation (not covariance) $\operatorname{Cor}(a^\top X, b^\top Y)$ in terms of $\Sigma_X$, $\Sigma_Y$ and $\Sigma$.
(d) By considering $U = \Sigma_X^{-1/2} X$ and $V = \Sigma_Y^{-1/2} Y$ and using part (b), or otherwise, show that $\operatorname{Cor}(a^\top X, b^\top Y)$ is maximised by taking $a = \Sigma_X^{-1/2} c_1$ and $b = \Sigma_Y^{-1/2} d_1$, where $c_1$ and $d_1$ are the leading left- and right-singular vectors of $\Sigma_X^{-1/2} \Sigma \Sigma_Y^{-1/2}$.
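The final part can be illustrated numerically: build a valid joint covariance for $(X, Y)$, whiten the cross-covariance, and check that $a = \Sigma_X^{-1/2} c_1$, $b = \Sigma_Y^{-1/2} d_1$ achieve a correlation equal to the top singular value of the whitened matrix $\Sigma_X^{-1/2} \Sigma\, \Sigma_Y^{-1/2}$ (a sketch; the joint covariance below is an arbitrary positive definite example):

```python
import numpy as np

rng = np.random.default_rng(4)
p, q = 3, 2
M = rng.standard_normal((p + q, p + q))
S = M @ M.T + np.eye(p + q)            # positive definite joint covariance of (X, Y)
Sx, Sy, Sxy = S[:p, :p], S[p:, p:], S[:p, p:]

def inv_sqrt(A):
    """Inverse symmetric square root of a positive definite matrix."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(w ** -0.5) @ V.T

K = inv_sqrt(Sx) @ Sxy @ inv_sqrt(Sy)  # whitened cross-covariance
U, s, Vt = np.linalg.svd(K)
a = inv_sqrt(Sx) @ U[:, 0]             # a = Sigma_X^{-1/2} c_1
b = inv_sqrt(Sy) @ Vt[0]               # b = Sigma_Y^{-1/2} d_1

# Cor(a^T X, b^T Y) using the part (c) expression; equals the top singular value.
cor = (a @ Sxy @ b) / np.sqrt((a @ Sx @ a) * (b @ Sy @ b))
assert np.isclose(cor, s[0])

# Random directions never give a larger correlation.
for _ in range(1000):
    u, v = rng.standard_normal(p), rng.standard_normal(q)
    c = (u @ Sxy @ v) / np.sqrt((u @ Sx @ u) * (v @ Sy @ v))
    assert c <= s[0] + 1e-9
```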