Stats 461-1 Homework 1
You are encouraged to collaborate with your classmates to solve the problems, but you must write up the solutions on your own.
1 Basic Information Theory
Problem 1. (Basics of Entropy)
(a) Establish the following inequality:
2H(X1, X2, X3) ≤ H(X1, X2) + H(X1, X3) + H(X2, X3).
(b) Solve the following IMO problem (year 1992, Problem 5). Let S be a finite set of points in three-dimensional space. Let Sx, Sy, Sz be the sets of points obtained by orthogonally projecting S onto the yz-plane, the xz-plane, and the xy-plane, respectively. Show that
|S|^2 ≤ |Sx||Sy||Sz|.
Above, the notation |T| denotes the cardinality of a set T.
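The inequality in (a) is a special case of Shearer's lemma, and you can sanity-check it numerically before proving it. A minimal sketch (Python; the helper names `H` and `marginal` are ours) draws a random joint pmf on {0,1}^3 and compares the two sides:

```python
import itertools, math, random

def H(pmf):
    """Shannon entropy (nats) of a pmf given as {outcome: prob}."""
    return -sum(p * math.log(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint pmf, keeping only the coordinates in `axes`."""
    out = {}
    for x, p in joint.items():
        key = tuple(x[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

random.seed(0)
# A random joint pmf on {0,1}^3.
w = [random.random() for _ in range(8)]
Z = sum(w)
joint = {x: wi / Z for x, wi in zip(itertools.product((0, 1), repeat=3), w)}

lhs = 2 * H(joint)
rhs = (H(marginal(joint, (0, 1)))
       + H(marginal(joint, (0, 2)))
       + H(marginal(joint, (1, 2))))
print(lhs <= rhs + 1e-12)  # True: the inequality holds for this pmf
```

Repeating with different seeds never produces a violation, which is reassuring but of course not a proof.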
Problem 2. (Computation of Mutual Information)
Let (Xp, Yp) be uniformly distributed on the unit ℓp ball Bp = {(x, y) : |x|^p + |y|^p ≤ 1}. Also define the ℓ∞ ball B∞ = {(x, y) : |x| ≤ 1, |y| ≤ 1}.
(a) Compute I(Xp;Yp) for p = 1/2, p = 1 and p = ∞ .
(b) Compute limp→0+ I(Xp;Yp).
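One way to cross-check your answer for p = 1: the joint density is uniform (equal to 1/2) on a set of area 2, and integrating out y gives the marginal 1 − |x| for X, so I(X; Y) = 2h(X) − log 2. The sketch below (our code, not part of the problem) estimates h(X) with a midpoint rule and compares against the closed form 1 − log 2 nats that this calculation yields:

```python
import math

# Marginal density of X when (X, Y) is uniform on the l1 ball |x|+|y| <= 1:
# integrating the joint density 1/2 over y in [-(1-|x|), 1-|x|] gives 1 - |x|.
def f_marg(x):
    return max(1.0 - abs(x), 0.0)

# Differential entropy h(X) in nats via a midpoint rule on [-1, 1].
N = 200_000
dx = 2.0 / N
hX = 0.0
for k in range(N):
    x = -1.0 + (k + 0.5) * dx
    f = f_marg(x)
    if f > 0:
        hX -= f * math.log(f) * dx

hXY = math.log(2.0)          # joint is uniform on a set of area 2
I = 2 * hX - hXY             # h(X) + h(Y) - h(X, Y), with h(Y) = h(X)
print(I, 1 - math.log(2.0))  # both ≈ 0.3069
```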
Problem 3. (Computation of KL-Divergence)
(a) Prove the following identity:
DKL(Binom(n, p) || Binom(n, q)) = n DKL(Ber(p) || Ber(q)).
(b) Prove the following identity:
DKL(Poisson(nλ1) || Poisson(nλ2)) = n DKL(Poisson(λ1) || Poisson(λ2)).
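Both identities can be checked numerically for small parameters before you prove them. A minimal sketch (Python; the helper `kl` and the pmf builders are ours, and the Poisson pmf is truncated at 60 terms, which is ample for the means used):

```python
import math

def kl(p_pmf, q_pmf):
    """KL divergence (nats) between two pmfs on a common support."""
    return sum(p * math.log(p / q) for p, q in zip(p_pmf, q_pmf) if p > 0)

def binom_pmf(n, p):
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def poisson_pmf(lam, K=60):
    # Truncated at K terms; the tail mass is negligible for small lam.
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(K)]

# (a) Binomial tensorization.
n, p, q = 7, 0.3, 0.6
lhs = kl(binom_pmf(n, p), binom_pmf(n, q))
rhs = n * kl([1 - p, p], [1 - q, q])
print(abs(lhs - rhs) < 1e-12)  # True

# (b) Poisson tensorization.
lam1, lam2, n2 = 0.8, 1.5, 6
lhs2 = kl(poisson_pmf(n2 * lam1), poisson_pmf(n2 * lam2))
rhs2 = n2 * kl(poisson_pmf(lam1), poisson_pmf(lam2))
print(abs(lhs2 - rhs2) < 1e-9)  # True (up to truncation/rounding)
```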
Problem 4. (Information Lost in Erasure Channel)
Let X, Y be a pair of random variables with I(X; Y) < ∞. Let Z be obtained from Y by passing the letter through an erasure channel, i.e., X − Y − Z where
Z = Y with probability δ, and Z = e (an erasure symbol) with probability 1 − δ, independently of (X, Y).
Prove that I(X; Z) = δ I(X; Y).
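A small discrete example makes the identity concrete. The sketch below (Python; `mi` is our helper, and we take δ to be the probability that the letter survives, the convention that matches the stated identity) computes both sides exactly for a correlated binary pair:

```python
import math

def mi(joint):
    """I(A; B) in nats from a joint pmf dict {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# A small correlated pair (X, Y) on {0,1} x {0,1}.
pXY = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

delta = 0.35  # probability the letter survives
# Z = Y with prob delta, Z = 'e' (erasure) otherwise, independent of (X, Y).
pXZ = {}
for (x, y), p in pXY.items():
    pXZ[(x, y)] = pXZ.get((x, y), 0.0) + delta * p
    pXZ[(x, 'e')] = pXZ.get((x, 'e'), 0.0) + (1 - delta) * p

print(abs(mi(pXZ) - delta * mi(pXY)) < 1e-9)  # True: the identity holds
```

Note that the erasure rows of pXZ are product-form (Z = e is independent of X), which is exactly why they contribute nothing to I(X; Z).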
Problem 5. (Hewitt-Savage 0-1 Law)
The Hewitt-Savage 0-1 law states that certain symmetric events under the i.i.d. model have no randomness. Let {Xi}i∈N be a sequence of i.i.d. random variables. Let E be an event determined by this sequence. We say E is exchangeable if it is invariant under permutations of finitely many indices in the sequence {Xi}, e.g., the occurrence of E is unchanged if we permute the values of (X1, X4, X7), etc.
Let’s prove the Hewitt-Savage 0-1 law information-theoretically in the following steps:
• (Warm-up) Verify that E = {Σi Xi converges} and E = {limn→∞ (1/n) Σi≤n Xi = E[X]} are exchangeable events.
• Let E be an exchangeable event and let W = 1E be its indicator random variable. Show that for any k, I(W; X1, . . . , Xk) = 0.
• Since E is determined by the sequence {Xi}i≥1, use a suitable continuity property of mutual information to argue that I(W; W) = 0. Hence P(E) ∈ {0, 1}.
• (Application to random walk) Often, after the application of Hewitt-Savage, further effort is needed to determine whether the probability is 0 or 1.
As an example, suppose the Xi’s are i.i.d. ±1 and Sn = Σi≤n Xi denotes the symmetric random walk. Verify that the event E = {Sn = 0 finitely often} is exchangeable. Now show that P(E) = 0. (Hint: consider E+ = {Sn > 0 eventually} and E− similarly. Apply Hewitt-Savage and invoke symmetry.)
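Simulation cannot prove P(E) = 0, but it can illustrate it: the expected number of returns to 0 keeps growing with the walk length (like √(2n/π)). A minimal sketch, with our helper `zero_visits`:

```python
import random

def zero_visits(n, rng):
    """Number of returns to 0 of a length-n symmetric +-1 random walk."""
    s, visits = 0, 0
    for _ in range(n):
        s += rng.choice((-1, 1))
        if s == 0:
            visits += 1
    return visits

rng = random.Random(0)
trials = 200
avg_short = sum(zero_visits(100, rng) for _ in range(trials)) / trials
avg_long = sum(zero_visits(10_000, rng) for _ in range(trials)) / trials
print(avg_short, avg_long)  # returns to 0 accumulate as n grows
```

The averages come out near 8 and 80 respectively, consistent with returns happening infinitely often.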
Problem 6. (Extremization of Entropy and Divergence)
• Find the continuous distribution of X, supported on [0, ∞), that
maximize h(X)
subject to E[mgX] = µ .
Above, m, g, µ are positive constants. According to Richard Feynman (The Feynman Lectures on Physics, Volume I, Chapter 40), the density of the atmosphere roughly has a distribution of this form.
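If you guess that the maximizer on [0, ∞) is an exponential distribution (matching Feynman's barometric density ∝ exp(−mgh/kT)), a quick comparison against another distribution with the same mean is consistent with that guess; the formula h = 1 + log c for an exponential with mean c is standard. A sketch under that assumption:

```python
import math

# Under E[mg X] = mu with X >= 0, the candidate maximizer is exponential
# with mean c = mu/(m*g); its differential entropy is 1 + ln c (nats).
m, g, mu = 2.0, 9.8, 3.0
c = mu / (m * g)            # required mean of X
h_exp = 1 + math.log(c)

# Any competitor with the same mean should have smaller entropy, e.g.
# Uniform[0, 2c] (mean c) has h = ln(2c).
h_unif = math.log(2 * c)
print(h_exp > h_unif)       # True, since 1 > ln 2
```

This only compares two distributions, of course; the problem asks for a proof over all distributions satisfying the constraint.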
• Find the distribution X that
minimize DKL(X||N(0, 1))
subject to E[X] ≥ t .
Above, t > 0 is given. The minimal value, due to Sanov’s theorem, can be interpreted as the exponent of a large deviation. Indeed, compute limn→∞ −(1/n) log P(X̄n ≥ t) for independent copies X1, . . . , Xn ∼ N(0, 1), where X̄n = (Σi≤n Xi)/n.
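Since X̄n ∼ N(0, 1/n), the tail probability P(X̄n ≥ t) can be evaluated exactly with the complementary error function, so you can watch the exponent converge numerically and compare against the limit t²/2 expected from the Gaussian rate function. A minimal sketch (our code):

```python
import math

def exponent(t, n):
    """-(1/n) * log P(mean of n iid N(0,1) >= t), computed via erfc."""
    # X_bar_n ~ N(0, 1/n), so P(X_bar_n >= t) = Q(t*sqrt(n)),
    # and Q(x) = 0.5 * erfc(x / sqrt(2)).
    tail = 0.5 * math.erfc(t * math.sqrt(n) / math.sqrt(2.0))
    return -math.log(tail) / n

t = 1.0
for n in (10, 100, 400):
    print(n, exponent(t, n))  # decreases toward t**2 / 2 = 0.5
```

We stop at n = 400 because for much larger n the tail underflows double precision; by then the exponent is already within about 1% of 0.5.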
2023-04-15