PSTAT 120A, Summer 2022: Practice Problems 10: Final Review, Part I
Week 2
Conceptual Review
(a) Review the conceptual questions from the previous Discussion Worksheets!
(b) What are the different notions of conditional p.m.f.’s?
(c) What is the difference between E[X | Y = y] and E[X | Y]?
(d) What is the Law of Iterated Expectation? What is the analog for variances?
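The two identities asked about in part (d) can be stated as follows (standard results; the conditioning variable Y is arbitrary):

```latex
% Law of Iterated Expectation (tower property)
\mathbb{E}[X] = \mathbb{E}\big[\,\mathbb{E}[X \mid Y]\,\big]

% Analog for variances: the Law of Total Variance
\mathrm{Var}(X) = \mathbb{E}\big[\mathrm{Var}(X \mid Y)\big]
              + \mathrm{Var}\big(\mathbb{E}[X \mid Y]\big)
```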
1 Conditional Distributions and Expectations
Problem 1: Continuous Conditioning
Let (X, Y) be a continuous bivariate random vector with joint p.d.f. given by
fX,Y (x, y) =
where c > 0 is an as-of-yet undetermined constant.
(a) Find the value of c.
(b) Find fY (y), the marginal p.d.f. of Y.
(c) Find fX|Y (x | y), the conditional density of X given Y = y.
(d) Find fX (x).
(e) Verify your answer to part (d).
Problem 2: Discrete Conditioning
Let (X, Y) be a discrete bivariate random vector with joint p.m.f. given by
fX,Y (x, y) =
where c > 0 is an as-of-yet undetermined constant.
(a) Find the value of c.
(b) Find pY (y), the marginal p.m.f. of Y.
(c) Find pX|Y (x | y), the conditional p.m.f. of X given Y = y.
(d) Compute pX (x), and determine whether or not X and Y are independent. Try
to make an argument using only your answer to part (c), and pX (x).
Problem 3: Iterations!
In each of the following parts, you will be provided with the conditional distribution
of (X | Y) and the marginal distribution of Y. Using the provided information,
compute E[X] and Var(X).
(a) (X | Y) ∼ Bin(Y, p); Y ∼ Pois(µ)
(b) (X | Y) ∼ Exp(1/Y); Y ∼ Gamma(r, λ)
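For part (a), a quick Monte Carlo check of whatever E[X] and Var(X) you derive via iterated expectation and total variance; the parameter values µ = 3, p = 0.4 here are arbitrary choices for illustration:

```python
import math
import random

def sample_poisson(mu):
    """Sample from Pois(mu) via Knuth's multiplication algorithm."""
    threshold = math.exp(-mu)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

random.seed(0)
mu, p = 3.0, 0.4          # arbitrary illustrative parameters
samples = []
for _ in range(200_000):
    y = sample_poisson(mu)                           # Y ~ Pois(mu)
    x = sum(random.random() < p for _ in range(y))   # (X | Y = y) ~ Bin(y, p)
    samples.append(x)

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(f"empirical E[X]   ≈ {mean:.3f}")
print(f"empirical Var(X) ≈ {var:.3f}")
```

Compare the printed values against your closed-form answers expressed in terms of µ and p.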
2 General Problems (includes Cond. Distn’s)
Problem 4: Axiomatic Proof
Given a probability space (Ω, F, P) and events A, B ∈ F, prove the following
identity:
P(A | Bᶜ) = 1 − P(Aᶜ ∩ Bᶜ) / P(Bᶜ)
Problem 5: Variance of Sums
Using only first principles (i.e. taking care not to use any previously-derived results
pertaining to variance of sums of random variables), derive an expression for
Var ( ∑_{i=1}^{n} (−1)^i X_i )
Simplify as much as you can.
Problem 6: Faces of the Same Die
A fair k-sided die is rolled n times, where n and k are fixed natural numbers.
Problem 7: A Useful Result
Suppose X ∼ Pois(λ) and Y ∼ Pois(µ) with X ⊥ Y. Find P(X = k | X + Y = n), and hence identify the
distribution of (X | X + Y = n). Be sure to include any/all relevant parameter(s)!
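An empirical estimate of the conditional p.m.f. to compare against whatever distribution you derive; the choices λ = 2, µ = 3, n = 4 are arbitrary, and the conditioning is done by simple rejection sampling:

```python
import math
import random

def sample_poisson(mu):
    """Sample from Pois(mu) via Knuth's multiplication algorithm."""
    threshold = math.exp(-mu)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

random.seed(4)
lam, mu, n = 2.0, 3.0, 4   # arbitrary illustrative choices
counts = [0] * (n + 1)
kept = 0
while kept < 50_000:
    x, y = sample_poisson(lam), sample_poisson(mu)
    if x + y == n:          # keep only pairs satisfying {X + Y = n}
        counts[x] += 1
        kept += 1

print([round(c / kept, 3) for c in counts])  # empirical P(X = k | X + Y = n), k = 0..n
```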
Problem 8: Discrete Joint
Let (X, Y) be a discrete bivariate random vector with joint p.m.f. (probability mass
function) given by
pX,Y (x, y) =
where c > 0 is an as-of-yet undetermined constant.
(a) Find the value of c.
(b) Compute P(X = Y).
Problem 9: Discrete Convolution
Let X ∼ Geom(p1) and Y ∼ Geom(p2) with X ⊥ Y. Derive the p.m.f. of Z := X+Y.
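A numeric sanity check of the convolution formula pZ(z) = Σₓ pX(x) pY(z − x): whatever closed form you derive should agree with this direct summation. This sketch assumes the Geom(p) convention with support {1, 2, ...}; the values p₁ = 0.3, p₂ = 0.5 are arbitrary:

```python
p1, p2 = 0.3, 0.5  # arbitrary illustrative parameters

def pX(k):  # Geom(p1) p.m.f. on {1, 2, ...}
    return (1 - p1) ** (k - 1) * p1

def pY(k):  # Geom(p2) p.m.f. on {1, 2, ...}
    return (1 - p2) ** (k - 1) * p2

def pZ(z):
    # discrete convolution: x ranges over 1..z-1 so that y = z - x >= 1
    return sum(pX(x) * pY(z - x) for x in range(1, z))

# the p.m.f. of Z should sum to (approximately) 1 over its support {2, 3, ...}
total = sum(pZ(z) for z in range(2, 200))
print(f"sum of pZ over z = 2..199: {total:.6f}")
```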
Problem 10: Waitin’ in Line
Alex and Drew are waiting in two separate lines at Dean Coffee. Suppose that the
time it takes for Alex to reach the counter follows an Exp(λA) distribution and the
time it takes for Drew to reach the counter follows an Exp(λD) distribution. Further suppose
that the two lines move independently of each other. What is the probability that
Alex reaches the counter before Drew does?
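A Monte Carlo estimate to compare against your analytical answer; the rates λA = 2, λD = 1 are arbitrary stand-ins:

```python
import random

random.seed(1)
lam_A, lam_D = 2.0, 1.0   # arbitrary illustrative rates
n = 200_000
# count the trials in which Alex's (exponential) waiting time is the smaller one
wins = sum(
    random.expovariate(lam_A) < random.expovariate(lam_D)
    for _ in range(n)
)
print(f"P(Alex first) ≈ {wins / n:.3f}")
```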
Problem 11: Gamma Gamma Gamma
(a) Show that for any r > 1, Γ(r) = (r − 1)Γ(r − 1).
(b) Use part (a) to argue that Γ(n) = (n − 1)! whenever n ∈ N.
(c) Show that Γ(1/2) = √π.
(d) Compute Γ(5/2).
(e) Prove the following identity:
∫_{t=0}^{∞} t^{r−1} e^{−λt} dt = Γ(r) / λ^r
Hint: Don’t try to prove this directly; use probability!
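Python's math.gamma can numerically confirm your answers to parts (b)–(d):

```python
import math

# part (b): Gamma(n) = (n - 1)! for natural numbers n
for n in range(1, 7):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

# part (c): Gamma(1/2) should equal sqrt(pi)
print(math.gamma(0.5), math.sqrt(math.pi))

# part (d): compare the printed value against your hand computation of Gamma(5/2)
print(math.gamma(2.5))
```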
Problem 12: (ASV, 9.21)
Let X1 , · · · , X500 be i.i.d. random variables with expected value 2 and variance 3.
The random variables Y1 , · · · , Y500 are independent of the Xi variables, also i.i.d.,
but they have expected value 2 and variance 2. Use the CLT to estimate
P ( ∑_{i=1}^{500} X_i > ∑_{i=1}^{500} Y_i + 50 )
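The distributions of the Xi and Yi are not specified, only their moments, so any simulation must pick concrete ones. Since the CLT says the answer depends (approximately) only on the moments, one check is to sample the two sums directly from the normal approximations the CLT suggests; the N(1000, 1500) and N(1000, 1000) parameters below are those approximations, computed from the stated moments, not given data:

```python
import random

random.seed(2)
# CLT normal approximations: sum of 500 X_i has mean 500*2 = 1000, variance 500*3 = 1500;
#                            sum of 500 Y_i has mean 500*2 = 1000, variance 500*2 = 1000.
trials = 200_000
hits = sum(
    random.gauss(1000, 1500 ** 0.5) > random.gauss(1000, 1000 ** 0.5) + 50
    for _ in range(trials)
)
print(f"estimated probability ≈ {hits / trials:.3f}")
```

Compare the printed estimate with the value you obtain by standardizing and using the normal table.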
Problem 13: Wald's Identity
Prove Wald's Identity: if X1, X2, · · · are i.i.d. random variables with finite mean, and N is a
nonnegative integer-valued random variable independent of the Xi's (also with finite mean), then
E[ ∑_{i=1}^{N} X_i ] = E[N] · E[X1]
Note that we cannot directly apply linearity, since the upper index of summation is random.
Hint: Condition on {N = n}.
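A simulation check of the identity, with hypothetical choices X_i ∼ Exp(λ) and N ∼ Geom(q) picked only for illustration:

```python
import random

random.seed(3)
lam, q = 1.5, 0.25  # hypothetical parameters: X_i ~ Exp(lam), N ~ Geom(q) on {1, 2, ...}

def sample_geometric(q):
    """Number of Bernoulli(q) trials up to and including the first success."""
    k = 1
    while random.random() >= q:
        k += 1
    return k

trials = 100_000
total = 0.0
for _ in range(trials):
    n = sample_geometric(q)                                  # N, independent of the X_i
    total += sum(random.expovariate(lam) for _ in range(n))  # sum_{i=1}^{N} X_i

est = total / trials
print(f"empirical E[sum] ≈ {est:.3f}")
print(f"E[N] * E[X1]     = {(1 / q) * (1 / lam):.3f}")
```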
2022-07-29