DEPARTMENT OF STATISTICS
MATH5905 STATISTICAL INFERENCE
Part one: Decision theory. Bayes and minimax rules
1. Suppose d1, d2, d3 and d4 are nonrandomized decision rules with risks as given in the following table:

i | R(θ1, di) | R(θ2, di)
--+-----------+----------
1 |     0     |     6
2 |     1     |     5
3 |     2     |     3
4 |     3     |     5
a) Find the minimax rule(s) amongst the nonrandomized rules D = {d1, d2, d3, d4};
b) Obtain the minimax rule in the set D̄ of randomized rules generated by the rules in D. State the minimax risk of this rule.
c) Find the Bayes rule and the Bayes risk for the prior ( , ) on (θ1 , θ2 ).
d) Express the randomized decision rule with risk point (2,5) using the given non-randomized decision rules.
e) Calculate all priors for which d1 is a Bayes rule.
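The parts above can be checked numerically. The sketch below is plain Python; since the prior in part (c) is not legible in this copy, the last step instead sweeps a generic prior (p, 1 - p) on (θ1, θ2), which is an assumption made for illustration:

```python
import itertools

# Risk table from the problem: rule i -> (R(theta1, di), R(theta2, di)).
risks = {1: (0, 6), 2: (1, 5), 3: (2, 3), 4: (3, 5)}

# (a) Nonrandomized minimax: minimize the maximum risk over theta.
max_risk = {i: max(r) for i, r in risks.items()}
minimax_rule = min(max_risk, key=max_risk.get)
print("nonrandomized minimax:", f"d{minimax_rule}", "with risk", max_risk[minimax_rule])

# (b) Randomized minimax: grid search over mixtures of d1..d4.
best = (float("inf"), None)
steps = 50
for w in itertools.product(range(steps + 1), repeat=3):
    if sum(w) > steps:
        continue
    p = [w[0] / steps, w[1] / steps, w[2] / steps]
    p.append(1 - sum(p))
    r1 = sum(pi * risks[i + 1][0] for i, pi in enumerate(p))
    r2 = sum(pi * risks[i + 1][1] for i, pi in enumerate(p))
    if max(r1, r2) < best[0]:
        best = (max(r1, r2), tuple(p))
print("randomized minimax risk:", best[0])

# (d) The risk point (2, 5) is the midpoint of d2 = (1, 5) and d4 = (3, 5).
mix = tuple(0.5 * risks[2][j] + 0.5 * risks[4][j] for j in range(2))
print("0.5*d2 + 0.5*d4 has risk point", mix)  # (2.0, 5.0)

# (e) Sweep priors (p, 1 - p): d1 is Bayes once its Bayes risk
# p*R(theta1, d1) + (1 - p)*R(theta2, d1) is minimal over all four rules.
d1_prior = None
for p in [i / 100 for i in range(101)]:
    bayes = {i: p * r[0] + (1 - p) * r[1] for i, r in risks.items()}
    if bayes[1] <= min(bayes.values()) + 1e-12:
        d1_prior = p
        break
print("d1 becomes Bayes from prior weight p =", d1_prior, "on theta1")
```

The grid search confirms that randomization does not improve on the best nonrandomized rule here: since R(θ2, di) ≥ 3 for every rule, any mixture has second-coordinate risk at least 3, so the minimax risk of the randomized class equals that of d3.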
2. A decision rule d is called admissible in a class of rules if there is no other decision rule d* in the class such that R(θ, d*) ≤ R(θ, d) for all θ ∈ Θ and R(θ, d*) < R(θ, d) for at least one value of θ ∈ Θ. Let X be uniformly distributed on [0, θ], where θ ∈ (0, ∞) is an unknown parameter (i.e., Θ = (0, ∞)). Let the action space be [0, ∞) and the loss function be L(θ, a) = (θ - a)², where a is the chosen action (the action here is estimation, so a = d(X) for a given observation X and decision rule d). Consider the set of decision rules d_µ(x) = µx, µ > 0. For what value of µ is d_µ unbiased? Show that µ = 3/2 is a necessary condition for d_µ to be admissible.
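For X ~ U[0, θ] one has E[X] = θ/2 and E[X²] = θ²/3, so the risk of d_µ has the closed form R(θ, d_µ) = θ²(µ²/3 - µ + 1), which is minimized at µ = 3/2, while unbiasedness requires µ = 2. A midpoint-quadrature sketch (θ = 1 chosen for illustration; the conclusion is the same for every θ since the risk scales with θ²) confirms both values:

```python
# Midpoint-rule quadrature over [0, theta] for X ~ U[0, theta].
def risk(mu, theta=1.0, n=2000):
    h = theta / n
    # E[(theta - mu*X)^2], with uniform density 1/theta on [0, theta]
    return sum((theta - mu * (i + 0.5) * h) ** 2 for i in range(n)) * h / theta

def mean_est(mu, theta=1.0, n=2000):
    # E[d_mu(X)] = E[mu*X]
    h = theta / n
    return sum(mu * (i + 0.5) * h for i in range(n)) * h / theta

print(mean_est(2.0))  # ~1.0 = theta, so mu = 2 gives the unbiased estimator
best_mu = min([i / 100 for i in range(50, 301)], key=risk)
print(best_mu)        # ~1.5, matching the minimizer of theta^2*(mu^2/3 - mu + 1)
```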
3. Suppose X1, X2, . . . , Xn have conditional joint density
f_{X1,...,Xn|Θ}(x1, x2, . . . , xn | θ) = θ^n exp(-θ Σ_{i=1}^n x_i), x_i > 0 for i = 1, . . . , n; θ > 0
and a prior density is given by τ(θ) = k exp(-kθ), θ > 0, where k is a known constant.
i) Calculate the posterior density of Θ given X1 = x1 , X2 = x2 , . . . , Xn = xn .
ii) Find the Bayesian estimator of θ with respect to squared error loss.
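The prior here is conjugate: the posterior is proportional to θ^n exp(-θ(k + Σx_i)), i.e. a Gamma(n + 1, k + Σx_i) density, so the squared-error Bayes estimator is its mean (n + 1)/(k + Σx_i). A brute-force quadrature check (the data x_i and the value of k below are made-up assumptions for illustration):

```python
import math

# Assumed illustrative data and hyperparameter:
xs = [0.5, 1.2, 2.0]
k = 1.0
n, s = len(xs), sum(xs)

# Unnormalized posterior: likelihood * prior = theta^n e^{-theta*s} * k e^{-k*theta}
def post_unnorm(theta):
    return theta ** n * math.exp(-theta * (s + k))

# Posterior mean by midpoint quadrature on [0, T] (posterior mass beyond T is negligible)
T, m = 50.0, 20000
h = T / m
Z = sum(post_unnorm((i + 0.5) * h) for i in range(m)) * h
mean = sum(((i + 0.5) * h) * post_unnorm((i + 0.5) * h) for i in range(m)) * h / Z
print(mean, (n + 1) / (k + s))  # both ~ (n + 1)/(k + sum of x_i)
```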
4. Suppose a single observation x is available from the uniform distribution on (0, θ), with density f(x|θ) = (1/θ) I_(x,∞)(θ), θ > 0. The prior on θ has density τ(θ) = θ exp(-θ), θ > 0. Find the Bayes estimator of θ:
i) with respect to quadratic loss;
ii) with respect to absolute value loss L(θ, a) = |θ - a|.
iii) (*) with respect to the loss L_η(θ, a) = (θ - a)(η - I(θ - a < 0)), where η ∈ (0, 1) is a fixed weight.
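Reading the model as X | θ ~ U(0, θ) (an assumption, since the density is garbled in this copy), the 1/θ in the likelihood cancels the θ in the prior on the event θ > x, giving the shifted-exponential posterior π(θ | x) = e^{-(θ - x)} for θ > x. The three Bayes estimators (posterior mean, median, and η-quantile for the quantile-type loss in iii) are then explicit:

```python
import math

# Posterior under the assumed model: pi(theta | x) = exp(-(theta - x)), theta > x.
x, eta = 0.7, 0.25  # illustrative observation and weight (assumptions)

post_mean = x + 1                      # quadratic loss -> posterior mean
post_median = x + math.log(2)          # absolute loss -> posterior median
post_quantile = x - math.log(1 - eta)  # L_eta loss -> eta-quantile of the posterior

# Cross-check against the posterior CDF F(t) = 1 - exp(-(t - x)), t > x:
F = lambda t: 1 - math.exp(-(t - x))
print(F(post_median), F(post_quantile))  # ~0.5 and ~0.25 (= eta)
```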
5. Let X1, X2, . . . , Xn be a random sample from the normal density with mean µ and variance 1. Consider estimating µ under squared-error loss. Assume that the prior τ(µ) is a normal density with mean µ0 and variance 1. Show that the Bayes estimator of µ is (µ0 + Σ_{i=1}^n x_i)/(n + 1).
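The claimed posterior mean can be verified by direct quadrature of the normal-normal posterior (the data and µ0 below are assumptions made for illustration):

```python
import math

# Assumed illustrative data and prior mean:
xs = [1.0, 2.5, 0.3, 1.8]
mu0 = 0.0
n, s = len(xs), sum(xs)

# Unnormalized posterior: N(mu, 1) likelihood times N(mu0, 1) prior.
def post_unnorm(mu):
    loglik = -0.5 * sum((x - mu) ** 2 for x in xs)
    logprior = -0.5 * (mu - mu0) ** 2
    return math.exp(loglik + logprior)

# Posterior mean by midpoint quadrature on a wide grid
lo, hi, m = -10.0, 10.0, 20000
h = (hi - lo) / m
grid = [lo + (i + 0.5) * h for i in range(m)]
Z = sum(post_unnorm(u) for u in grid) * h
mean = sum(u * post_unnorm(u) for u in grid) * h / Z
print(mean, (mu0 + s) / (n + 1))  # both ~ (mu0 + sum of x_i)/(n + 1)
```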
6. As part of a quality inspection program, five components are selected at random from a batch of components to be tested. From past experience, the parameter θ (the probability of failure) has a beta distribution with density
τ (θ) = 30θ(1 - θ)4 , 0 < θ < 1.
We wish to test the hypothesis H0 : θ ≤ 0.2 against H1 : θ > 0.2 using Bayesian hypothesis testing with a 0-1 loss. What is your decision if:
i) in a batch of five, no failures were found;
ii) in a batch of five, one failure was found?
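With x failures in five independent trials, the Beta(2, 5) prior updates to a Beta(2 + x, 10 - x) posterior, and under 0-1 loss one accepts H0 exactly when its posterior probability P(θ ≤ 0.2 | x) exceeds 1/2. A stdlib-only check, using the identity I_p(a, b) = P(Bin(a + b - 1, p) ≥ a) for integer shape parameters:

```python
from math import comb

def beta_cdf(p, a, b):
    # Regularized incomplete beta for integer a, b via the binomial identity:
    # I_p(a, b) = P(Bin(a + b - 1, p) >= a)
    n = a + b - 1
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j) for j in range(a, n + 1))

# Prior Beta(2, 5); x failures in 5 trials -> posterior Beta(2 + x, 10 - x).
for x in (0, 1):
    post_p0 = beta_cdf(0.2, 2 + x, 10 - x)  # posterior P(theta <= 0.2)
    decision = "accept H0" if post_p0 > 0.5 else "reject H0"
    print(f"x = {x}: P(theta <= 0.2 | x) = {post_p0:.3f} -> {decision}")
```

With no failures the posterior probability of H0 is about 0.678, so H0 is accepted; with one failure it drops to about 0.383, so H0 is rejected.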
2023-04-06