STAT 525 Homework 6
You are required to use LaTeX to complete the homework.
1. (60 points) Consider the following 1D total variation denoising problem. We observe $y_i \in \mathbb{R}$, $i = 1, \ldots, n$, where
$$y_i = \mu_i + \epsilon_i.$$
Here the mean $\mu_i = f(i/n)$, where $f$ is a piecewise constant function on $[0, 1]$, and $\epsilon_i$ follows a standard Gaussian distribution. To recover the piecewise constant signal, we consider the following total variation denoising formulation
$$\min_{\mu \in \mathbb{R}^n} \ \frac{1}{2} \sum_{i=1}^{n} (y_i - \mu_i)^2 + \lambda \sum_{i=1}^{n-1} |\mu_{i+1} - \mu_i|,$$
where $\lambda$ is a tuning parameter. The accelerated dual proximal gradient method and ADMM can be used to solve this problem.
(a) Write out the steps for the accelerated dual proximal gradient method and ADMM applied to the total variation denoising problem in explicit detail.
(b) Implement these two methods for the total variation denoising problem. Please use R or Python to implement them from scratch; you need to submit your R or Python script.
(c) Conduct a suite of numerical experiments to compare these two methods.
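For part (b), one possible starting point is an ADMM sketch in NumPy. It splits $z = D\mu$, with $D$ the $(n-1) \times n$ first-difference matrix, so the $\mu$-update is a linear solve, the $z$-update is soft-thresholding, and the scaled dual variable $u$ accumulates the constraint residual. The function names, default $\rho$, and iteration count below are illustrative choices, not part of the assignment:

```python
import numpy as np

def soft_threshold(v, t):
    # elementwise prox of t * |.|: soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv_admm(y, lam, rho=1.0, n_iter=200):
    """ADMM for min_mu 0.5 * sum (y_i - mu_i)^2 + lam * sum |mu_{i+1} - mu_i|,
    using the splitting z = D mu with D the first-difference matrix."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)       # (n-1) x n difference matrix
    A = np.eye(n) + rho * D.T @ D        # fixed system matrix for the mu-update
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)                  # scaled dual variable
    for _ in range(n_iter):
        mu = np.linalg.solve(A, y + rho * D.T @ (z - u))   # mu-update: linear solve
        z = soft_threshold(D @ mu + u, lam / rho)          # z-update: prox of the TV term
        u = u + D @ mu - z                                 # dual update
    return mu
```

Since $A$ is fixed across iterations, a practical implementation would factor it once (e.g., a banded Cholesky factorization) rather than re-solving from scratch each iteration.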
2. (40 points) We continue with the Lasso regression problem from the Midterm Exam. We observe $(x_i, y_i)$, $i = 1, \ldots, n$, where $x_i \in \mathbb{R}^d$ and $y_i \in \mathbb{R}$. We assume $(x_i, y_i)$ satisfies the following linear model
$$y_i = x_i^\top \beta + \epsilon_i,$$
where $\epsilon_i$ is standard Gaussian noise. To achieve sparsity over individual features, we consider an $\ell_1$ penalty
$$h(\beta) = \|\beta\|_1.$$
The Lasso regression problem can be cast as the following optimization problem
$$\min_{\beta} \ \frac{1}{2} \|y - X\beta\|_2^2 + \lambda h(\beta),$$
where $\lambda$ is a tuning parameter. The Lasso regression problem can be solved by subgradient descent, proximal gradient descent, accelerated proximal gradient descent, coordinate descent, and ADMM.
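Several of these methods share one closed-form building block: minimizing the objective over a single coordinate $\beta_j$ (holding the others fixed) is solved exactly by soft-thresholding, $\beta_j \leftarrow S(x_j^\top r_{-j}, \lambda) / \|x_j\|_2^2$, where $r_{-j}$ is the residual with the $j$-th column's contribution removed. A minimal NumPy sketch of cyclic coordinate descent built on this update (function names and sweep count are illustrative choices, not prescribed by the assignment):

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t * |.|: soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for min_beta 0.5*||y - X beta||_2^2 + lam*||beta||_1.
    Each coordinate update is an exact 1D minimization via soft-thresholding."""
    n, d = X.shape
    beta = np.zeros(d)
    r = y - X @ beta                     # running residual
    col_sq = np.sum(X ** 2, axis=0)      # ||x_j||_2^2 for each column
    for _ in range(n_sweeps):
        for j in range(d):
            r += X[:, j] * beta[j]       # remove j-th column's contribution
            rho = X[:, j] @ r            # correlation with partial residual
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * beta[j]       # restore residual with updated beta_j
    return beta
```

Maintaining the running residual `r` keeps each coordinate update at $O(n)$ cost instead of recomputing $y - X\beta$ from scratch.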
(a) Write out the steps for the coordinate descent method and ADMM applied to the Lasso problem in explicit detail.
(b) Implement these five methods for the Lasso problem. Please use R or Python to implement them from scratch; you need to submit your R or Python script.
(c) Conduct a suite of numerical experiments to compare these five methods: subgradient descent, proximal gradient descent, accelerated proximal gradient descent, coordinate descent, and ADMM.
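As one illustration of the methods to be compared, here is a minimal NumPy sketch of the accelerated proximal gradient method (FISTA): each iteration takes a gradient step at an extrapolated point and then applies the $\ell_1$ prox (soft-thresholding), with step size $1/L$ where $L = \|X\|_2^2$ is the Lipschitz constant of the smooth part's gradient. Function names and the iteration count are illustrative choices:

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t * |.|: soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_fista(X, y, lam, n_iter=500):
    """Accelerated proximal gradient (FISTA) for
    min_beta 0.5*||y - X beta||_2^2 + lam*||beta||_1."""
    d = X.shape[1]
    L = np.linalg.norm(X, 2) ** 2        # spectral norm squared: Lipschitz constant
    beta = np.zeros(d)
    w = beta.copy()                      # extrapolation point
    t = 1.0                              # momentum parameter
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                          # gradient at extrapolated point
        beta_new = soft_threshold(w - grad / L, lam / L)  # prox step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        w = beta_new + ((t - 1.0) / t_new) * (beta_new - beta)  # extrapolation
        beta, t = beta_new, t_new
    return beta
```

Dropping the momentum (setting $w = \beta$ every iteration) recovers plain proximal gradient descent (ISTA), which is a convenient way to share code between the two methods in the comparison.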
2022-11-30