Math 170A, FA 2022 Midterm review
Q1. (10pts) Consider the following matrix
A = \begin{pmatrix} 2 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 2 \end{pmatrix}.
(a) Perform the following operations on the matrix A:
● Step 1: update row 2: add 3 × row 1 to row 2.
● Step 2: swap row 3 and row 2.
Write down the two matrices (L, P) that perform these operations, so that PLA realizes this transformation. (2pts)
Your Answer:
L = \begin{pmatrix} 1 & 0 & 0 \\ 3 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad
P = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.
Write down the resulting matrix PLA. (1pt)
Your Answer:
PLA = \begin{pmatrix} 2 & -1 & 0 \\ 0 & -1 & 2 \\ 5 & -1 & -1 \end{pmatrix}.
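The row operations above can be verified numerically; a minimal NumPy sketch (the matrices A, L, and P are copied from the answer above):

```python
import numpy as np

# A, from Q1; L adds 3 * row 1 to row 2; P swaps rows 2 and 3.
A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])
L = np.array([[1., 0., 0.],
              [3., 1., 0.],
              [0., 0., 1.]])
P = np.array([[1., 0., 0.],
              [0., 0., 1.],
              [0., 1., 0.]])

PLA = P @ L @ A
print(PLA)
```

Note the order: L applies the row-addition first, then P performs the swap, which is why the product is PLA and not LPA.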
(b) Determine if the matrix A is positive definite and symmetric. (4pts)
Your Answer:
Proof. First, it is clear that A is symmetric, since A = A^T.
To show positive definiteness, we attempt a Cholesky factorization of A:
assume R^T R = A, where
R = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ 0 & r_{22} & r_{23} \\ 0 & 0 & r_{33} \end{pmatrix}.
First, r_{11}^2 = 2, thus r_{11} = \sqrt{2}, and then r_{12} = -1/\sqrt{2}, r_{13} = 0.
Next, r_{22}^2 = 2 - r_{12}^2 = 3/2, so r_{22} = \sqrt{3/2}, and r_{23} = (-1 - r_{12} r_{13})/r_{22} = -\sqrt{2/3}.
Finally, r_{33}^2 = 2 - r_{13}^2 - r_{23}^2 = 2 - 2/3 = 4/3 > 0, so r_{33} = 2/\sqrt{3}.
Since the Cholesky factorization exists with strictly positive diagonal entries, A is symmetric positive definite.
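The Cholesky argument can be checked with NumPy; a small sketch (np.linalg.cholesky raises LinAlgError precisely when the matrix is not positive definite):

```python
import numpy as np

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])

# NumPy returns a lower-triangular G with A = G @ G.T,
# so R = G.T is the upper-triangular factor with A = R.T @ R used above.
G = np.linalg.cholesky(A)
R = G.T
print(R)
```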
(c) Compute \|A\|_F, \|A\|_1 and \|A\|_∞. (3pts)
Your Answer:
\|A\|_F = (\sum_{i,j} A_{ij}^2)^{1/2} = \sqrt{3 \cdot 2^2 + 4 \cdot (-1)^2} = \sqrt{16} = 4
\|A\|_1 = \max_j \sum_{i=1}^{3} |A_{ij}| = \max(3, 4, 3) = 4
\|A\|_∞ = \max_i \sum_{j=1}^{3} |A_{ij}| = \max(3, 4, 3) = 4
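These three matrix norms are available directly in NumPy; a minimal check:

```python
import numpy as np

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])

fro = np.linalg.norm(A, 'fro')    # sqrt of the sum of squared entries
one = np.linalg.norm(A, 1)        # maximum absolute column sum
inf = np.linalg.norm(A, np.inf)   # maximum absolute row sum
print(fro, one, inf)
```

Since A is symmetric, the 1-norm and the infinity-norm coincide.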
Q2. (5pts) Vector norm
(a) Given the Cauchy–Schwarz inequality: for all x, y ∈ R^n,
|\sum_i x_i y_i| ≤ (\sum_i x_i^2)^{1/2} (\sum_i y_i^2)^{1/2},
prove that
\|x\|_2 = (\sum_i x_i^2)^{1/2}
is a vector norm. (4pts)
Proof. To show \|\cdot\|_2 is a norm, we need to prove that it satisfies the 3 properties of vector norms.
i. For x ≠ 0, at least one x_i ≠ 0, which means \sum_i x_i^2 > 0, thus \|x\|_2 > 0. If x = 0, then \|x\|_2 = (\sum_i 0^2)^{1/2} = 0.
ii. For a ∈ R,
\|ax\|_2 = (\sum_{i=1}^n a^2 x_i^2)^{1/2} = (a^2 \sum_{i=1}^n x_i^2)^{1/2} = |a| (\sum_{i=1}^n x_i^2)^{1/2} = |a| \|x\|_2.
iii. Lastly, we want to show the triangle inequality:
\|x + y\|_2^2 = \sum_{i=1}^n (x_i + y_i)^2 = \sum_{i=1}^n x_i^2 + \sum_{i=1}^n y_i^2 + 2 \sum_{i=1}^n x_i y_i
≤ \sum_{i=1}^n x_i^2 + \sum_{i=1}^n y_i^2 + 2 (\sum_{i=1}^n x_i^2)^{1/2} (\sum_{i=1}^n y_i^2)^{1/2}   (by Cauchy–Schwarz)
= ((\sum_{i=1}^n x_i^2)^{1/2} + (\sum_{i=1}^n y_i^2)^{1/2})^2 = (\|x\|_2 + \|y\|_2)^2.
Taking the square root of both sides of the above inequality, we have \|x + y\|_2 ≤ \|x\|_2 + \|y\|_2.
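The two inequalities used in the proof can be sanity-checked on random vectors; a small sketch (not a proof, just a numerical spot check):

```python
import numpy as np

# Check Cauchy-Schwarz and the 2-norm triangle inequality
# on 100 random vector pairs.
rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.standard_normal(5)
    y = rng.standard_normal(5)
    # |<x, y>| <= ||x||_2 ||y||_2
    assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12
    # ||x + y||_2 <= ||x||_2 + ||y||_2
    assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y) + 1e-12
print("all checks passed")
```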
(b) Given
x = \begin{pmatrix} 2 \\ -1 \\ 0 \end{pmatrix},
compute \|x\|_1, \|x\|_∞. (1pt)
\|x\|_1 = \sum_i |x_i| = 2 + 1 + 0 = 3
\|x\|_∞ = \max\{|2|, |-1|, |0|\} = 2
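The same two vector norms in NumPy, as a quick check:

```python
import numpy as np

x = np.array([2., -1., 0.])
n1 = np.linalg.norm(x, 1)        # sum of absolute entries
ninf = np.linalg.norm(x, np.inf) # largest absolute entry
print(n1, ninf)
```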
Q3. (10pts) An experiment has produced the following four points of 2-dimensional data:
(0, -1), (1, -3), (2, 1), and (4, 2).
a) Set up a least squares problem for the best line fit for this set of points, explicitly writing out the matrix A and the vector y . (3pts)
Your Answer: Suppose the line is given by P(t) = a_0 + a_1 t. Then the residual
r is given by r = y - Ax:
r = \begin{pmatrix} -1 \\ -3 \\ 1 \\ 2 \end{pmatrix} - \begin{pmatrix} 1 & 0 \\ 1 & 1 \\ 1 & 2 \\ 1 & 4 \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \end{pmatrix}.
The least squares problem is given by
\min_x \|y - Ax\|_2^2.
b) Set up a least squares problem for the best quadratic fit for this set of points,
explicitly writing out the matrix A and the vector y . (3pts)
Your Answer:
Suppose the quadratic is given by P(t) = a_0 + a_1 t + a_2 t^2. Then the residual r is given by r = y - Ax:
r = \begin{pmatrix} -1 \\ -3 \\ 1 \\ 2 \end{pmatrix} - \begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2 & 4 \\ 1 & 4 & 16 \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix}.
The least squares problem is given by
\min_x \|y - Ax\|_2^2.
c) Based on your answer of b), use the classical Gram–Schmidt process to
calculate the reduced QR decomposition of A. (4pts)
Your Answer:
Let v_1 = [1 1 1 1]^T, v_2 = [0 1 2 4]^T, and v_3 = [0 1 4 16]^T. Then
q_1 = v_1 / \|v_1\|_2 = (1/2) v_1 = [0.5 0.5 0.5 0.5]^T,
\tilde{q}_2 = v_2 - <v_2, q_1> q_1 = v_2 - 3.5 q_1 = [-1.75 -0.75 0.25 2.25]^T,
q_2 = \tilde{q}_2 / \|\tilde{q}_2\|_2 = \tilde{q}_2 / 2.9580 ≈ [-0.5916 -0.2535 0.0845 0.7606]^T,
\tilde{q}_3 = v_3 - <v_3, q_1> q_1 - <v_3, q_2> q_2 = v_3 - 10.5 q_1 - 12.2547 q_2 ≈ [2.0000 -1.1429 -2.2857 1.4286]^T,
q_3 = \tilde{q}_3 / \|\tilde{q}_3\|_2 = \tilde{q}_3 / 3.5456 ≈ [0.5641 -0.3224 -0.6447 0.4029]^T.
Therefore, the reduced QR decomposition A = QR is given by:
Q = [q_1 q_2 q_3] ≈ \begin{pmatrix} 0.5 & -0.5916 & 0.5641 \\ 0.5 & -0.2535 & -0.3224 \\ 0.5 & 0.0845 & -0.6447 \\ 0.5 & 0.7606 & 0.4029 \end{pmatrix},
R = \begin{pmatrix} 2.0000 & 3.5000 & 10.5000 \\ 0 & 2.9580 & 12.2547 \\ 0 & 0 & 3.5456 \end{pmatrix}.
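The classical Gram–Schmidt steps above can be coded up to check the arithmetic; a short sketch (assumes the columns of A are linearly independent, as they are here):

```python
import numpy as np

def classical_gram_schmidt(A):
    """Reduced QR via classical Gram-Schmidt: A = Q @ R."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            # Classical GS projects the ORIGINAL column against earlier q_i.
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1., 0., 0.],
              [1., 1., 1.],
              [1., 2., 4.],
              [1., 4., 16.]])
Q, R = classical_gram_schmidt(A)
print(np.round(R, 4))
```

In exact arithmetic classical and modified Gram–Schmidt agree; the modified version only differs in rounding behavior.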
(Bonus) Based on your reduced QR decomposition of Q3 c), compute the minimizer
x that solves the least squares problem. (1pt) (Hint: understand how the reduced QR decomposition can help solve least squares problems.)
Your Answer: Solve Rx = Q^T y by back substitution: x = [-1.6818 0.3409 0.1591]^T.
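With A = QR (Q having orthonormal columns), minimizing \|y - Ax\|_2 reduces to the small triangular system Rx = Q^T y. A minimal NumPy check of the bonus answer:

```python
import numpy as np

A = np.array([[1., 0., 0.],
              [1., 1., 1.],
              [1., 2., 4.],
              [1., 4., 16.]])
y = np.array([-1., -3., 1., 2.])

Q, R = np.linalg.qr(A)           # reduced QR: Q is 4x3, R is 3x3
x = np.linalg.solve(R, Q.T @ y)  # solve the 3x3 triangular system
print(np.round(x, 4))
```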
2022-12-06