Posted: 2022-07-19
AMATH 242/ CS 371 (Spring 2022)
Assignment 4
Due: Wednesday, July 20 by midnight
For this assignment, please submit PDFs/screenshots of your code along with the output/figures directly to Crowdmark. Additionally, submit the raw code to LEARN. In all figures, label your axes.
1. The following figures from the Census Bureau give the population of the United States:
Year | Population
-----|------------
1900 | 75,994,575
1910 | 91,972,266
1920 | 105,710,620
1930 | 122,775,046
1940 | 131,669,275
1950 | 150,697,361
1960 | 179,323,175
1970 | 203,235,298
Since there are eight points, there is a unique polynomial of degree 7 which interpolates the data. However, some of the ways of representing this polynomial are computationally more satisfactory than others. Here are three possibilities, each with x ranging over the interval 1900 ≤ x ≤ 1970:
y_1(x) = \sum_{j=0}^{7} a_j x^j

y_2(x) = \sum_{j=0}^{7} a_j l_j(x)

y_3(x) = \sum_{j=0}^{7} a_j \pi_j(x)

where l_j(x) = \prod_{i=0,\, i \neq j}^{7} \frac{x - x_i}{x_j - x_i} are the Lagrange polynomials, and \pi_j(x) = \prod_{i=0}^{j-1} (x - x_i) are the Newton polynomials.
In each case, the coefficients are found by solving an 8-by-8 linear system, but the matrices of the three systems are quite different.
(a) Set up each of the three matrices.
(b) Estimate each matrix's condition number using the Python function numpy.linalg.cond.
(c) Use numpy.linalg.solve to find the coefficients and report them.
(d) Plot the data from the table. In the same figure, plot all three interpolating polynomials at the values xx = numpy.linspace(1890, 1980, 100). Describe your results.
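As a starting point, the monomial-basis matrix for y_1 is a Vandermonde matrix, which numpy can build directly. A minimal sketch (the data values are taken from the table above; the Lagrange and Newton matrices are left to you):

```python
import numpy as np

# Census data from the table in question 1
years = np.array([1900, 1910, 1920, 1930, 1940, 1950, 1960, 1970], dtype=float)
pops = np.array([75994575, 91972266, 105710620, 122775046,
                 131669275, 150697361, 179323175, 203235298], dtype=float)

# Matrix for the monomial basis y1: entry (i, j) is years[i]**j
V = np.vander(years, N=8, increasing=True)

# Condition number estimate; expect it to be enormous for this basis
kappa = np.linalg.cond(V)
print(f"cond(V) = {kappa:.3e}")
```

The large condition number is the point of the exercise: the raw monomial basis on [1900, 1970] is far from orthogonal, which is what parts (b)-(d) are meant to expose.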
2. It is well known that high-degree polynomial interpolation at equally spaced points is susceptible to the Runge phenomenon.
(a) Come up with an example that demonstrates the Runge phenomenon: choose a function f(x), sample it at equally spaced points, plot the result of Lagrange interpolation on your data, and show that the interpolating polynomial oscillates strongly. The standard example f(x) = 1/(1 + 25x^2) and its close relatives should not be used.
(b) Perform Lagrange interpolation with Chebyshev points on the same example you came up with in (a) and plot the result, in order to see how Chebyshev points compare to equally spaced points.
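For part (b), Chebyshev points on an interval [a, b] can be obtained by mapping the roots of the Chebyshev polynomial T_n from [-1, 1]. A minimal sketch of the node generation only (the helper name chebyshev_points is ours; the choice of f(x) and the interpolation itself are left to you):

```python
import numpy as np

def chebyshev_points(n, a=-1.0, b=1.0):
    """Roots of the degree-n Chebyshev polynomial T_n, mapped to [a, b]."""
    k = np.arange(n)
    t = np.cos((2 * k + 1) * np.pi / (2 * n))  # roots of T_n on [-1, 1]
    return 0.5 * (a + b) + 0.5 * (b - a) * t

equi = np.linspace(-1.0, 1.0, 11)   # equally spaced nodes
cheb = chebyshev_points(11)         # Chebyshev nodes cluster near the endpoints
```

The clustering of the Chebyshev nodes toward the endpoints is what suppresses the endpoint oscillations you should see in part (a).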
3. Consider the following two data points:

x | f(x) | f'(x)
--|------|------
0 |  -1  |  -2
1 |   1  |  -3
Write out the Hermite interpolating polynomial for this data.
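One way to check your written answer numerically: a cubic matching these four conditions is unique, so solving the corresponding 4-by-4 linear system for monomial coefficients must reproduce your Hermite polynomial. A sketch:

```python
import numpy as np

# Conditions on p(x) = c0 + c1 x + c2 x^2 + c3 x^3 from the table:
# p(0) = -1, p'(0) = -2, p(1) = 1, p'(1) = -3
A = np.array([
    [1.0, 0.0, 0.0, 0.0],   # p(0)
    [0.0, 1.0, 0.0, 0.0],   # p'(0)
    [1.0, 1.0, 1.0, 1.0],   # p(1)
    [0.0, 1.0, 2.0, 3.0],   # p'(1)
])
rhs = np.array([-1.0, -2.0, 1.0, -3.0])
c = np.linalg.solve(A, rhs)     # monomial coefficients [c0, c1, c2, c3]
p = np.polynomial.Polynomial(c)
print(p(0.0), p(1.0))           # should reproduce f(0) = -1 and f(1) = 1
```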
4. We defined a cubic spline for the data set \{(x_i, f_i)\} as the piecewise function

y(x) = y_i(x), \quad x \in [x_{i-1}, x_i]
that satisfies the following conditions,
y_i(x_{i-1}) = f_{i-1}, \quad (i = 1, 2, \ldots, n)
y_i(x_i) = f_i, \quad (i = 1, 2, \ldots, n)
y_i'(x_i) = y_{i+1}'(x_i), \quad (i = 1, 2, \ldots, n-1)
y_i''(x_i) = y_{i+1}''(x_i), \quad (i = 1, 2, \ldots, n-1)
We wrote y_i(x) in the following way

y_i(x) = a_{i-1} \frac{(x_i - x)^3}{6 h_i} + a_i \frac{(x - x_{i-1})^3}{6 h_i} + b_i (x_i - x) + c_i (x - x_{i-1})

where h_i = x_i - x_{i-1}. From this we obtained the following equations for the coefficients,

b_i = \frac{f_{i-1}}{h_i} - \frac{a_{i-1} h_i}{6}
c_i = \frac{f_i}{h_i} - \frac{a_i h_i}{6}
\frac{h_i}{6} a_{i-1} + \frac{h_i + h_{i+1}}{3} a_i + \frac{h_{i+1}}{6} a_{i+1} = \frac{f_{i+1} - f_i}{h_{i+1}} - \frac{f_i - f_{i-1}}{h_i}
The last equation (in addition to two extra boundary conditions) results in a tridiagonal linear system that can be solved to obtain the a coefficients. In class, we derived this system given free boundary conditions. For this question, derive the tridiagonal system for the a coefficients when clamped boundary conditions are imposed:
y'(x_0) = f_0', \quad y'(x_n) = f_n'
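For reference, the interior equations above are common to both boundary conditions and can be assembled directly as rows of a matrix. A minimal sketch with made-up knots and data (the two boundary rows, which this question asks you to derive, are deliberately left out):

```python
import numpy as np

# Made-up knots x_0 < ... < x_n and data f_i for illustration only
x = np.array([0.0, 1.0, 2.5, 3.0, 4.5])
f = np.array([1.0, 0.0, 2.0, 1.5, -1.0])
n = len(x) - 1
h = np.diff(x)                 # h[i-1] holds h_i = x_i - x_{i-1}

# Interior rows i = 1..n-1 of the system for the unknowns a_0..a_n,
# taken term by term from the last equation above
A = np.zeros((n - 1, n + 1))
rhs = np.zeros(n - 1)
for i in range(1, n):
    A[i - 1, i - 1] = h[i - 1] / 6
    A[i - 1, i] = (h[i - 1] + h[i]) / 3
    A[i - 1, i + 1] = h[i] / 6
    rhs[i - 1] = (f[i + 1] - f[i]) / h[i] - (f[i] - f[i - 1]) / h[i - 1]
```

Appending the two boundary rows (free or clamped) makes the system square and tridiagonal, after which it can be solved for the a coefficients.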
5. Implement least-squares polynomial regression. We have the following degree-m polynomial model,

y(x) = \sum_{j=0}^{m} a_j x^j
For a given data set \{(x_i, f_i)\}, we want to compute the coefficients a = [a_0, \ldots, a_m]^T that minimize the total squared error:

\| f - \Phi a \|_2^2

The a that minimizes this can be obtained by solving the normal equation derived in lecture.
(a) Write a function that takes as input m, x = [x_0, x_1, \ldots, x_n]^T, and f = [f_0, f_1, \ldots, f_n]^T, and returns the vector a by solving the normal equation with numpy.linalg.solve.
def polynomialRegression(x, f, m):
    # x: a vector of the x values of the data set
    # f: a vector of the y values of the data set
    # m: the degree of the polynomial used for regression
    ### replace this ###
    a = np.zeros(m + 1)
    ######################
    return a
(b) Use your function with the data from (1.) and m = 1 as input to obtain a linear polynomial that fits the data. Plot the data points and the resulting y(x) at the values xx = numpy.linspace(1890, 1980, 100).
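For orientation, one possible way to complete the stub is to build the design matrix Φ with numpy.vander and solve the normal equation (Φ^T Φ) a = Φ^T f. A minimal sketch, with a small sanity check on exactly linear data:

```python
import numpy as np

def polynomialRegression(x, f, m):
    # Design matrix Phi with columns 1, x, ..., x**m
    Phi = np.vander(np.asarray(x, dtype=float), N=m + 1, increasing=True)
    # Normal equation: (Phi^T Phi) a = Phi^T f
    return np.linalg.solve(Phi.T @ Phi, Phi.T @ np.asarray(f, dtype=float))

# Sanity check on exactly linear data: should recover intercept 1, slope 2
x = np.array([0.0, 1.0, 2.0, 3.0])
f = 1.0 + 2.0 * x
a = polynomialRegression(x, f, 1)
print(a)  # ≈ [1. 2.]
```

Note that for the census data in question 1 the entries of Φ^T Φ are huge; this still works for m = 1, but shifting the years (e.g. subtracting 1900) would improve conditioning.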