

Computer Science 4252: Introduction to Computational Learning Theory

Problem Set #4 Fall 2021


Problem 1  A concept class C over a domain X is said to be completely ordered if (i) C contains at least two concepts; and (ii) for any c1, c2 ∈ C either c1 ⊆ c2 or c2 ⊆ c1.
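
For example, over X = ℝ the class of half-lines C = {(−∞, a] : a ∈ ℝ} is completely ordered, since (−∞, a] ⊆ (−∞, b] whenever a ≤ b.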

(a) Show that if C is completely ordered then the VC dimension of C is 1.

(b) Show that if the VC dimension of C is 1 and ∅ ∈ C and X ∈ C then C is completely ordered.

(c) Let H1, H2, ..., Hs be a sequence of concept classes, each of which is completely ordered over X, and let H = {h1 ∪ ... ∪ hs | hi ∈ Hi}. Show that the VC dimension of H is at most s. Hint: How many size-1 subsets of a set S can be induced by intersecting with H?
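
For intuition on part (c), one concrete instance with s = 2: take H1 = {(−∞, a] : a ∈ ℝ} and H2 = {[b, ∞) : b ∈ ℝ}, each completely ordered over ℝ. Then H is the class of complements of open intervals, and no three points x1 < x2 < x3 are shattered, since the labeling (0, 1, 0) is never induced; this is consistent with the claimed bound of s = 2.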


Problem 2

(a) Let C be a concept class over a finite domain X with |X| = n, and suppose that (i) VCDIM(C) = d; and (ii) for every c ∈ C we have |c| ≤ r.

Show that there must be two concepts c1, c2 ∈ C such that |c1 ∩ c2| ≤ r − d.

(b) Let X be a finite domain with |X| = 2k + 1 (in other words, X contains an odd number of elements). Describe a concept class C over X for which |C| = 2^(2k) (so half of all 2^(2k+1) possible concepts over X belong to C) and VCDIM(C) = k. (Justify your answer.)


Problem 3  Let the domain X be the real line and let the concept class C be the class of all unions of at most k closed intervals (so for k = 2, an example of a concept in C would be c = [2, 5] ∪ [5.3, 9.4]).

(a) Determine the exact value of the VC dimension of C.

(b) Determine the exact value of the growth function Π_C(m). If you have trouble doing this, give the best asymptotic upper bound you can on Π_C(m).
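
As a sanity check for part (b), here is a minimal brute-force sketch in Python (the function name and interface are illustrative, not from class). It uses the observation that, on m distinct points of the real line, a labeling is induced by some union of at most k closed intervals exactly when its 1s form at most k maximal blocks of consecutive positions.

    from itertools import product

    def count_induced_labelings(m, k):
        # Count the labelings of m ordered points on the real line that are
        # induced by some union of at most k closed intervals.  A labeling is
        # realizable iff its 1s form at most k maximal blocks of consecutive
        # positions (each block is then covered by one closed interval).
        count = 0
        for labels in product((0, 1), repeat=m):
            blocks = sum(1 for i in range(m)
                         if labels[i] == 1 and (i == 0 or labels[i - 1] == 0))
            if blocks <= k:
                count += 1
        return count

    # Example: tabulating count_induced_labelings(m, 2) for m = 1, 2, 3, ...
    # traces the growth function for k = 2 and can suggest the exact formula.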


Problem 4  In class we gave a lower bound on the sample complexity of PAC learning a concept class C in terms of the VC dimension of C and the accuracy parameter ε. In this problem you’ll analyze how the confidence parameter δ comes into the sample complexity of PAC learning.

(a) True or false: For any concept class C with |C| ≥ 2, any algorithm for PAC learning C to accuracy ε and confidence 1 − δ must use Ω(log(1/δ)/ε) examples in the worst case (for some distribution D). Justify your answer.
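
(For scale, with illustrative numbers: taking ε = 0.01 and δ = 0.001, log(1/δ)/ε ≈ 690 using the natural log, so a bound of this form would already force hundreds of examples.)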

(b) Show that if C is a concept class with |C| ≥ 3, then there must be two points in X and two concepts c1, c2 ∈ C such that c1 and c2 agree on one of the points and disagree on the other.

(c) True or false: Same as part (a), but with “2” replaced by “3.” Justify your answer. (Hint for part (c): throughout the problem you may, for convenience, assume that ε, δ < 1/4, and you may use the fact that ….)


Problem 5  Let C be a concept class over {0, 1}^n that is PAC learnable using poly(n, 1/ε, 1/δ) examples. Show that C is learnable in the online mistake-bound model by an algorithm that makes at most poly(n) mistakes. (You may use any results proved in class, and your online learning algorithm need not be computationally efficient.)
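
For reference, a minimal sketch of the online mistake-bound protocol used in Problem 5 (in Python; the predict/update learner interface is an assumption of this sketch, not notation from class):

    def run_online_protocol(learner, target, examples):
        # One pass of the online mistake-bound protocol: in each round the
        # learner predicts a label for the next example, the true label
        # target(x) is then revealed, and the learner may update its state.
        # Returns the total number of prediction mistakes.
        mistakes = 0
        for x in examples:
            prediction = learner.predict(x)
            truth = target(x)
            if prediction != truth:
                mistakes += 1
            learner.update(x, truth)
        return mistakes

A mistake-bound guarantee of poly(n) means this count is at most poly(n) for every target in C and every (possibly adversarial) example sequence.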