If the terms of a series approach zero, must the series converge? Justify or provide a counterexample.
No, the series doesn't have to converge.
The harmonic series is the standard counterexample. Its terms, 1/n, approach zero, yet the series itself diverges. The correct statement is the converse: if a series converges, then its terms must approach zero.
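One quick way to see the divergence, using the classical grouping argument (a sketch, not the only proof):

\[
\sum_{n=1}^{\infty} \frac{1}{n}
= 1 + \tfrac{1}{2} + \left(\tfrac{1}{3} + \tfrac{1}{4}\right) + \left(\tfrac{1}{5} + \cdots + \tfrac{1}{8}\right) + \cdots
\;\ge\; 1 + \tfrac{1}{2} + \tfrac{1}{2} + \tfrac{1}{2} + \cdots,
\]

and the right-hand side grows without bound, even though the individual terms 1/n tend to zero.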
The divergence argument is similar to comparing two improper integrals, except that instead of two integrals we compare one integral with a collection of rectangles approximating it: the terms 1/n are the areas of rectangles lying above the curve y = 1/x, and since the improper integral of 1/x diverges, so does the series.
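To make that comparison precise (a minimal sketch, taking each rectangle of height 1/n over the interval [n, n+1], so it sits above the curve):

\[
\sum_{n=1}^{N} \frac{1}{n}
\;\ge\; \int_{1}^{N+1} \frac{dx}{x}
\;=\; \ln(N+1) \;\longrightarrow\; \infty
\quad \text{as } N \to \infty,
\]

so the partial sums are unbounded and the harmonic series diverges, despite its terms approaching zero.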