Section 6.2 Infinite Series

We know how to add up a finite set of numbers \(a_1 + a_2 + \cdots + a_n\text{.}\) But we'd like to use the completeness of \(\RR\) to make sense of infinite sums:

\begin{equation*} a_1 + a_2 + \cdots + a_j + \cdots\text{.} \end{equation*}

This turns out to take a little care: it is not always possible to assign a value to this infinite sum, and even when it is possible, the sum does not always behave in expected ways.

A good way to make sense of this is to form the new sequence of partial sums \((s_n)\text{,}\) where

\begin{equation*} s_n = a_1 + a_2 + \dots + a_n \end{equation*}

and to define the infinite sum

\begin{equation*} a_1 + a_2 + \cdots + a_j + \cdots = \sum_{j=1}^\infty a_j = \lim_{n \to \infty} s_n \end{equation*}

to be the limit of the sequence \((s_n)\) whenever it exists. The numbers \(s_n\) are called the partial sums of \(\sum_{j=1}^\infty a_j\text{.}\) If the sequence of partial sums \((s_n)\) does not converge, we assign no meaning to \(\sum_{j=1}^\infty a_j\text{.}\) More formally:

Definition 6.2.1.

Given a sequence \((a_{j})\text{,}\) we say that the infinite series \(\sum_{j=1}^\infty a_{j}\) converges if the sequence of partial sums

\begin{equation*} s_{n}=\sum_{j=1}^{n}a_{j} \end{equation*}

converges as \(n\rightarrow\infty\text{.}\) When \((s_n)\) converges we denote its limit by \(\sum_{j=1}^{\infty}a_{j}\text{.}\)

Equivalently, given the sequence \((a_j)\text{,}\) we can define the sequence of partial sums \((s_n)\) recursively by \(s_1 = a_1\) and \(s_{n} = s_{n-1} + a_n\) for \(n \geq 2\text{.}\)
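The recursion for the partial sums translates directly into a short loop. Here is a small numerical sketch of ours (the choice \(a_j = 1/2^j\) is an illustration, not from the text), whose partial sums approach \(1\text{:}\)

```python
# A numerical sketch (our illustration): computing the partial sums s_n via
# the recursion s_1 = a_1, s_n = s_{n-1} + a_n.

def partial_sums(a, n):
    """Return the list [s_1, ..., s_n] of partial sums of the sequence a(j)."""
    sums = []
    s = 0.0
    for j in range(1, n + 1):
        s = s + a(j)        # s_j = s_{j-1} + a_j
        sums.append(s)
    return sums

# For a_j = 1/2^j the series converges to 1, and the partial sums reflect this:
s = partial_sums(lambda j: 1 / 2 ** j, 30)
print(s[-1])  # close to 1
```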

Decimal expansions can be thought of as special cases of infinite series: given a sequence \((b_j)\) with each \(b_j \in \{0, 1, \dots , 9\}\text{,}\) we can interpret the decimal expansion \(0.b_1b_2 \dots\) as the infinite series \(\sum_{j=1}^\infty \frac{b_j}{10^j}\text{.}\)
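As a quick numerical check (our own sketch, using plain floating point), truncating a decimal expansion gives exactly a partial sum of this series; for the repeating expansion \(0.333\ldots\) the partial sums approach \(1/3\text{:}\)

```python
# Our sketch: a decimal expansion 0.b_1 b_2 ... b_n as the partial sum
# sum_{j=1}^{n} b_j / 10^j.

def decimal_partial_sum(digits):
    """Partial sum of sum b_j / 10^j for a finite list of digits b_j."""
    return sum(b / 10 ** j for j, b in enumerate(digits, start=1))

# The repeating expansion 0.333... approaches 1/3:
approx = decimal_partial_sum([3] * 15)
print(abs(approx - 1 / 3))  # tiny
```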

Sometimes we will also work with series which start from a different position, like \(\sum_{j=2}^{\infty} \frac{1}{j(j-1)}\text{,}\) and the definition is the same: we say this series is convergent if the partial sums \(\sum_{j=2}^{n} \frac{1}{j(j-1)}\) converge. Whether we use the dummy variable \(j\) or \(k\) (or anything else) doesn't matter: \(\sum_{j=2}^{\infty} \frac{1}{j(j-1)}\) has exactly the same meaning as \(\sum_{k=2}^{\infty} \frac{1}{k(k-1)}\) or \(\sum_{n=2}^{\infty} \frac{1}{n(n-1)}\text{.}\)
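The series \(\sum_{j=2}^{\infty} \frac{1}{j(j-1)}\) mentioned above happens to have easily computed partial sums: since \(\frac{1}{j(j-1)} = \frac{1}{j-1} - \frac{1}{j}\text{,}\) the sum telescopes and the partial sums are \(1 - \frac{1}{n}\text{.}\) A quick numerical sketch (ours) confirms this:

```python
# Our sketch: partial sums of sum_{j=2}^n 1/(j(j-1)), which telescope to 1 - 1/n.

def telescoping_partial_sum(n):
    return sum(1 / (j * (j - 1)) for j in range(2, n + 1))

n = 10 ** 4
print(telescoping_partial_sum(n))  # close to 1, namely 1 - 1/n
```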

The quintessential example of a convergent series is the geometric series. Recall that for any real number \(r \neq 1\text{,}\)

\begin{equation} \sum_{j=0}^{n} r^{j}=\frac{1-r^{n+1}}{1-r}\text{.}\tag{6.2.1} \end{equation}

If \(|r|\lt 1\text{,}\) then \(|r^{n+1}|=|r|^{n+1}\rightarrow 0\) by Example 5.4.4, and so \(r^{n+1}\rightarrow 0\) as well. Thus,

\begin{equation*} \lim_{n \to \infty} \sum_{j=0}^{n}r^{j} = \lim_{n\to\infty} \frac{1-r^{n+1}}{1-r} = \frac{\lim_{n\to\infty}(1-r^{n+1})}{1-r}=\frac{1- \lim_{n\to\infty} r^{n+1}}{1-r}=\frac{1}{1-r}\text{.} \end{equation*}

Thus, when \(|r|\lt 1\text{,}\) \(\sum_{j=0}^\infty r^{j}\) is convergent with sum \(\frac{1}{1-r}\text{;}\) multiplying each term by a constant \(a\text{,}\) it follows that \(\sum_{j=0}^{\infty} a r^{j}\) is convergent too, with \(\sum_{j=0}^{\infty} a r^{j}=\frac{a}{1-r}\text{.}\) We'll see below that (when \(a \neq 0\)) this series converges if and only if \(|r|\lt 1\text{.}\)
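A numerical sanity check (our sketch, with the illustrative values \(a = 2\) and \(r = 1/2\)): the partial sums computed term by term match the closed form (6.2.1) and approach \(a/(1-r) = 4\text{.}\)

```python
# Our sketch: geometric partial sums versus the closed form (6.2.1).

def geometric_partial_sum(a, r, n):
    """s_n = sum_{j=0}^{n} a * r**j, computed term by term."""
    return sum(a * r ** j for j in range(n + 1))

a, r, n = 2.0, 0.5, 50
closed_form = a * (1 - r ** (n + 1)) / (1 - r)   # formula (6.2.1), scaled by a
assert abs(geometric_partial_sum(a, r, n) - closed_form) < 1e-12
print(geometric_partial_sum(a, r, n))            # close to a / (1 - r) = 4
```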

One useful necessary (but not sufficient!) condition for a series to converge is the following: if \(\sum_{j=1}^\infty a_j\) converges, then \(a_j \to 0\) as \(j \to \infty\text{.}\)

To see this, let \(s_n = \sum_{j=1}^n a_j\text{.}\) By hypothesis, \(s_n \to s\) for some \(s \in \RR\text{.}\) Hence \(a_n = s_n - s_{n-1} \to s-s =0\text{.}\)

As a consequence, we see that the geometric series of Example 6.2.2 diverges when \(|r| \geq 1\) (and \(a \neq 0\)), since in that case the sequence \((ar^j)_{j=0}^\infty\) does not converge to \(0\text{.}\)
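The "not sufficient" warning is worth illustrating. The harmonic series \(\sum_{j=1}^\infty \frac{1}{j}\) (a standard example, not introduced above) has terms tending to \(0\text{,}\) yet its partial sums grow without bound; the sketch below (ours) shows them still climbing:

```python
# Our illustration: the terms 1/j tend to 0, but the partial sums of the
# harmonic series keep growing (in fact they diverge like log n).

def harmonic_partial_sum(n):
    return sum(1 / j for j in range(1, n + 1))

print(harmonic_partial_sum(10 ** 3))  # about 7.49
print(harmonic_partial_sum(10 ** 6))  # about 14.39 -- still climbing
```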

Just as we had rules for manipulating limits of sequences, we also have rules for manipulating infinite sums. The one which follows says that the operation of taking infinite sums is linear. Its proof will use the rules for limits of sequences.

Proposition 6.2.4.

Suppose that \(\sum_{j=1}^\infty a_{j}\) and \(\sum_{j=1}^\infty b_{j}\) both converge, and let \(c \in \RR\text{.}\) Then \(\sum_{j=1}^\infty (a_{j}+b_{j})\) and \(\sum_{j=1}^\infty ca_{j}\) both converge, with

\begin{equation*} \sum_{j=1}^{\infty}(a_{j}+b_{j}) = \sum_{j=1}^{\infty}a_{j}+\sum_{j=1}^{\infty}b_{j} \qquad \text{and} \qquad \sum_{j=1}^{\infty}ca_{j} = c\sum_{j=1}^{\infty}a_{j}\text{.} \end{equation*}

To prove this, we observe that

\begin{align*} \sum_{j=1}^{\infty} (a_{j}+b_{j}) \amp = \lim_{n\rightarrow\infty} \sum_{j=1}^{n}(a_{j}+b_{j}) =\lim_{n\rightarrow\infty} \left(\sum_{j=1}^{n} a_{j}+\sum_{j=1}^{n}b_{j}\right)\\ \amp =\lim_{n\rightarrow\infty} \sum_{j=1}^{n} a_{j}+\lim_{n\rightarrow\infty}\sum_{j=1}^{n}b_{j} =\sum_{j=1}^{\infty} a_{j}+\sum_{j=1}^{\infty}b_{j}\text{.} \end{align*}

Similarly,

\begin{equation*} \sum_{j=1}^{\infty} ca_{j} =\lim_{n\rightarrow\infty}\sum_{j=1}^{n} ca_{j} =c\lim_{n\rightarrow\infty}\sum_{j=1}^{n} a_{j} =c\sum_{j=1}^{\infty} a_{j}\text{.} \end{equation*}
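These two rules are easy to confirm numerically. In the sketch below (ours, with the hypothetical choices \(a_j = 1/2^j\) and \(b_j = 1/3^j\)), partial sums up to \(n = 60\) stand in for the infinite sums:

```python
# Our sketch: checking the linearity rules on truncated series.
n = 60
a = [1 / 2 ** j for j in range(1, n + 1)]
b = [1 / 3 ** j for j in range(1, n + 1)]
c = 5.0

# sum (a_j + b_j) agrees with sum a_j + sum b_j
assert abs(sum(x + y for x, y in zip(a, b)) - (sum(a) + sum(b))) < 1e-12
# sum c * a_j agrees with c * sum a_j
assert abs(sum(c * x for x in a) - c * sum(a)) < 1e-12
print(sum(a), sum(b))  # close to 1 and 1/2 respectively
```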

What about a rule for products of series? Products turn out to be more delicate, and we set them aside for now. A more immediately useful tool is a comparison test: if \(|a_{j}| \leq b_{j}\) for all \(j\) and \(\sum_{j=1}^\infty b_{j}\) converges, then \(\sum_{j=1}^\infty a_{j}\) converges, and \(\left|\sum_{j=1}^\infty a_{j}\right| \leq \sum_{j=1}^\infty b_{j}\text{.}\)

The idea of the proof is to write \(a_j = (a_j - b_j) + b_j\) and to use Proposition 6.2.4. We know that \(\sum_j b_j\) converges by hypothesis, and if we knew that \(\sum_j(a_j - b_j)\) converged we would be in business.

Consider the sequence \((t_n)\) given by

\begin{equation*} t_{n}=\sum_{j=1}^{n}(b_{j}-a_{j})\text{.} \end{equation*}

Since \(a_{j}\leq b_{j}\) for all \(j\text{,}\) every term \(b_{j}-a_{j}\) is nonnegative, so \(t_{n}\geq 0\) for all \(n\) and the sequence \((t_{n})\) is increasing. Next, note that \(-b_{j}\leq a_{j}\) as well, so \(b_{j}-a_{j}\leq 2b_{j}\text{,}\) and hence (using \(b_{j}\geq |a_{j}| \geq 0\))

\begin{equation*} t_{n}\leq \sum_{j=1}^{n}2b_{j}\leq \sum_{j=1}^{\infty}2b_{j}\text{,} \end{equation*}

and since this latter sum converges by Proposition 6.2.4, this means that \((t_{n})\) is bounded above. Thus, by the MCT, \((t_{n})\) converges and \(\lim_{n\to\infty} t_n \geq 0\text{.}\) Therefore, by Proposition 6.2.4 (applied with \(c=-1\)), \(\sum_{j=1}^\infty(a_j - b_j)\) converges to some nonpositive number. We conclude that \(\sum_{j=1}^\infty a_{j}\) converges, with sum \(\sum_{j=1}^\infty b_j + \sum_{j=1}^\infty (a_j - b_j)\text{,}\) whose value is at most \(\sum_{j=1}^\infty b_j\text{.}\) Finally, applying the same reasoning with \(-a_j\) in place of \(a_j\text{,}\) we conclude that

\begin{equation*} \left|\sum_{j=1}^\infty a_j\right| \leq \sum_{j=1}^\infty b_j\text{.} \end{equation*}

For example, since \(\left|\frac{\sin j}{2^{j}}\right|\leq \frac{1}{2^{j}}\) and \(\sum_{j=1}^\infty \frac{1}{2^{j}}\) converges by Example 6.2.2, we can deduce that \(\sum_{j=1}^\infty \frac{\sin j}{2^{j}}\) converges too.
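Numerically (our sketch), the partial sums of \(\sum_{j=1}^\infty \frac{\sin j}{2^{j}}\) do indeed settle down, staying within the comparison bound \(\sum_{j=1}^\infty \frac{1}{2^{j}} = 1\text{:}\)

```python
import math

# Our sketch: partial sums of sin(j)/2^j, with the comparison bound alongside.
n = 60
s = sum(math.sin(j) / 2 ** j for j in range(1, n + 1))
bound = sum(1 / 2 ** j for j in range(1, n + 1))  # partial sum of the bound, <= 1

assert abs(s) <= bound <= 1
print(s)  # roughly 0.59
```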