Convergent series
In mathematics, a series is the sum of the terms of a sequence of numbers.
Given a sequence <math>\left \{ a_1,\ a_2,\ a_3,\dots \right \}</math>, the nth partial sum <math>S_n</math> is the sum of the first n terms of the sequence, that is,
- <math>S_n = \sum_{k=1}^n a_k.</math>
A series is convergent if the sequence of its partial sums <math>\left \{ S_1,\ S_2,\ S_3,\dots \right \}</math> converges. More formally, a series converges if there exists a limit <math>\ell</math> such that for every <math>\varepsilon > 0</math>, there is a (sufficiently large) integer <math>N</math> such that for all <math>n \ge N</math>,
- <math>\left | S_n - \ell \right | \le \varepsilon.</math>
A sequence that is not convergent is said to be divergent.
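The definition can be illustrated numerically. The sketch below (an illustration, not part of the formal definition) computes partial sums <math>S_n</math> for the terms <math>a_k = 1/2^{k-1}</math> and shows them approaching the limit 2:

```python
# Numerical sketch: partial sums S_n of a_k = 1/2**(k-1) approach the limit 2.

def partial_sum(a, n):
    """Return S_n = a(1) + a(2) + ... + a(n) for a term function a."""
    return sum(a(k) for k in range(1, n + 1))

a = lambda k: 1.0 / 2 ** (k - 1)

for n in (5, 10, 20):
    print(n, partial_sum(a, n))  # the values approach 2
```

For any given <math>\varepsilon</math>, taking <math>n</math> large enough brings <math>S_n</math> within <math>\varepsilon</math> of 2, which is exactly the condition in the definition above.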
Examples of convergent and divergent series
- The reciprocals of powers of 2 produce a convergent series (so the set of powers of 2 is "small"):
- <math>{1 \over 1}+{1 \over 2}+{1 \over 4}+{1 \over 8}+{1 \over 16}+{1 \over 32}+\cdots = 2.</math>
- The reciprocals of positive integers produce a divergent series:
- <math>{1 \over 1}+{1 \over 2}+{1 \over 3}+{1 \over 4}+{1 \over 5}+{1 \over 6}+\cdots. </math>
- Alternating the signs of the reciprocals of positive integers produces a convergent series:
- <math>{1 \over 1}-{1 \over 2}+{1 \over 3}-{1 \over 4}+{1 \over 5}-{1 \over 6}+\cdots = \ln 2. </math>
- The reciprocals of prime numbers produce a divergent series (so the set of primes is "large"):
- <math>{1 \over 2}+{1 \over 3}+{1 \over 5}+{1 \over 7}+{1 \over 11}+{1 \over 13}+\cdots .</math>
- The reciprocals of square numbers produce a convergent series (the Basel problem):
- <math>{1 \over 1}+{1 \over 4}+{1 \over 9}+{1 \over 16}+{1 \over 25}+{1 \over 36}+\cdots = {\pi^2 \over 6}.</math>
- Alternating the signs of the reciprocals of positive odd numbers produces a convergent series:
- <math>{1 \over 1}-{1 \over 3}+{1 \over 5}-{1 \over 7}+{1 \over 9}-{1 \over 11}+\cdots = {\pi \over 4}.</math>
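The limits quoted above can be checked numerically. The following sketch (using only the Python standard library) sums a large but finite number of terms of each series; the finite sums approximate the convergent limits, while the harmonic series just keeps growing:

```python
import math

def partial_sum(term, n):
    return sum(term(k) for k in range(1, n + 1))

basel = partial_sum(lambda k: 1.0 / k ** 2, 100000)                     # -> pi^2 / 6
leibniz = partial_sum(lambda k: (-1) ** (k + 1) / (2 * k - 1), 100000)  # -> pi / 4
alt_harm = partial_sum(lambda k: (-1) ** (k + 1) / k, 100000)           # -> ln 2
harmonic = partial_sum(lambda k: 1.0 / k, 100000)                       # grows like ln n

print(basel, math.pi ** 2 / 6)
print(harmonic)  # roughly ln(100000) + 0.577, and still increasing
```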
Convergence tests
There are a number of methods of determining whether a series converges or diverges.
Comparison test. The terms of the sequence <math>\left \{ a_n \right \}</math> are compared to those of another sequence <math>\left \{ b_n \right \}</math>. If <math>0 \le a_n \le b_n</math> for all n and <math>\sum_{n=1}^\infty b_n</math> converges, then so does <math>\sum_{n=1}^\infty a_n</math>. On the other hand, if <math>0 \le b_n \le a_n</math> for all n and <math>\sum_{n=1}^\infty b_n</math> diverges, then so does <math>\sum_{n=1}^\infty a_n</math>.
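As a sketch of the comparison test (the example series here is chosen for illustration, not taken from the text): <math>0 \le 1/(n^2+n) \le 1/n^2</math> for every <math>n \ge 1</math>, and <math>\sum 1/n^2</math> converges, so <math>\sum 1/(n^2+n)</math> converges too. In this case the dominated series telescopes, <math>1/(n^2+n) = 1/n - 1/(n+1)</math>, so its limit can be checked directly:

```python
# Comparison-test sketch: a_n = 1/(n**2 + n) is dominated by b_n = 1/n**2,
# a convergent majorant; a_n telescopes, so S_N = 1 - 1/(N+1) -> 1.

a = lambda n: 1.0 / (n * n + n)   # dominated series; telescopes to 1
b = lambda n: 1.0 / (n * n)       # convergent majorant

assert all(0 <= a(n) <= b(n) for n in range(1, 1001))

S = sum(a(n) for n in range(1, 100001))
print(S)  # close to the limit 1
```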
Ratio test. Assume that for all n, <math>a_n > 0</math>. Suppose that there exists <math>r</math> such that
- <math>\lim_{n \to \infty} \frac{a_{n+1}}{a_n} = r</math>.
If r < 1, then the series converges. If r > 1, then the series diverges. If r = 1, the ratio test is inconclusive, and the series may converge or diverge.
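A small sketch of the ratio test, using the hypothetical example <math>a_n = n/2^n</math> (not from the text): the ratio <math>a_{n+1}/a_n = (n+1)/(2n)</math> tends to <math>r = 1/2 < 1</math>, so the series converges.

```python
# Ratio-test sketch for a_n = n / 2**n: the ratio a_{n+1}/a_n tends to 1/2 < 1.

a = lambda n: n / 2.0 ** n

def ratio(n):
    return a(n + 1) / a(n)

print(ratio(10), ratio(1000))  # approaching 0.5
```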
Root test or nth root test. Suppose that the terms of the sequence in question are non-negative, and that there exists r such that
- <math> \lim_{n \to \infty} \sqrt[n]{a_n} = r.</math>
If r < 1, then the series converges. If r > 1, then the series diverges. If r = 1, the root test is inconclusive, and the series may converge or diverge.
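A sketch of the root test on the hypothetical example <math>a_n = n^2/3^n</math> (chosen for illustration): since <math>(n^2)^{1/n} \to 1</math> and <math>(3^n)^{1/n} = 3</math>, the nth roots tend to <math>r = 1/3 < 1</math>, so the series converges.

```python
# Root-test sketch for a_n = n**2 / 3**n: the nth roots tend to 1/3 < 1.

a = lambda n: n ** 2 / 3.0 ** n

def nth_root(n):
    return a(n) ** (1.0 / n)

print(nth_root(500))  # approaching 1/3
```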
The ratio test and the root test are both based on comparison with a geometric series, and as such they work in similar situations. In fact, if the ratio test works (meaning that the limit exists and is not equal to 1) then so does the root test; the converse, however, is not true. The root test is therefore more generally applicable, but as a practical matter the limit is often difficult to compute for commonly seen types of series.
Integral test. The series can be compared to an integral to establish convergence or divergence. Let <math>f</math> be a positive, monotone decreasing function defined on <math>[1, \infty)</math> with <math>f(n) = a_n</math> for all n. If
- <math>\int_{1}^{\infty} f(x)\, dx = \lim_{t \to \infty} \int_{1}^{t} f(x)\, dx < \infty,</math>
then the series converges. But if the integral diverges, then the series does so as well.
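A sketch of the integral test for <math>a_n = 1/n^2</math> with <math>f(x) = 1/x^2</math>: the antiderivative is <math>-1/x</math>, so the improper integral equals 1 (finite), and the standard bounds <math>\int_1^{N+1} f \le S_N \le a_1 + \int_1^N f</math> confirm that the partial sums stay bounded.

```python
# Integral-test sketch for a_n = 1/n**2, f(x) = 1/x**2:
# the integral from 1 to t is 1 - 1/t, which tends to 1 (finite).

def f(x):
    return 1.0 / (x * x)

def integral_1_to(t):      # exact value of the integral of f from 1 to t
    return 1.0 - 1.0 / t

N = 10000
S_N = sum(f(n) for n in range(1, N + 1))

# Standard bounds: integral_1^{N+1} f <= S_N <= a_1 + integral_1^N f
assert integral_1_to(N + 1) <= S_N <= f(1) + integral_1_to(N)
print(S_N)  # bounded, consistent with convergence
```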
Limit comparison test. If <math>\left \{ a_n \right \}, \left \{ b_n \right \} > 0</math>, and the limit <math>\lim_{n \to \infty} \frac{a_n}{b_n}</math> exists and is not zero, then <math>\sum_{n=1}^\infty a_n</math> converges if and only if <math>\sum_{n=1}^\infty b_n</math> converges.
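A sketch of the limit comparison test, with the hypothetical pair <math>a_n = 1/(n^2+5n+3)</math> and <math>b_n = 1/n^2</math> (an illustrative choice, not from the text): the ratio <math>a_n/b_n = n^2/(n^2+5n+3)</math> tends to 1, a finite nonzero limit, so <math>\sum a_n</math> converges precisely because <math>\sum b_n</math> does.

```python
# Limit-comparison sketch: a_n/b_n -> 1, a finite nonzero limit,
# so the two series converge or diverge together.

a = lambda n: 1.0 / (n * n + 5 * n + 3)
b = lambda n: 1.0 / (n * n)

print(a(10) / b(10), a(10 ** 6) / b(10 ** 6))  # tending to 1
```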
Alternating series test. Also known as the Leibniz criterion, the alternating series test states that an alternating series of the form <math>\sum_{n=1}^\infty a_n (-1)^n</math>, with <math>a_n \ge 0</math>, converges if <math>\left \{ a_n \right \}</math> is monotone decreasing and has a limit of 0.
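A sketch of the Leibniz criterion using the alternating harmonic series from the examples above: the terms <math>1/n</math> decrease monotonically to 0, so the series converges (to <math>\ln 2</math>), and the limit is trapped between any two consecutive partial sums.

```python
# Leibniz-criterion sketch: consecutive partial sums of the alternating
# harmonic series bracket the limit ln 2.
import math

def S(n):
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

lo, hi = sorted((S(1000), S(1001)))
assert lo <= math.log(2) <= hi
print(S(1000), math.log(2), S(1001))
```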
Cauchy condensation test. If <math>\left \{ a_n \right \}</math> is a monotone decreasing sequence, then <math> \sum_{n=1}^\infty a_n </math> converges if and only if <math> \sum_{k=1}^\infty 2^k a_{2^k} </math> converges.
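A sketch of the condensation test applied to the harmonic series <math>a_n = 1/n</math>: the condensed terms <math>2^k a_{2^k} = 2^k \cdot (1/2^k)</math> all equal 1, so the condensed series diverges, and by the condensation test the harmonic series diverges with it.

```python
# Condensation sketch for a_n = 1/n: every condensed term 2**k * a(2**k)
# equals 1, so the condensed series (and hence the harmonic series) diverges.

a = lambda n: 1.0 / n
condensed = [2 ** k * a(2 ** k) for k in range(12)]
print(condensed)  # every entry equals 1.0
```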
Conditional and absolute convergence
For any sequence <math>\left \{ a_1,\ a_2,\ a_3,\dots \right \}</math>, <math>a_n \le \ \left | a_n \right \vert</math> for all n. Therefore,
- <math>\sum_{n=1}^\infty a_n \le \ \sum_{n=1}^\infty \left | a_n \right \vert.</math>
Consequently, if <math>\sum_{n=1}^\infty \left | a_n \right \vert</math> converges, then <math>\sum_{n=1}^\infty a_n</math> also converges; the converse, however, does not hold.
If the series <math>\sum_{n=1}^\infty \left | a_n \right \vert</math> converges, then the series <math>\sum_{n=1}^\infty a_n</math> is absolutely convergent. An absolutely convergent series is one in which the length of the path created by joining together all of the increments to the partial sum is finite. The power series of the exponential function is absolutely convergent everywhere.
If the series <math>\sum_{n=1}^\infty a_n</math> converges but the series <math>\sum_{n=1}^\infty \left | a_n \right \vert</math> diverges, then the series <math>\sum_{n=1}^\infty a_n</math> is conditionally convergent. The path formed by connecting the partial sums of a conditionally convergent series is infinitely long. The power series of the logarithm <math>\ln(1+x)</math> is conditionally convergent at <math>x = 1</math>.
The Riemann series theorem states that if a series converges conditionally, it is possible to rearrange the terms of the series in such a way that the series converges to any value, or even diverges.
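The rearrangement behind the theorem can be sketched as a greedy algorithm on the conditionally convergent alternating harmonic series: add unused positive terms <math>1/(2k-1)</math> while the running sum is below the target, and unused negative terms <math>-1/(2k)</math> while it is above. Because the positive and negative parts each diverge while the individual terms tend to 0, the partial sums close in on any chosen target.

```python
# Sketch of the greedy rearrangement behind the Riemann series theorem,
# applied to the alternating harmonic series: steer the partial sums
# toward an arbitrary target value.

def rearranged_sum(target, steps):
    s, pos, neg = 0.0, 1, 1        # counters for the next odd/even reciprocal
    for _ in range(steps):
        if s <= target:
            s += 1.0 / (2 * pos - 1)   # next unused positive term
            pos += 1
        else:
            s -= 1.0 / (2 * neg)       # next unused negative term
            neg += 1
    return s

print(rearranged_sum(1.5, 200000))  # close to 1.5
```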
Uniform convergence
- Main article: uniform convergence.
Let <math>\left \{ f_1,\ f_2,\ f_3,\dots \right \}</math> be a sequence of functions. The series <math>\sum_{n=1}^\infty f_n</math> is said to converge uniformly to f if the sequence <math>\{s_n\}</math> of partial sums defined by
- <math> s_n(x) = \sum_{k=1}^n f_k (x)</math>
converges uniformly to f.
There is an analogue of the comparison test for infinite series of functions called the Weierstrass M-test.
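A sketch of the M-test on a hypothetical example (not from the text): <math>f_n(x) = \sin(nx)/n^2</math> satisfies <math>|f_n(x)| \le M_n = 1/n^2</math> for every real x, and <math>\sum M_n</math> converges, so <math>\sum f_n</math> converges uniformly. The code below merely spot-checks the bound on a grid; it is an illustration, not a proof.

```python
# Weierstrass M-test sketch: |sin(n*x)/n**2| <= 1/n**2 for all x,
# and sum 1/n**2 converges, giving uniform convergence of sum f_n.
import math

def f(n, x):
    return math.sin(n * x) / n ** 2

grid = [i * 0.1 for i in range(-50, 51)]
assert all(abs(f(n, x)) <= 1.0 / n ** 2 for n in range(1, 50) for x in grid)
print("M-test bound holds on the sample grid")
```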
Cauchy convergence criterion
The Cauchy convergence criterion states that a series
- <math>\sum_{n=1}^\infty a_n</math>
converges if and only if the sequence of partial sums is a Cauchy sequence. This means that for every <math> \varepsilon > 0, </math> there is a positive integer <math>N</math> such that for all <math>n \geq m \geq N</math> we have
- <math> \left| \sum_{k=m}^n a_k \right| < \varepsilon, </math>
which is equivalent to
- <math>\lim_{n \to \infty \atop m\to \infty} \sum_{k=n}^{n+m} a_k = 0.</math>
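The criterion can be sketched numerically by examining tail blocks <math>|a_m + \cdots + a_n|</math>: for the convergent series <math>\sum 1/n^2</math> they become uniformly small, while for the harmonic series the block <math>1/(m+1) + \cdots + 1/(2m)</math> never drops below <math>1/2</math> (it tends to <math>\ln 2</math>), so the Cauchy condition fails.

```python
# Cauchy-criterion sketch: tail blocks shrink for sum 1/n**2 but not
# for the harmonic series, whose blocks 1/(m+1)+...+1/(2m) exceed 1/2.

def block(a, m, n):
    return abs(sum(a(k) for k in range(m, n + 1)))

sq = lambda k: 1.0 / (k * k)
harm = lambda k: 1.0 / k

print(block(sq, 1000, 2000))    # small, and shrinking as m grows
print(block(harm, 1001, 2000))  # about ln 2, no matter how large m is
```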