# Limits of Infinite Sequences

## Key Questions

• The limit of an infinite sequence describes the sequence's long-term behaviour.

Given a sequence of real numbers ${a}_{n}$, its limit ${\lim}_{n \to \infty} {a}_{n} = \lim {a}_{n}$ is defined as the single value the sequence approaches (if it approaches any value at all) as the index $n$ grows without bound. The limit of a sequence does not always exist. If it does, the sequence is said to be convergent; otherwise, it is said to be divergent.

Two simple examples:

• Consider the sequence $\frac{1}{n}$. It is easy to see that its limit is $0$. In fact, given any positive value close to $0$, we can always find a large enough value of $n$ such that $\frac{1}{n}$ is less than this given value, which means that the limit must be less than or equal to zero. Also, every term of the sequence is greater than zero, so the limit must be greater than or equal to zero. Therefore, the limit is $0$.

• Take the constant sequence $1$. That is, for any given value of $n$, the term ${a}_{n}$ of the sequence is equal to $1$. Clearly, no matter how big we make $n$, the value of the sequence is $1$, so its limit is $1$.

For a more rigorous definition, let ${a}_{n}$ be a sequence of real numbers (that is, $\forall n \in \mathbb{N} : {a}_{n} \in \mathbb{R}$). Then the number $a$ is said to be the limit of the sequence ${a}_{n}$ if and only if:

$\forall \epsilon > 0 \exists N \in \mathbb{N} : n > N \implies | {a}_{n} - a | < \epsilon$

This definition is equivalent to the informal definition given above, except that we do not need to impose uniqueness of the limit: it can be deduced from the definition itself.
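The $\epsilon$–$N$ definition can be explored numerically. Here is a minimal Python sketch for the sequence ${a}_{n} = \frac{1}{n}$ with limit $a = 0$; the helper name `n_threshold` is my own, not standard terminology.

```python
import math

def n_threshold(eps):
    # For a_n = 1/n and limit a = 0, |1/n - 0| < eps holds whenever
    # n > 1/eps, so N = ceil(1/eps) is a valid threshold in the definition.
    return math.ceil(1 / eps)

for eps in (0.1, 0.01, 0.001):
    N = n_threshold(eps)
    # Spot-check the definition for a sample of indices beyond N.
    assert all(abs(1 / n - 0) < eps for n in range(N + 1, N + 1000))
    print(f"eps = {eps}: N = {N}")
```

Note that such a finite check only illustrates the definition; the actual proof is the inequality $n > N \geq \frac{1}{\epsilon} \implies \frac{1}{n} < \epsilon$.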

• A sequence is said to be convergent if its limit exists.

Otherwise, it is said to be divergent.

It must be emphasized that if the limit of a sequence ${a}_{n}$ is infinite, that is, ${\lim}_{n \to \infty} {a}_{n} = \infty$ or ${\lim}_{n \to \infty} {a}_{n} = -\infty$, the sequence is also said to be divergent.

A few examples of convergent sequences are:

• $\frac{1}{n}$, with ${\lim}_{n \to \infty} \frac{1}{n} = 0$
• The constant sequence $c$, with $c \in \mathbb{R}$ and ${\lim}_{n \to \infty} c = c$
• ${\left(1 + \frac{1}{n}\right)}^{n}$, with ${\lim}_{n \to \infty} {\left(1 + \frac{1}{n}\right)}^{n} = e$ where $e$ is the base of the natural logarithms (also called Euler's number).
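The third example can be checked numerically. This short Python sketch evaluates ${\left(1 + \frac{1}{n}\right)}^{n}$ for growing $n$ and compares it with $e$ (available as `math.e`):

```python
import math

def a(n):
    # n-th term of the sequence (1 + 1/n)^n, which converges to e.
    return (1 + 1 / n) ** n

for n in (10, 1000, 100000):
    print(n, a(n), abs(a(n) - math.e))
```

The gap $|a_n - e|$ shrinks roughly like $\frac{e}{2n}$, so convergence is visible but slow.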

Convergent sequences play a central role in various fields of Mathematics, from establishing the foundations of calculus, to solving problems in Functional Analysis, to motivating the development of Topology.

• In general, there is no procedure that yields the limit of an arbitrary convergent sequence. That does not mean, however, that limits cannot be found.

For example, take the sequence ${a}_{n} = \frac{1}{n}$. It is easy to see that ${\lim}_{n \to \infty} {a}_{n} = 0$ (just note that, for any positive number $\epsilon$ close to $0$, we can find a value of $n$ big enough such that $\frac{1}{n} < \epsilon$).

Generally, determining the limit of a convergent sequence is not trivial at all, and there are many problems of this kind that went unresolved for years or remain unresolved today.

The question of whether a sequence is convergent or not is easier to answer, even without knowing its limit (in the convergent case), thanks to Cauchy's criterion.
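Cauchy's criterion says a real sequence converges if and only if its terms eventually cluster arbitrarily close together. A finite computation can only suggest, never prove, that a sequence is Cauchy, but the behaviour is easy to observe. The helper name `tail_spread` below is my own:

```python
def tail_spread(a, N, window=1000):
    # Largest gap |a_m - a_n| over a finite sample of indices beyond N.
    # For a Cauchy sequence this spread shrinks toward 0 as N grows.
    terms = [a(n) for n in range(N + 1, N + window + 1)]
    return max(terms) - min(terms)

a = lambda n: 1 / n  # convergent, hence Cauchy

for N in (10, 100, 1000):
    print(N, tail_spread(a, N))
```

For $a_n = \frac{1}{n}$ the printed spreads decrease toward $0$, consistent with the sequence being Cauchy.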

One famous example of an enduring question is the Basel problem. It asks the following:

What is the value of the infinite sum of the reciprocals of the squares of the natural numbers?

Or equivalently:

${\sum}_{k = 1}^{\infty} \frac{1}{{k}^{2}} = {\lim}_{n \to \infty} {\sum}_{k = 1}^{n} \frac{1}{{k}^{2}} = ?$

We can reformulate this problem in terms of sequences, by defining:

${s}_{n} = {\sum}_{k = 1}^{n} \frac{1}{{k}^{2}}$

Then the question becomes a problem of finding the limit of a sequence:

${\lim}_{n \to \infty} {s}_{n} = ?$

It was first posed in 1644 and was solved by Leonhard Euler only in 1735 ($91$ years later), using Taylor polynomials to represent functions.
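Euler's answer is ${\lim}_{n \to \infty} {s}_{n} = \frac{{\pi}^{2}}{6}$, and the partial sums can be computed directly to watch the convergence:

```python
import math

def s(n):
    # n-th partial sum of the Basel series: s_n = sum_{k=1}^{n} 1/k^2.
    return sum(1 / k**2 for k in range(1, n + 1))

target = math.pi**2 / 6  # Euler's value for the limit

for n in (10, 100, 10000):
    print(n, s(n), target - s(n))
```

The remainder $\frac{{\pi}^{2}}{6} - s_n$ behaves roughly like $\frac{1}{n}$, so many terms are needed for high accuracy.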