Limits of Infinite Sequences

Key Questions

  • The limit of an infinite sequence tells us about its long-term behaviour.

    Given a sequence of real numbers #a_n#, its limit #lim_(n to oo) a_n = lim a_n# is defined as the single value the sequence approaches (if it approaches any value at all) as the index #n# grows larger. The limit of a sequence does not always exist. If it does, the sequence is said to be convergent; otherwise it is said to be divergent.

    Two simple examples:

    • Consider the sequence #1/n#. It is easy to see that its limit is #0#. In fact, given any positive value close to #0#, we can always find a large enough value of #n# such that #1/n# is less than this given value, which means that the limit must be less than or equal to zero. Also, every term of the sequence is greater than zero, so the limit must be greater than or equal to zero. Therefore, the limit is #0#.

    • Take the constant sequence #1#. That is, for any given value of #n#, the term #a_n# of the sequence is equal to #1#. It is clear that no matter how big we make #n#, the value of the sequence is #1#. So its limit is #1#.

    For a more rigorous definition, let #a_n# be a sequence of real numbers (that is, #forall n in NN : a_n in RR#). Then the number #a in RR# is said to be the limit of the sequence #a_n# if and only if:

    #forall epsilon>0, exists N in NN : forall n>N, |a_n - a| < epsilon#

    This definition is equivalent to the informal one given above, except that we do not need to impose uniqueness of the limit (it can be deduced). A small numerical check of this definition for the sequence #1/n# is sketched below.
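    As a minimal sketch (not part of the original answer), the Python snippet below checks the #epsilon#-#N# definition numerically for #a_n = 1/n# and #a = 0#. The helper names `a` and `find_N` and the chosen values of #epsilon# are illustrative assumptions, not standard routines.

    ```python
    # Minimal sketch: numerically checking the epsilon-N definition
    # for the sequence a_n = 1/n, whose limit is a = 0.

    def a(n: int) -> float:
        """The sequence a_n = 1/n."""
        return 1.0 / n

    def find_N(epsilon: float, limit: float = 0.0) -> int:
        """Return an N such that |a_n - limit| < epsilon for all n > N.

        Because 1/n is strictly decreasing, it is enough to find the first
        N with |a_(N+1) - limit| < epsilon; all later terms stay closer.
        """
        N = 1
        while abs(a(N + 1) - limit) >= epsilon:
            N += 1
        return N

    for eps in (0.1, 0.01, 0.001):
        N = find_N(eps)
        print(f"epsilon = {eps}: N = {N}, |a_(N+1) - 0| = {abs(a(N + 1)):.6f}")
    ```

    For #1/n# one could of course take #N# to be any integer at least #1/epsilon#; the search above is only there to mirror the definition mechanically.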

  • A sequence is said to be convergent if its limit exists.

    Otherwise, it is said to be divergent.

    It must be emphasized that if the limit of a sequence #a_n# is infinite, that is #lim_(n to oo) a_n = oo# or #lim_(n to oo) a_n = -oo#, the sequence is also said to be divergent.

    A few examples of convergent sequences are:

    • #1/n#, with #lim_(n to oo) 1/n = 0#
    • The constant sequence #c#, with #c in RR# and #lim_(n to oo) c = c#
    • #(1+1/n)^n#, with #lim_(n to oo) (1+1/n)^n = e# where #e# is the base of the natural logarithms (also called Euler's number).

    Convergent sequences play a central role in many fields of Mathematics, from establishing the foundations of calculus, to solving problems in Functional Analysis, to motivating the development of Topology. A numerical illustration of the third example above, #(1+1/n)^n to e#, is sketched below.
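    As a minimal sketch (not part of the original answer), the Python snippet below evaluates #(1+1/n)^n# for a few growing values of #n# and compares the result with `math.e`; the specific checkpoints for #n# are arbitrary choices.

    ```python
    # Minimal sketch: watching (1 + 1/n)^n approach Euler's number e.
    import math

    for n in (1, 10, 100, 10_000, 1_000_000):
        term = (1 + 1 / n) ** n
        print(f"n = {n:>9}: (1 + 1/n)^n = {term:.8f}, |term - e| = {abs(term - math.e):.2e}")
    ```

    The error shrinks roughly like #e/(2n)#, so the convergence is visible but slow.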

  • In general, there is no procedure that gives the limit of an arbitrary convergent sequence. That does not mean, however, that limits cannot be found.

    For example, take the sequence #a_n = 1/n#. It is easily seen that #lim_(n to oo) a_n = 0# (just note that, for any positive number #epsilon# close to #0#, we can find a value of #n# large enough that #1/n < epsilon#).

    In general, finding the limit of a convergent sequence is not trivial at all, and there are many examples of such problems that went unresolved for years or are still unresolved.

    The question of whether a sequence is convergent or not is often easier to answer, even without knowing its limit, thanks to Cauchy's criterion: a sequence of real numbers converges if and only if its terms eventually become arbitrarily close to one another, that is, #forall epsilon>0, exists N in NN : forall m,n>N, |a_m - a_n| < epsilon#.

    One famous example of an enduring question is the Basel problem. It asks the following:

    What is the value of the infinite sum of the reciprocals of the squares of the natural numbers?

    Or equivalently:

    #sum_(k=1)^(oo) 1/(k^2) = lim_(n to oo) sum_(k=1)^(n) 1/(k^2)=?#

    We can reformulate this problem in terms of sequences, by defining:

    #s_n=sum_(k=1)^(n) 1/(k^2)#

    Then the question becomes a problem of finding the limit of a sequence:

    #lim_(n to oo) s_n = ?#

    It was first posed in 1644 and was solved by Leonhard Euler only in 1735 (91 years later), who obtained the value #pi^2/6# using Taylor series to represent functions. A numerical look at the partial sums #s_n# is sketched below.
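    As a minimal sketch (not part of the original answer), the Python snippet below computes the partial sums #s_n# and compares them with Euler's value #pi^2/6#; the chosen checkpoints for #n# are arbitrary.

    ```python
    # Minimal sketch: partial sums s_n = sum_{k=1}^{n} 1/k^2 of the Basel
    # problem, compared with Euler's value pi^2 / 6.
    import math

    target = math.pi ** 2 / 6   # Euler's 1735 result

    s = 0.0
    for k in range(1, 100_001):
        s += 1.0 / (k * k)                      # s is now the partial sum s_k
        if k in (10, 100, 1_000, 100_000):
            print(f"n = {k:>7}: s_n = {s:.8f}, pi^2/6 - s_n = {target - s:.2e}")
    ```

    The remainder #pi^2/6 - s_n# behaves roughly like #1/n#, so the partial sums approach the exact value #pi^2/6 ~~ 1.6449# quite slowly.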
