Using the definition of convergence, how do you prove that the sequence #{5+(1/n)}# converges from n=1 to infinity?

1 Answer
Jun 29, 2018

Let:

#a_n = 5+1/n#

then for any #m,n in NN# with #n > m#:

#abs (a_m-a_n) = abs( (5+1/m) -(5+1/n))#

#abs (a_m-a_n) = abs( 5+1/m -5-1/n)#

#abs (a_m-a_n) = abs( 1/m -1/n) #

as #n > m => 1/n < 1/m#:

#abs (a_m-a_n) = 1/m -1/n#

and as #1/n > 0#:

#abs (a_m-a_n) < 1/m#.
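The bound above can be spot-checked numerically. The sketch below (illustration only, not part of the proof) evaluates #a_n = 5+1/n# for a few pairs #m < n# and confirms that #abs(a_m - a_n)# equals #1/m - 1/n# and is below #1/m#:

```python
def a(n):
    # The sequence a_n = 5 + 1/n
    return 5 + 1/n

# Check |a_m - a_n| = 1/m - 1/n < 1/m for sample pairs m < n
for m, n in [(1, 2), (3, 10), (100, 1000)]:
    diff = abs(a(m) - a(n))
    assert abs(diff - (1/m - 1/n)) < 1e-12   # equality up to rounding
    assert diff < 1/m                         # the stated bound
    print(f"m={m}, n={n}: |a_m - a_n| = {diff:.6f} < 1/m = {1/m:.6f}")
```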

Given any real number #epsilon > 0#, choose an integer #N > 1/epsilon#.

For any integers #m,n > N# we have #1/m < 1/N# and, since #N > 1/epsilon#, also #1/N < epsilon#, so:

#abs (a_m-a_n) < 1/m < 1/N#

#abs (a_m-a_n) < epsilon#

which proves that the sequence satisfies Cauchy's condition. Since #RR# is complete, every Cauchy sequence of real numbers converges, so #{a_n}# converges. In fact the limit is #5#: for #n > N# we have #abs(a_n - 5) = 1/n < epsilon#, which is exactly the definition of convergence.
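The #epsilon#-#N# argument can also be illustrated numerically. This sketch (an illustration under the proof's choices, not a proof itself) picks #N > 1/epsilon# as above for a few values of #epsilon# and spot-checks the Cauchy condition on a sample of indices #m, n > N#:

```python
import math

def a(n):
    # The sequence a_n = 5 + 1/n
    return 5 + 1/n

for eps in [0.5, 0.1, 0.01]:
    # Any integer N > 1/epsilon works; floor(1/eps) + 1 is one such choice
    N = math.floor(1/eps) + 1
    sample = range(N + 1, N + 50)
    # Every pair m, n > N in the sample should satisfy |a_m - a_n| < eps
    assert all(abs(a(m) - a(n)) < eps for m in sample for n in sample)
    # The same N also witnesses the limit: |a_n - 5| < eps for n > N
    assert all(abs(a(n) - 5) < eps for n in sample)
    print(f"epsilon={eps}: N={N}, Cauchy condition holds on the sample")
```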