How do you apply the ratio test to determine whether #sum_(n=2)^oo 10^n/(ln(n))^n# is convergent or divergent?

1 Answer
Dec 31, 2017

The Ratio Test can be used to show that this series converges.

Explanation:

Let #a_{n}=10^{n}/((ln(n))^{n})#. Then

#a_{n+1}/a_{n}=10^{n+1}/((ln(n+1))^{n+1}) * ((ln(n))^{n})/10^{n}#

#=10/ln(n+1) * (ln(n)/ln(n+1))^{n}#

Note that #(ln(n)/ln(n+1))^{n} <= 1# for all #n geq 2#, since #ln(n) < ln(n+1)#. (L'Hopital's Rule can be used to show that this expression actually tends to 1 as #n->infty#, but for the Ratio Test it is enough to note that it is bounded.)
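As a quick numerical sanity check (a sketch using Python's standard `math` module; the helper name `factor` is just for illustration), the bounded factor stays at or below 1 and creeps toward 1:

```python
import math

# factor(n) = (ln(n)/ln(n+1))^n, the bounded piece of the ratio.
# Since ln(n) < ln(n+1), the base is < 1, so the power is < 1 as well.
def factor(n: int) -> float:
    return (math.log(n) / math.log(n + 1)) ** n

for n in [2, 10, 100, 10_000, 1_000_000]:
    f = factor(n)
    assert f <= 1.0  # never exceeds 1
    print(n, f)
```

The printed values increase toward 1 (roughly 0.40 at #n=2# up to about 0.93 at #n=10^6#), consistent with the limit being 1.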

All this implies that

#a_{n+1}/a_{n}=10/ln(n+1) * (ln(n)/ln(n+1))^{n}->0=L < 1# as #n->infty#.

Therefore, the Ratio Test implies that #sum_{n=2}^{infty}a_{n}=sum_{n=2}^{infty}10^{n}/((ln(n))^{n})# converges.
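For intuition, the ratio from the calculation above can be evaluated numerically (a sketch using Python's `math` module; the helper name `ratio` is hypothetical). Note that the ratio only drops below 1 once #ln(n+1)# is comparable to 10, so small #n# values give ratios above 1 even though the limit is 0:

```python
import math

# ratio(n) = a_{n+1}/a_n = 10/ln(n+1) * (ln(n)/ln(n+1))^n,
# the closed form derived in the answer above.
def ratio(n: int) -> float:
    return 10 / math.log(n + 1) * (math.log(n) / math.log(n + 1)) ** n

for n in [10, 1_000, 100_000, 10_000_000]:
    print(n, ratio(n))
```

The output shows the ratio above 1 for #n=10# and #n=1000#, but below 1 by #n=10^5# and shrinking thereafter, like #10/ln(n)#, toward the limit #L=0#.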