How do you prove: #lim_(x->0) sin(x)/x = 1# without using l'Hôpital's rule (or the derivative of #sin(x)# at all)?

I saw a proof which uses the limit definition of a derivative to prove #d/(dx)sin(x)=cos(x)#
but part of the proof relied upon assuming that:
#lim_(x->0) sin(x)/x = 1#.
It is not shown explicitly in the proof how this limit is evaluated.
The only way I know how to evaluate that limit is using l'Hôpital's rule, which assumes the derivative of #sin(x)# is already known to be #cos(x)#; that leads to circular logic and invalidates the proof.

I was hoping someone might be able to present a method which does not take the derivative of #sin(x)# to find the limit (and is also a bit more algebraically rigorous than sticking numbers in a spreadsheet and seeing that they approach one). Thanks!

For context the proof I saw is here:

1 Answer
Jan 2, 2018

Use the squeeze theorem.

Explanation:

A proof using geometry and the squeeze theorem is here:
https://socratic.org/questions/how-do-you-use-the-squeeze-theorem-to-find-lim-sin-x-x-as-x-approaches-zero
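For reference, the geometric argument behind the linked proof can be summarized as follows (a sketch only, using the standard unit-circle area comparison; valid for #0 < x < pi/2# and extended to negative #x# because #sin(x)/x# is even):

```latex
% Area comparison on the unit circle, for 0 < x < pi/2:
%   area of inner triangle <= area of circular sector <= area of outer triangle
\[
  \tfrac{1}{2}\sin x \;\le\; \tfrac{1}{2}x \;\le\; \tfrac{1}{2}\tan x
\]
% Divide through by (1/2) sin x > 0, then take reciprocals (reversing the inequalities):
\[
  \cos x \;\le\; \frac{\sin x}{x} \;\le\; 1
\]
% Both sin(x)/x and cos x are even, so these bounds hold for all 0 < |x| < pi/2,
% which is exactly the setup the squeeze theorem needs.
```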

The Squeeze Theorem can be proved directly from the definition of the limit of a function.

Outline:

Suppose (1) #f(x) <= g(x) <= h(x)# for all #x# in some open interval containing #c# (except possibly at #c#)

also suppose that (2) #lim_(xrarrc)f(x) = lim_(xrarrc)h(x) = L#

Given #epsilon > 0#

By (2) we can make #abs(f(x)-L) < epsilon#, which together with (1) gives

#f(x) <= g(x) <= h(x)# and
#L-epsilon < f(x) < L+epsilon# (for #abs(x-c) <# some #delta_1#)

By (2) we can also make #abs(h(x)-L) < epsilon#, which together with (1) gives

#f(x) <= g(x) <= h(x)# and
#L-epsilon < h(x) < L+epsilon# (for #abs(x-c) <# some #delta_2#)

Now using #delta = min{delta_1, delta_2}# we get, for #0 < abs(x-c) < delta#

#L-epsilon < f(x) <= g(x) <= h(x) < L + epsilon#

So #abs(g(x) - L) < epsilon#

(OK I was going to type an outline but that's a fairly complete proof.)
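To close the loop on the original question, the theorem can then be applied with #f(x) = cos x#, #g(x) = sin(x)/x#, and #h(x) = 1# (a sketch; the inequality near #0# comes from the geometric argument in the linked answer):

```latex
% Squeeze theorem applied with f(x) = cos x, g(x) = sin(x)/x, h(x) = 1, c = 0:
\[
  \cos x \;\le\; \frac{\sin x}{x} \;\le\; 1
  \qquad \text{for } 0 < |x| < \tfrac{\pi}{2}
\]
\[
  \lim_{x\to 0}\cos x = 1 = \lim_{x\to 0} 1
  \quad\Longrightarrow\quad
  \lim_{x\to 0}\frac{\sin x}{x} = 1
\]
% The limit lim_{x->0} cos x = 1 needs no derivatives either: from |sin t| <= |t|
% we get 0 <= 1 - cos x = 2 sin^2(x/2) <= x^2/2, and squeeze again.
```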