An arrow is shot horizontally off of a tower that is 33m high. The arrow lands on the ground 31m from the base of the tower. How long did it take to hit the ground? What was the arrow's original velocity? How far from the tower was the arrow after 0.13s?

1 Answer

a) The arrow took 2.594 seconds to hit the ground.
b) Its initial velocity was #11.95ms^-1#.
c) It was 1.55 m from the base of the tower after 0.13 s.

Explanation:

a) As the arrow was shot horizontally, the vertical component of its initial velocity is zero. We calculate the time the arrow is in the air using the formula #s=ut+1/2at^2#; since the initial vertical velocity is zero, #u=0#.

Hence #33=1/2(9.81)t^2#

and #t=sqrt(66/9.81)=2.594# seconds
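The time-of-flight calculation can be checked numerically; a minimal sketch, assuming #g=9.81ms^-2# as in the working above:

```python
import math

g = 9.81   # gravitational acceleration, m/s^2
h = 33.0   # height of the tower, m

# s = (1/2) g t^2  =>  t = sqrt(2s/g)
t = math.sqrt(2 * h / g)
print(round(t, 3))  # 2.594
```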

b) During the time that the arrow is in the air it travels 31 m from the base of the tower. The horizontal velocity is constant (ignoring air resistance), so the initial horizontal velocity can be calculated like this:

#v=s/t=31/2.594=11.95ms^-1#
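A short sketch of the same division, reusing the flight time from part (a):

```python
import math

t = math.sqrt(2 * 33.0 / 9.81)  # flight time from part (a), s
x = 31.0                        # horizontal distance travelled, m

v = x / t  # horizontal velocity is constant while the arrow falls
print(round(v, 2))  # 11.95
```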

c) Similar to part (b), we use the arrow's horizontal velocity and the time it is in the air to calculate distance.

#s=vt=11.95xx0.13=1.55m#
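The final step is a single multiplication; a sketch using the rounded velocity from part (b):

```python
v = 11.95  # horizontal velocity from part (b), m/s
t = 0.13   # elapsed time, s

s = v * t  # distance from the tower at time t
print(round(s, 2))  # 1.55
```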

Hope this helps :)