A projectile is shot from the ground at a velocity of #5 m/s# and at an angle of #(2pi)/3#. How long will it take for the projectile to land?

1 Answer
Jan 7, 2017

#t~~0.88s#

Explanation:

Assuming the launch and landing points are at equal altitudes, we can use a kinematic equation to find the time from launch to maximum altitude (where #v_f=0#) and double it to get the total flight time.

#v_f=v_i+a_yDeltat#

#=> Deltat=(v_f-v_i)/a_y#

#=>Deltat=(-v_i)/a_y#

We know that when an object is in free fall, the acceleration is equal to #-9.8m/s^2# (vertically).

Because the projectile is launched at an angle, we will need to break the velocity up into components. This can be done using basic trigonometry.

(Diagram: the initial velocity vector #v# at an angle #theta# above the horizontal, decomposed into horizontal and vertical components #v_x# and #v_y#.)
Here #v# is the initial velocity, and #v_x# and #v_y# are the horizontal and vertical components of the velocity, respectively. We can see that:

#v_x=vcos(theta)#
#v_y=vsin(theta)#

We only require the #y# component, since the flight time is governed by the vertical motion.

#v_y=5*sin((2pi)/3)=(5sqrt(3))/2m/s~~4.33m/s#
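As a quick numerical check of this decomposition, here is a minimal Python sketch (the variable names are just illustrative):

```python
import math

v = 5.0                    # launch speed, m/s
theta = 2 * math.pi / 3    # launch angle, radians

v_x = v * math.cos(theta)  # horizontal component
v_y = v * math.sin(theta)  # vertical component

print(round(v_y, 2))       # 4.33, i.e. (5*sqrt(3))/2 m/s
```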

We can now calculate the rise time of the projectile.

#Deltat=(-v_i)/a_y#

#Deltat=(-(5sqrt(3))/2m/s)/(-9.8m/s^2)#

#Deltat~~0.44s#

The total flight time is then #t~~2xx0.44s~~0.88s#.
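Putting the steps together, here is a short Python sketch of the same calculation (assuming equal launch and landing heights and #g=9.8m/s^2#):

```python
import math

g = 9.8                  # magnitude of gravitational acceleration, m/s^2
v = 5.0                  # launch speed, m/s
theta = 2 * math.pi / 3  # launch angle, radians

v_y = v * math.sin(theta)  # vertical velocity component
t_rise = v_y / g           # time to reach maximum altitude (v_f = 0)
t_total = 2 * t_rise       # symmetric flight: total time is twice the rise time

print(round(t_rise, 2))    # ~0.44 s
print(round(t_total, 2))   # ~0.88 s
```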