A projectile is shot from the ground at a velocity of #52 m/s# and at an angle of #(pi)/2#. How long will it take for the projectile to land?

1 Answer
Jan 18, 2017

#t~~11s#

Explanation:

We are given the initial velocity of the projectile as well as the launch angle. We assume air resistance is negligible, so the only acceleration experienced by a simple projectile is that of gravity, which acts only in the vertical direction (i.e. downwards).

We would typically break the launch velocity up into parallel and perpendicular components given the launch angle, but this particular launch angle is #pi/2#, which is equivalent to #90^o#. This projectile is launched straight up! This simplifies the problem tremendously, because it means the initial velocity is entirely vertical; no trigonometry is needed.
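
As a quick check, here is a minimal Python sketch of the usual component decomposition (the function name `velocity_components` is purely illustrative). At #pi/2# the horizontal component vanishes and the full #52m/s# is vertical:

```python
import math

def velocity_components(speed, angle_rad):
    """Split a launch velocity into horizontal and vertical parts."""
    vx = speed * math.cos(angle_rad)  # component parallel to the ground
    vy = speed * math.sin(angle_rad)  # component perpendicular to the ground
    return vx, vy

vx, vy = velocity_components(52.0, math.pi / 2)
print(vx, vy)  # vx is ~0 (up to floating-point error), vy is 52.0
```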

We use kinematics. We have the acceleration #(-g)# and the initial velocity, and we want to find the time the projectile spends in the air. Assuming the launch and landing altitudes are equal, we can calculate the rise time of the projectile and then double it to get the total flight time.

Because the projectile momentarily stops at its maximum altitude before falling, the final velocity for the rise is #0#. This allows us to use the following kinematic equation to calculate the rise time:

#v_f=v_i+a_yDeltat#

Where #v_i# is the initial velocity, #v_f# is the final velocity, #a_y# is the acceleration, and #Deltat# is the rise time.

We first solve for #Deltat#:

#Deltat=(v_f-v_i)/(a_y)#

We know #v_f=0#, so:

#=>Deltat=(-v_i)/(a_y)#

Using our known values:

#Deltat=(-52m/s)/(-9.8m/s^2)=5.3s#

This is the time elapsed as the projectile goes from launch site to maximum altitude. The fall time will be the same, so we multiply by two.

#t_(tot)=2xxt_(rise)=2xx5.3s=10.6s#
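
For reference, the same arithmetic as a minimal Python sketch (variable names are illustrative):

```python
g = 9.8     # m/s^2, magnitude of the acceleration due to gravity
v_i = 52.0  # m/s, initial (entirely vertical) velocity

rise_time = v_i / g           # Deltat = (-v_i)/(a_y) with a_y = -g
total_time = 2 * rise_time    # fall time equals rise time here

print(round(rise_time, 1), round(total_time, 1))  # 5.3 10.6
```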

Thus, it will take #~~11s# for the projectile to land.

--

Note that the rise and fall times are equal only if the projectile lands at the same height it was launched from.
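
As a sketch of that more general case, here is a hypothetical Python helper that takes the positive root of #0=h+v_iDeltat-1/2g(Deltat)^2#, where #h# is the height of the launch point above the landing point (the helper `flight_time` is an assumption for illustration, not part of the original answer, and it assumes the landing point is at or below the launch point):

```python
import math

def flight_time(v_i, g=9.8, launch_height=0.0):
    """Positive root of 0 = h + v_i*t - (1/2)*g*t**2.

    launch_height (h) is how far the launch point sits above the
    landing point; it is assumed to be >= 0 here.
    """
    return (v_i + math.sqrt(v_i**2 + 2 * g * launch_height)) / g

print(round(flight_time(52.0), 1))                    # 10.6 -- matches the answer above
print(round(flight_time(52.0, launch_height=20), 1))  # 11.0 -- falling farther takes longer
```

With `launch_height=0` this reproduces the #~~11s# result above.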