A driver in a car traveling at a speed of 21.8 m/s sees a car 101 m away on the road. How long will it take for the car to uniformly decelerate to a complete stop in exactly 99 m?

1 Answer
Apr 5, 2018

#9.1 "s"#

Explanation:

Since the car decelerates uniformly, we can work with its average velocity over the stopping distance:

#(V_i + V_f)/2#

Letting #V_f = 0 "m/s"# (the car comes to a complete stop) and #V_i = 21.8 "m/s"#, the average velocity is #1/2*21.8 "m/s" = 10.9 "m/s"#.

For motion at this average velocity, #V = d/t#, and rearranging gives #t = d/V#.

Substituting the average velocity of #10.9 "m/s"# for #V# and #99 "m"# for #d#:

#t = (99 "m")/(10.9 "m/s")#

Finally we get #t ~~ 9.08 "s"#, which rounds to #9.1 "s"# with significant figures.
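If you want to double-check the arithmetic, here is a short Python sketch (the variable names are just illustrative) that solves the same problem with the standard constant-acceleration equations instead of the average-velocity shortcut:

```python
# Cross-check using the constant-acceleration (SUVAT) equations.
v_i = 21.8   # initial speed, m/s
v_f = 0.0    # final speed, m/s (complete stop)
d = 99.0     # stopping distance, m

# From v_f^2 = v_i^2 + 2*a*d, solve for the (negative) acceleration a.
a = (v_f**2 - v_i**2) / (2 * d)      # about -2.40 m/s^2

# From v_f = v_i + a*t, solve for the stopping time t.
t = (v_f - v_i) / a

print(f"deceleration:  {a:.2f} m/s^2")   # -2.40 m/s^2
print(f"stopping time: {t:.2f} s")       # 9.08 s, i.e. 9.1 s after rounding
```

Both routes give the same ~9.08 s, so the average-velocity shortcut checks out.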

Hope this helps!

Cheers