An object is thrown vertically from a height of #7# #m# at # 4# #ms^-1#. How long will it take for the object to hit the ground?

1 Answer
May 7, 2016

Assuming the object has initial upward velocity #-> t = 1.67# #s#
Assuming the object has initial downward velocity #-> t = 0.85# #s#

Explanation:

We have been given the following information:

The object is thrown with velocity #u = 4# #ms^-1# from a height of #h = 7# #m#. The direction of the throw is not specified, so let #t# be the time in seconds to reach the ground and consider both cases, applying the equation of motion as follows.

Assuming the object has initial upward velocity

  • #u = +4# #ms^-1 -> "upward taken +ve"#
  • #h = -7# #m -> "downward taken -ve"#
  • Acceleration due to gravity #g = -9.8# #ms^-2 -> "downward taken -ve"#

So the equation, #h=ut+1/2g t^2# becomes
#=>-7=4xxt-1/2xx9.8t^2#
#=>4.9t^2-4t-7=0#

#t=(-(-4)+sqrt((-4)^2-4xx4.9xx(-7)))/(2xx4.9)=1.67# #s# [negative value of #t# neglected]
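As a quick numerical check of this root, here is a minimal Python sketch; the names `a`, `b`, `c` are just the quadratic coefficients from the equation above, not anything given in the problem:

```python
import math

# Upward throw, taking up as positive:
# -7 = 4t - 4.9t^2  ->  4.9t^2 - 4t - 7 = 0
a, b, c = 4.9, -4.0, -7.0

# Quadratic formula; only the positive root is physical.
disc = b**2 - 4*a*c
t = (-b + math.sqrt(disc)) / (2*a)
print(round(t, 2))  # 1.67
```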

Assuming the object has initial downward velocity

  • #u = +4# #ms^-1 -> "downward taken +ve"#
  • #h = +7# #m -> "downward taken +ve"#
  • Acceleration due to gravity #g = +9.8# #ms^-2 -> "downward taken +ve"#

So the equation, #h=ut+1/2g t^2# becomes
#=>7=4xxt+1/2xx9.8t^2#
#=>4.9t^2+4t-7=0#

#t=(-4+sqrt(4^2-4xx4.9xx(-7)))/(2xx4.9)=0.85# #s# [negative value of #t# neglected]
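Both cases can also be verified under a single sign convention (down taken +ve): an upward throw is then just a negative initial velocity. In this minimal Python sketch, `time_to_ground` is a hypothetical helper introduced only for illustration:

```python
import math

def time_to_ground(u_down, h=7.0, g=9.8):
    """Time to fall h metres with initial downward velocity u_down,
    taking down as positive: h = u_down*t + (1/2)*g*t^2."""
    a, b, c = 0.5 * g, u_down, -h
    # Only the positive root of the quadratic is physical.
    return (-b + math.sqrt(b**2 - 4*a*c)) / (2*a)

print(round(time_to_ground(-4.0), 2))  # thrown upward:   1.67 s
print(round(time_to_ground(+4.0), 2))  # thrown downward: 0.85 s
```

Note that the upward throw takes longer because the object must first rise, stop, and fall back past its starting height before covering the 7 m drop.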