I am on the moon where gravity is 1.6 m/s/s downward. If I dropped a brick from a height of 3 m, how long did it take to hit the surface?

1 Answer
Aug 4, 2017

#t = 1.94# #"s"#

Explanation:

We're asked to find the time it takes a dropped object to hit the surface, given its initial height and the downward acceleration.

To do this, we can use the equation

#y = y_0 + v_(0y)t - 1/2g t^2#

where

  • #y# is the final height (#0#, ground level)

  • #y_0# is the initial height (given as #3# #"m"#)

  • #v_(0y)# is the initial #y#-velocity (#0#, since the brick was dropped from rest)

  • #t# is the time (what we're trying to find)

  • #g# is the acceleration due to gravity (given as #1.6# #"m/s"^2#)
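
If you want to check the algebra before plugging in numbers, here is a small Python sketch (assuming SymPy is available; the variable names are just illustrative) that solves the same equation for #t# with #v_(0y) = 0#:

    from sympy import Rational, symbols, solve

    t, y0, v0y, g = symbols("t y0 v0y g", positive=True)

    # 0 = y0 + v0y*t - (1/2)*g*t^2, with v0y = 0 because the brick starts at rest
    equation = y0 + v0y*t - Rational(1, 2)*g*t**2
    print(solve(equation.subs(v0y, 0), t))  # positive root, equivalent to t = sqrt(2*y0/g)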

Plugging in known values, we have

#0 = 3" m" + (0)t - 1/2(1.6" m/s"^2)t^2#

#(0.8" m/s"^2)t^2 = 3" m"#

#t = sqrt((3" m")/(0.8" m/s"^2)) = color(red)(1.94" s")#
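
The arithmetic in this last step can be verified with a few lines of plain Python (standard library only; the names #g# and #y_0# are just illustrative):

    import math

    g = 1.6   # lunar surface gravity, m/s^2
    y0 = 3.0  # drop height, m

    t = math.sqrt(2 * y0 / g)  # t = sqrt(2*y0/g) for an object dropped from rest
    print(round(t, 2))  # 1.94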