Question #2e587
1 Answer
Explanation:
We're asked to find the time, in seconds, it takes an object to fall to Earth's surface from a height of #250# #"m"#.
To do this, we can use the kinematics equation
#ul(y = y_0 + v_(0y)t - 1/2g t^2)#
where

#y# is the height at time #t# (which is #0#, ground level)
#y_0# is the initial height (given as #250# #"m"#)
#v_(0y)# is the initial velocity (the object is dropped from rest, so this is #0#)
#t# is the time (what we're trying to find)
#g = 9.81# #"m/s"^2# is the acceleration due to gravity
Since the initial velocity is #0#, the equation becomes
#y = y_0 - 1/2g t^2#
Let's solve this for our unknown variable, #t#:
#y - y_0 = -1/2g t^2#
#2(y_0 - y) = g t^2#
#t^2 = (2(y_0 - y))/g#
#color(red)(t = sqrt((2(y_0 - y))/g))#
Plugging in known values:
#t = sqrt((2(250color(white)(l)"m" - 0))/(9.81color(white)(l)"m/s"^2)) = color(blue)(ulbar(stackrel(" ")(" "7.14color(white)(l)"s"" ")))#
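As a quick sanity check, the plug-in above can be reproduced in a few lines of Python (the variable names here are just for illustration):

```python
import math

y0 = 250.0  # initial height in meters (given)
y = 0.0     # final height: ground level
g = 9.81    # acceleration due to gravity, in m/s^2

# t = sqrt(2(y_0 - y) / g), the formula derived above
t = math.sqrt(2 * (y0 - y) / g)
print(f"{t:.2f} s")  # prints "7.14 s"
```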
So ultimately, if you're ever given a situation where you're asked to find the time it takes an object to fall a certain distance (with no initial vertical velocity), you can use the equation
#color(red)(t = sqrt((2*"height")/g))#
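If it helps, that boxed formula also generalizes nicely to a small reusable function (a sketch; the name `fall_time` is just illustrative, not from the original answer):

```python
import math

def fall_time(height, g=9.81):
    """Time in seconds for an object dropped from rest to fall `height` meters.

    Assumes no air resistance and constant g (default: Earth's 9.81 m/s^2).
    """
    if height < 0:
        raise ValueError("height must be non-negative")
    return math.sqrt(2 * height / g)

# The 250 m drop from this problem:
print(round(fall_time(250), 2))  # prints 7.14
```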