Question #cd938


1.53 s

Explanation:

We're (essentially) asked to find the time when the rock hits the ground, given it fell from a height of 11.5 m.

This rock is undergoing free fall; i.e., it is under the sole influence of Earth's gravitational force, falling toward Earth's surface with an acceleration of -g, which is -9.8 m/s^2.

To find the time t when the rock reaches a position of y = -11.5 m, we can use the equation

y = y_0 + v_{0y}t - (1/2)gt^2

where

  • y is the position at time t (-11.5 m),

  • y_0 is the initial position (0 m),

  • v_{0y} is the initial y-velocity, which is 0 since the rock was merely dropped,

  • t is the time, in s (what we must find), and

  • g is 9.8 m/s^2 (the minus sign in front of the 1/2 indicates that this acceleration is downward).

Plugging in known values, we have

-11.5 m = 0 m + (0 m/s)t - (4.9 m/s^2)t^2

(4.9 m/s^2)t^2 = 11.5 m

t^2 = (11.5 m)/(4.9 m/s^2)

t = sqrt((11.5 m)/(4.9 m/s^2)) = 1.53 s
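
As a quick numerical check, here is a minimal Python sketch of the same computation; the variable names (`g`, `h`, `t`) are just illustrative:

```python
from math import sqrt

g = 9.8   # magnitude of gravitational acceleration, m/s^2
h = 11.5  # depth of the well, m

# For an object dropped from rest, |y| = (1/2) g t^2,
# so the fall time is t = sqrt(2h / g).
t = sqrt(2 * h / g)
print(f"t = {t:.2f} s")  # t = 1.53 s
```

Note that 2h/g is just the t^2 = (11.5 m)/(4.9 m/s^2) line above with the factor of 2 folded back into the denominator.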

When dropped into a well with a depth of 11.5 meters, the rock will thus take 1.53 seconds to reach the bottom.