# Question #ee74d

An object is released from a known height $h$ (feet) with initial velocity $u = 0$ (feet per second) at time $t = 0$ (seconds) and allowed to fall freely under the force of gravity, as shown in the figure below.
When it reaches the ground ($h = 0$), the time is noted; let it be $t$.
$h = u t + \frac{1}{2} g {t}^{2}$
Inserting the given values and solving for $g$, the acceleration due to gravity, we get
$h = 0 \times t + \frac{1}{2} g {t}^{2}$
$\implies g = \frac{2 h}{t^{2}}$ ${\text{feet per second}}^{2}$
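The rearrangement above can be sketched as a small function. This is a minimal illustration with assumed example values (a 64 ft drop and a 2 s fall time are not from the problem statement; they are chosen so the result comes out to the familiar $g \approx 32 \ \text{ft/s}^2$):

```python
def gravity_from_fall(h_feet: float, t_seconds: float) -> float:
    """Acceleration due to gravity (ft/s^2) from g = 2h / t^2,
    valid for an object dropped from rest (u = 0)."""
    if t_seconds <= 0:
        raise ValueError("fall time must be positive")
    return 2.0 * h_feet / t_seconds ** 2

# Hypothetical example values: a 64 ft drop taking 2 s.
print(gravity_from_fall(64.0, 2.0))  # 32.0
```

Note that the formula only holds because $u = 0$; for a nonzero initial velocity the full equation $h = ut + \frac{1}{2}gt^2$ would have to be solved instead.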