An object is thrown vertically from a height of 14 m at 1 m/s. How long will it take for the object to hit the ground?

1 Answer
Apr 12, 2016

t = 1.59 s if the object is thrown downward
t = 1.79 s if the object is thrown upward

Explanation:

" if object is thrown downward:"

v_i = 1 m/s (initial speed)
y = 14 m (height above the ground)
g = 9.81 m/s^2

y = v_i*t + 1/2*g*t^2
14 = 1*t + 1/2*9.81*t^2

4.905*t^2 + t - 14 = 0

Delta = 1^2 + 4*4.905*14 = 1 + 274.68 = 275.68

sqrt(Delta) = 16.60

t = (-1 + 16.60)/(2*4.905)

t = 15.60/9.81

t = 1.59 s
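As a quick numerical check, here is a minimal Python sketch of the downward-throw case (variable names and g = 9.81 m/s^2 are taken from the working above; only the positive root of the quadratic is kept):

import math

# Downward throw: 14 = v_i*t + 0.5*g*t^2, rearranged to a*t^2 + b*t + c = 0
g = 9.81     # gravitational acceleration, m/s^2 (value assumed above)
v_i = 1.0    # initial downward speed, m/s
y = 14.0     # height above the ground, m

a = 0.5 * g  # 4.905
b = v_i      # 1
c = -y       # -14

disc = b**2 - 4 * a * c               # discriminant, 275.68
t = (-b + math.sqrt(disc)) / (2 * a)  # positive root only
print(round(t, 2))                    # prints 1.59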
"if object is thrown upward :"
t_u=v_i/g" "t_u=1/(9,81)" "t_u=0,10" s"
"elapsed time to reach peak point"
Extra height gained above the launch point:
h = v_i^2/(2*g) = 1/(2*9.81) = 0.05 m

Total height above the ground at the peak:
h_t = 14 + 0.05 = 14.05 m

Free fall from rest from the peak (fall time t_f):
14.05 = 1/2*g*t_f^2
28.1 = 9.81*t_f^2

t_f^2 = 28.1/9.81 = 2.86

t_f = 1.69 s

Total elapsed time = rise time + fall time:
t = t_u + t_f = 0.10 + 1.69 = 1.79 s
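A similar sketch for the upward throw, following the two-stage reasoning above (rise to the peak, then free fall from the total height). The second print is an extra cross-check, not part of the working above: it solves the single quadratic you get by treating the initial velocity as negative in the downward-positive equation.

import math

g = 9.81    # m/s^2 (value assumed above)
v_i = 1.0   # initial upward speed, m/s
y = 14.0    # launch height above the ground, m

# Stage 1: rise to the peak
t_u = v_i / g         # time to the peak, ~0.10 s
h = v_i**2 / (2 * g)  # extra height gained, ~0.05 m

# Stage 2: free fall from rest from the total height
h_t = y + h                   # ~14.05 m
t_f = math.sqrt(2 * h_t / g)  # ~1.69 s

print(round(t_u + t_f, 2))    # total elapsed time, prints 1.79

# Cross-check: with downward positive, y = -v_i*t + 0.5*g*t^2,
# i.e. 4.905*t^2 - 1*t - 14 = 0; keep the positive root.
t = (v_i + math.sqrt(v_i**2 + 4 * (0.5 * g) * y)) / (2 * 0.5 * g)
print(round(t, 2))            # also prints 1.79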