A baseball diamond is a square that is 90 feet on each side. What is the distance a catcher has to throw the ball from home plate to second base?

1 Answer
Jun 28, 2018

See a solution process below:

Explanation:

Because the baseball diamond is a square, the angles at the corners are right angles.

Therefore, the throw from home plate to second base forms the hypotenuse of a right triangle whose legs are the two 90-foot base paths.


Therefore, we can use the Pythagorean Theorem to find the distance between home plate and second base.

The Pythagorean Theorem states:

#c = sqrt(a^2 + b^2)# Where:

#c# is the length of the hypotenuse of the right triangle

#a# and #b# are the lengths of the legs of the right triangle

Substituting #90" ft"# for #a# and #b# and calculating #c# gives:

#c = sqrt((90" ft")^2 + (90" ft")^2)#

#c = sqrt(8100" ft"^2 + 8100" ft"^2)#

#c = sqrt(8100" ft"^2 xx 2)#

#c = sqrt(8100" ft"^2)sqrt(2)#

#c = 90sqrt(2)" ft"#

#c ~= 127" ft"#
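As a quick numerical check, here is a minimal Python sketch of the same calculation (the variable names are just illustrative):

```python
import math

# Distance from home plate to second base: the hypotenuse of a
# right triangle with two 90 ft legs (the base paths).
side = 90.0  # ft
distance = math.sqrt(side**2 + side**2)  # equivalently side * math.sqrt(2)

print(f"{distance:.2f} ft")  # prints: 127.28 ft
```

This agrees with the exact answer #90sqrt(2)" ft"#, which rounds to about #127" ft"#.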