The time required to finish a test is normally distributed (i.e. it follows a Gaussian distribution) with a mean of 60 minutes and a standard deviation of 10 minutes. What is the probability that a student will finish the test in less than 70 minutes?

There is about an #84.13%# chance that a student finishes in less than #70# minutes.


Consider the Gaussian distribution function:

#f(t) = 1/sqrt(2pisigma^2) e^(-(t-t_"avg")^2//2sigma^2)#

You were given the average time and the standard deviation of the time:

  • #t_"avg" = "60 minutes"#
  • #sigma = "10 minutes"#

So now we have an explicit expression for #f(t)#:

#f(t) = 1/sqrt(200pi) e^(-(t-60)^2//200)#

Integrating this function over a range of times gives the area under the curve on that range, which is the probability that a student's finishing time falls in that range.
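
As a side note, here is a minimal sketch of this density in code (assuming NumPy and SciPy are available; this is purely illustrative and not part of the original working), with the normalization checked by integrating over all #t#:

    import numpy as np
    from scipy.integrate import quad

    t_avg = 60.0   # mean finishing time, minutes
    sigma = 10.0   # standard deviation, minutes

    def f(t):
        # Gaussian density: (1 / sqrt(2*pi*sigma^2)) * exp(-(t - t_avg)^2 / (2*sigma^2))
        return np.exp(-(t - t_avg)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

    # Sanity check: the total area under the curve is 1
    area, _ = quad(f, -np.inf, np.inf)
    print(area)  # ~1.0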

In other words,

#P = int_(0)^(t) f(t')dt'#

is the probability that one student will finish the test within #t# minutes.

Thus, to finish the test in less than #70# minutes, we consider the range #t in [0,70)# (strictly, a normal distribution extends down to #-oo#, but the area below #t = 0# is negligible here, since it lies six standard deviations below the mean):

#int_(0)^(70) f(t)dt = ???#
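
One way to see what this value should be (a standard reduction, not spelled out in the original working) is to standardize: substituting #z = (t - 60)//10#, the upper limit #t = 70# corresponds to #z = 1#, so the probability is essentially #Phi(1)#, the standard normal cumulative probability one standard deviation above the mean:

#int_(0)^(70) f(t)dt ~~ Phi(1) ~~ 0.8413#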

This integral has no elementary antiderivative, so evaluate it numerically with a calculator or Wolfram Alpha. And so:

#1/sqrt(200pi) int_(0)^(70) e^(-(t-60)^2//200)dt#

#= 0.841345#
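
For reference, the same number can be reproduced numerically (a minimal sketch assuming SciPy is available; the original answer used a calculator or Wolfram Alpha instead):

    from scipy.integrate import quad
    from scipy.stats import norm

    # Normal density with mean 60 minutes and standard deviation 10 minutes
    f = lambda t: norm.pdf(t, loc=60, scale=10)

    p, _ = quad(f, 0, 70)   # area under the curve from t = 0 to t = 70
    print(round(p, 6))      # 0.841345

    # Cross-check via the cumulative distribution function: P(T < 70) = Phi(1)
    print(round(norm.cdf(70, loc=60, scale=10), 6))  # 0.841345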

So, there is about a #color(blue)(84.13%)# chance that a student finishes in less than #70# minutes.