Understanding the gradient function

Key Questions

  • The gradient function, or at least the idea behind it, is vital for understanding calculus.

    When we're dealing with a linear graph, finding the gradient is simply a matter of calculating rise/run. In the real world, however, graphs don't always behave linearly, so we need a more accurate way of measuring the gradient. Even so, we still need the basic idea of that linear gradient to explore how exactly we find the gradient of a non-linear function at any point.

  • The gradient function is a precursor to the fundamental idea of a derivative.

    We know that the gradient of any function over an interval can be found by calculating rise/run, but real-world functions rarely follow straight lines, so a single rise/run taken over a wide interval can be a very poor approximation of the slope at a point.

    The idea is to shrink the "run" portion to the smallest size you can think of, and then even smaller. That way, you are calculating more and more precise gradients for the function at a certain point. Eventually, calculus provides the tools (limits) to shrink that interval all the way towards zero, but the gradient function is a good start.
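    That shrinking-interval idea can be sketched numerically. This is only an illustration: the function #x^2# and the point #x = 1# are example choices, not taken from the text above.

```python
# Approximate the gradient of f(x) = x^2 at x = 1 by shrinking the "run".
# The true slope there is 2, so the estimates should home in on 2.

def gradient(f, x, run):
    """Rise over run: (f(x + run) - f(x)) / run."""
    return (f(x + run) - f(x)) / run

f = lambda x: x ** 2

for run in [1.0, 0.1, 0.01, 0.001]:
    print(f"run = {run:>6}: gradient estimate = {gradient(f, 1.0, run)}")
```

    Each time the run shrinks by a factor of ten, the estimate moves noticeably closer to 2.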

  • The gradient function is a simple way of finding the slope of a function at any given point.

    Usually, for a straight-line graph, finding the slope is very easy. One simply divides the "rise" by the "run" - the amount a function goes "up" or "down" divided by the horizontal distance it covers over a certain interval. For a curved line, the technique is pretty similar - pick an interval, and calculate the amount of "rise" or "fall" the graph undergoes over this interval. We want to make the interval rather small, however - otherwise we can end up with some pretty funny values!

    Take, for instance, the function sin(x). We know that

    • #sin(0) = 0#
    • #sin(pi) = 0#

    If we were to calculate rise/run in this case, we'd get #(0-0)/(pi-0)#, giving us a slope of 0. But we know that's not the case, because the graph of sin(x) behaves very differently on that interval - it rises all the way to 1 and falls back down to 0! So we need to make the interval as small as possible in order to make the gradient function work for us.
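    This effect is easy to see numerically. A small sketch, using the interval endpoints from the example above; shrinking the interval towards #x = 0#, where the true slope is cos(0) = 1, is an illustrative choice.

```python
import math

def gradient(f, a, b):
    # Rise over run between x = a and x = b.
    return (f(b) - f(a)) / (b - a)

# Over the whole interval [0, pi], rise/run is (essentially) zero,
# exactly as the text warns:
print(gradient(math.sin, 0.0, math.pi))

# Shrinking the interval around x = 0 recovers the real slope there,
# which is cos(0) = 1:
for b in [1.0, 0.1, 0.001]:
    print(gradient(math.sin, 0.0, b))
```

    The wide-interval answer says the graph is flat, while the shrinking intervals reveal that sin(x) is actually climbing steeply at x = 0.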