How do you show that one of the roots of #x^2+ax+b=0# is #a#, if and only if #b=0#?

1 Answer
Mar 27, 2018

First of all, the root in question would be #x=-a#, not #x=a#. Treat the equation as a full quadratic and solve it in both the #b ne 0# and the #b=0# cases to prove the claim.

Explanation:

Let's write out the quadratic formula first:

#x=(-b+-sqrt(b^2-4ac))/(2a)#

This assumes the equation has the form #ax^2+bx+c=0#. Note that the #a#, #b#, and #c# in the formula are different letters from the #a# and #b# in our equation; matching coefficients gives:

#a=1#
#b=a#
#c=b#

Substituting these into the formula:

#x=(-a+-sqrt(a^2-4b))/(2)#
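
If you want to double-check this algebra, here is a minimal sketch assuming Python with the sympy library (the tooling is my choice, not part of the original answer):

```python
from sympy import symbols, solve

x, a, b = symbols("x a b")

# Solve x^2 + a*x + b = 0 symbolically; the result should match
# (-a +- sqrt(a^2 - 4b)) / 2 from the quadratic formula above.
roots = solve(x**2 + a*x + b, x)
print(roots)  # [-a/2 - sqrt(a**2 - 4*b)/2, -a/2 + sqrt(a**2 - 4*b)/2]
```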

Now, if #b ne 0#, the zeroes are simply given by the equation above. If #b=0#, however, we can simplify further:

#x=(-a+-sqrt(a^2-4(0)))/(2)=(-a+-sqrt(a^2))/(2)#

#x=(-a+-a)/2#

#x={0,-a}#

So #-a# is indeed a root whenever #b=0#. Conversely, substituting #x=-a# into #x^2+ax+b# gives #(-a)^2+a(-a)+b=b#, so #-a# can be a root only if #b=0#. Together, these prove the "if and only if".
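
Both directions can also be verified symbolically; a short sketch, again assuming sympy:

```python
from sympy import symbols, solve

x, a, b = symbols("x a b")
f = x**2 + a*x + b

# "If" direction: with b = 0 the roots are exactly 0 and -a.
print(solve(f.subs(b, 0), x))  # [0, -a]

# "Only if" direction: plugging x = -a into the quadratic leaves
# just b, so -a is a root precisely when b = 0.
print(f.subs(x, -a))  # b
```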