How do you show that one of the roots of x^2 + ax + b = 0 is a, if and only if b = 0?

1 Answer
Mar 27, 2018

First of all, the root would be x = -a, not a. Treat the equation as a full quadratic, then solve it both when b ≠ 0 and when b = 0 to prove the claim.

Explanation:

Let's write out the quadratic formula first:

x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}

This assumes the equation has the form ax^2 + bx + c = 0. In this scenario (note that the a, b, c of the general form are not the same symbols as the a and b in our equation):

a = 1 (the coefficient of x^2)
b = a (the general formula's b is our a)
c = b (the general formula's c is our b)

Filling out the equation:

x = \frac{-a \pm \sqrt{a^2 - 4b}}{2}
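As a quick sanity check (a sketch assuming SymPy is available; this is not part of the original argument), a computer algebra system reproduces exactly these roots:

```python
from sympy import symbols, solve

a, b, x = symbols('a b x')

# Solve x^2 + a*x + b = 0 symbolically; SymPy returns the two
# quadratic-formula roots (order may vary).
roots = solve(x**2 + a*x + b, x)
print(roots)  # [-a/2 - sqrt(a**2 - 4*b)/2, -a/2 + sqrt(a**2 - 4*b)/2]
```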

Now, if b ≠ 0, the zeroes are simply given by the equation above. However, if b = 0, we can simplify:

x = \frac{-a \pm \sqrt{a^2 - 4(0)}}{2} = \frac{-a \pm \sqrt{a^2}}{2}

x = \frac{-a \pm a}{2}

x = \{0, -a\}

So x = -a is indeed a root when b = 0. For the converse, substitute x = -a directly into the quadratic: (-a)^2 + a(-a) + b = a^2 - a^2 + b = b, which equals zero only if b = 0. This establishes both directions of the "if and only if".
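Both directions can also be checked symbolically (again a minimal sketch assuming SymPy, not part of the proof itself):

```python
from sympy import symbols, solve, simplify

a, b, x = symbols('a b x')

# Direction 1: if b = 0, the roots collapse to 0 and -a.
print(solve(x**2 + a*x, x))  # [0, -a] (order may vary)

# Direction 2: substituting x = -a leaves a remainder of b,
# so -a is a root exactly when b = 0.
print(simplify((x**2 + a*x + b).subs(x, -a)))  # b
```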