If the two equations x^2 + ax + b = 0 and x^2 + bx + a = 0 have a common root, show that the other roots are the roots of the equation x^2 + x + ab = 0.

1 Answer
Jan 12, 2018

Proved.

Explanation:

Let the roots of the equation #x^2 + ax + b = 0# be #alpha# and #beta#.

Similarly, let the roots of the equation #x^2 + bx + a = 0# be #alpha# and #gamma#, so that #alpha# is the common root.

So, #alpha + beta = -a rArr alpha = -(a + beta)#

And #alpha + gamma = -b rArr alpha = -(b + gamma)#

So, #-(a + beta) = -(b + gamma)#

#rArr a + beta = b + gamma#

#rArr beta - gamma = b - a#

#rArr (beta - gamma)^2 = b^2 - 2ab + a^2 = (b - a)^2# ............... (i)

Again, #alpha beta = b rArr beta = b/alpha#

And, #alpha gamma = a rArr gamma = a/alpha#

Then, #beta - gamma = (b - a)/alpha# and #beta gamma = (ab)/alpha^2#

Putting this into eq (i):

#((b - a)/alpha)^2 = (b - a)^2#

Since the two given equations are distinct, #a != b#, so we can divide both sides by #(b - a)^2#:

#rArr 1/alpha^2 = 1#

#rArr alpha^2 = 1 rArr alpha = +-1#

But #alpha = -1# would give #1 - a + b = 0# and #1 - b + a = 0# in the two original equations, and adding these yields #2 = 0#, a contradiction. Hence #alpha = 1#.
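As a quick illustrative sanity check (a minimal numerical sketch, not part of the proof: the values #a = 3# and #b = -4# are arbitrary choices for which the two quadratics really do share a root), computing the roots confirms that the common root is #1#:

```python
import numpy as np

# Illustrative values only (not from the problem statement):
# a = 3 and b = -4 are chosen so that the two quadratics share a root.
a, b = 3, -4

roots_first = np.roots([1, a, b])    # roots of x^2 + ax + b = 0  -> 1 and -4
roots_second = np.roots([1, b, a])   # roots of x^2 + bx + a = 0  -> 1 and 3

common = {round(float(r), 9) for r in roots_first} & {round(float(r), 9) for r in roots_second}
print(common)  # {1.0} -- the shared root alpha is indeed 1
```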

So, with #alpha = 1#: #beta = b# and #gamma = a#, which gives #beta gamma = ab# and #beta + gamma = a + b#

So, the required equation, which has #beta# and #gamma# as its roots, is

#x^2 - (beta + gamma)x + beta gamma = 0#, i.e. #x^2 - (a + b)x + ab = 0#

But #alpha = 1# is a root of #x^2 + ax + b = 0#, so #1 + a + b = 0#, i.e. #a + b = -1#

So, the equation becomes #x^2 - (-1)x + ab = 0#, i.e. #x^2 + x + ab = 0#

Hence Proved.
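For readers who want to double-check the algebra, here is a small symbolic sketch (illustrative only, using SymPy) that substitutes the condition #a + b = -1# derived above and confirms that #beta = b# and #gamma = a# satisfy #x^2 + x + ab = 0#:

```python
import sympy as sp

a, x = sp.symbols('a x')
b = -1 - a                      # the common-root condition a + b = -1 derived above

target = x**2 + x + a*b         # the equation the "other" roots should satisfy
beta, gamma = b, a              # the other roots found in the proof: beta = b, gamma = a

print(sp.simplify(target.subs(x, beta)))   # 0
print(sp.simplify(target.subs(x, gamma)))  # 0
```

Both substitutions simplify to zero, in agreement with the result proved above.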