How do you evaluate #\sqrt{\frac{x}{y}} + \sqrt{\frac{y}{x}} = \frac{10}{3}#?

1 Answer
Sep 28, 2017

Multiply by the product of the denominators and then square both sides.

Explanation:

Given: #sqrt(x/y)+sqrt(y/x) = 10/3#

It is implicit that x and y must both be nonzero and, for the square roots to be real, must have the same sign:

#sqrt(x/y)+sqrt(y/x) = 10/3; x!=0, y!=0#

Multiply both sides by #3sqrt(xy)#, noting that #sqrt(x/y)sqrt(xy) = x# and #sqrt(y/x)sqrt(xy) = y# when x and y are positive (the case where both are negative leads to the same equation after squaring):

#3(x+y) = 10sqrt(xy); x!=0, y!=0#

Square both sides:

#9(x^2 + 2xy+y^2) = 100xy; x!=0, y!=0#

#9x^2+18xy+9y^2 = 100xy;x!=0, y!=0#

#9x^2-82xy+9y^2 = 0;x!=0, y!=0#
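
If you want to double-check the algebra, here is a small sympy sketch (my addition, not part of the original answer; the positive=True assumption stands in for x and y having the same sign):

from sympy import symbols, sqrt, expand, simplify

x, y = symbols('x y', positive=True)
lhs = sqrt(x/y) + sqrt(y/x)

# Multiplying the left-hand side by 3*sqrt(x*y) gives 3*(x + y).
assert simplify(lhs * 3*sqrt(x*y) - 3*(x + y)) == 0

# Squaring 3*(x + y) = 10*sqrt(x*y) and collecting terms gives the quadratic above.
assert expand(9*(x + y)**2 - 100*x*y) == 9*x**2 - 82*x*y + 9*y**2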

The general Cartesian form for a conic section is:

#Ax^2+Bxy+Cy^2+Dx+Ey+F = 0#

But when #D = E = F = 0#, so that only the quadratic terms remain, the conic section degenerates into a single point, one repeated line, or two intersecting lines through the origin. In this case #9x^2-82xy+9y^2# factors as #(9x-y)(x-9y)#, so the equation describes two intersecting lines:
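
If you prefer not to factor by hand, sympy will recover the same factorisation (again my addition, not part of the original answer):

from sympy import symbols, factor

x, y = symbols('x y')
# Prints the product of two linear factors, i.e. (x - 9*y)*(9*x - y).
print(factor(9*x**2 - 82*x*y + 9*y**2))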

graph{9x^2-82xy+9y^2 = 0 [-10, 10, -5, 5]}

This is the graph of the last equation above, but you can obtain the same graph by graphing #y = 9x# and #y = x/9#, except that division by 0 in the original equation excludes the origin, where #x = 0#.
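
As a final sanity check (my addition; the sample points are arbitrary nonzero points on the two lines), substituting points from #y = 9x# and #y = x/9# back into the original equation does give #10/3#:

from sympy import sqrt, Rational, simplify

# Points on y = 9x and y = x/9, including a pair where both coordinates are negative.
for x_val, y_val in [(1, 9), (2, 18), (9, 1), (18, 2), (-1, -9)]:
    value = sqrt(Rational(x_val, y_val)) + sqrt(Rational(y_val, x_val))
    assert simplify(value - Rational(10, 3)) == 0, (x_val, y_val)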