How do you factor x^3+x^2-x-1?

1 Answer
Apr 10, 2015

The result is x^3+x^2-x-1 = (x-1)·(x+1)^2

Here is the reasoning:

First, you apply Ruffini's Rule, trying the divisors of the independent (constant) term, here ±1, as candidate roots. The root r = 1 works, which gives the factor (x - 1) (remember that when you write r on the left of the Ruffini table, the corresponding divisor is (x - r), i.e. the sign is changed):

      | 1   1  -1  -1
    1 |     1   2   1
      ----------------
        1   2   1   0

By doing this we have obtained that
x^3+x^2-x-1 = (x-1)·(x^2+2x+1)
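
If you want to check this division mechanically, here is a minimal sketch of Ruffini's Rule (synthetic division) in plain Python; the function name ruffini and the coefficient lists are just illustrative, not from any library:

    # Ruffini's Rule (synthetic division): divide a polynomial, given by its
    # coefficients from highest to lowest degree, by the binomial (x - r).
    def ruffini(coeffs, r):
        row = [coeffs[0]]                # bring down the leading coefficient
        for c in coeffs[1:]:
            row.append(c + r * row[-1])  # multiply by r, add the next coefficient
        return row[:-1], row[-1]         # (quotient coefficients, remainder)

    # x^3 + x^2 - x - 1 divided by (x - 1):
    quotient, remainder = ruffini([1, 1, -1, -1], 1)
    print(quotient, remainder)           # [1, 2, 1] 0  ->  x^2 + 2x + 1, remainder 0

The remainder 0 confirms that (x - 1) divides the polynomial exactly.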

And now it is easy to see that x^2+2x+1 = (x+1)^2 (it is a perfect-square trinomial, one of the "notable products").
(If you don't spot that, you can always use the quadratic formula, x = (-b+-sqrt(b^2-4ac))/(2a); in this case it gives the single (double) root x = -1, which corresponds to the factor (x+1), squared because the root has multiplicity two.)
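
As a quick check of that fallback, here is the same quadratic-formula computation for x^2+2x+1 sketched in plain Python (standard library only):

    import math

    # Quadratic formula for x^2 + 2x + 1, i.e. a = 1, b = 2, c = 1.
    a, b, c = 1, 2, 1
    discriminant = b**2 - 4*a*c
    root = (-b + math.sqrt(discriminant)) / (2*a)
    print(discriminant, root)  # 0 -1.0  -> one double root x = -1, so the factor is (x + 1)^2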

So, to summarize, the final result is: x^3+x^2-x-1 = (x-1)·(x+1)^2
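
And, if you happen to have SymPy available, you can confirm the whole factorization directly (this check is just an optional extra, assuming SymPy is installed):

    from sympy import symbols, factor, expand

    x = symbols('x')
    print(factor(x**3 + x**2 - x - 1))  # (x - 1)*(x + 1)**2
    print(expand((x - 1)*(x + 1)**2))   # x**3 + x**2 - x - 1, expanding back as a check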