How do you factor #x^2+2x+3#?

2 Answers
Jun 11, 2016

There are no integer factors for this expression.

Explanation:

This quadratic has no factors with integer coefficients.

To factor #x^2+2x+3# over the integers we would need a pair of factors of #3# which add up to #2#. The only candidates are #1, 3# (sum #4#) and #-1, -3# (sum #-4#), so there are simply none.
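As a quick check (a minimal sketch using Python's sympy library, which is not part of the original answer), asking a computer algebra system to factor the expression over the rationals leaves it unchanged, confirming it is irreducible there:

```python
from sympy import symbols, factor

x = symbols('x')

# factor() works over the rationals by default; an irreducible
# quadratic comes back unchanged.
print(factor(x**2 + 2*x + 3))   # -> x**2 + 2*x + 3
```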

Jun 11, 2016

#x^2+2x+3=(x+1+isqrt2)(x+1-isqrt2)#

Explanation:

The zeros of #ax^2+bx+c# are given by the quadratic formula #(-b+-sqrt(b^2-4ac))/(2a)#. Such a quadratic can be factorized with rational coefficients only if the discriminant #b^2-4ac# is the square of a rational number.

In #x^2+2x+3#, the discriminant is #2^2-4*1*3=4-12=-8#, which is negative. So its zeros are two complex conjugate numbers given by the quadratic formula, i.e.

#(-2+-sqrt(2^2-4*1*3))/2# or

#(-2+-sqrt(-8))/2# or

#(-2+-2isqrt2)/2 = -1+-isqrt2#, i.e. #-1-isqrt2# and #-1+isqrt2#
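To illustrate, the same conjugate pair can be obtained symbolically (a minimal sketch using sympy's solve; any equivalent root finder would do):

```python
from sympy import symbols, solve

x = symbols('x')

# Solve x^2 + 2x + 3 = 0; the result is the conjugate pair -1 ± i*sqrt(2).
roots = solve(x**2 + 2*x + 3, x)
print(roots)   # the two roots -1 - sqrt(2)*I and -1 + sqrt(2)*I
```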

Now, if #alpha# and #beta# are the zeros of a monic quadratic polynomial, then it factorizes as #(x-alpha)(x-beta)#.

Hence the factors of #x^2+2x+3# are #(x+1+isqrt2)# and #(x+1-isqrt2)#, and

#x^2+2x+3=(x+1+isqrt2)(x+1-isqrt2)#
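One way to verify the result (a small check with sympy, not part of the original answer; any CAS could do the same) is to expand the product of the two factors and confirm it reproduces the original quadratic:

```python
from sympy import symbols, sqrt, I, expand

x = symbols('x')

# Multiply the two complex-conjugate factors back out;
# the imaginary cross terms cancel, leaving x^2 + 2x + 3.
product = expand((x + 1 + I*sqrt(2)) * (x + 1 - I*sqrt(2)))
print(product)                      # -> x**2 + 2*x + 3
print(product == x**2 + 2*x + 3)    # -> True
```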