How do you factor #((1, 2, -3), (0, 1, 3), (0, 0, 1))# into a product of elementary matrices?
1 Answer
Jan 6, 2017
#((1, 2, -3),(0, 1, 3),(0, 0, 1)) = ((1, 2, 0),(0, 1, 0),(0, 0, 1))((1, 0, -9),(0, 1, 0),(0, 0, 1))((1, 0, 0),(0, 1, 3),(0, 0, 1))#
Explanation:
Given:
#((1, 2, -3),(0, 1, 3),(0, 0, 1))#
We can describe the process of reducing this matrix to the identity matrix as follows:
(1) Subtract #2# times row #2# from row #1# to get:
#((1, 0, -9),(0, 1, 3),(0, 0, 1))#
(2) Add #9# times row #3# to row #1# to get:
#((1, 0, 0),(0, 1, 3),(0, 0, 1))#
(3) Subtract #3# times row #3# from row #2# to get the identity matrix:
#((1, 0, 0),(0, 1, 0),(0, 0, 1))#
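To make the reduction concrete, here is a minimal NumPy sketch (NumPy and the names `E1`, `E2`, `E3` are my own choices for illustration, not part of the original answer) that applies each step as a left-multiplication by the corresponding elementary matrix:

```python
import numpy as np

A = np.array([[1, 2, -3],
              [0, 1,  3],
              [0, 0,  1]])

# Row operations (1)-(3) as elementary matrices; left-multiplying applies them.
E1 = np.array([[1, -2, 0], [0, 1, 0], [0, 0, 1]])   # R1 -> R1 - 2*R2
E2 = np.array([[1, 0, 9], [0, 1, 0], [0, 0, 1]])    # R1 -> R1 + 9*R3
E3 = np.array([[1, 0, 0], [0, 1, -3], [0, 0, 1]])   # R2 -> R2 - 3*R3

step1 = E1 @ A       # [[1, 0, -9], [0, 1, 3], [0, 0, 1]]
step2 = E2 @ step1   # [[1, 0,  0], [0, 1, 3], [0, 0, 1]]
step3 = E3 @ step2   # the identity matrix
print(step3)
```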
Reversing these steps, that is, applying the inverse of each row operation in the opposite order and writing each inverse as an elementary matrix, we arrive at the following product:
#((1, 2, 0),(0, 1, 0),(0, 0, 1))((1, 0, -9),(0, 1, 0),(0, 0, 1))((1, 0, 0),(0, 1, 3),(0, 0, 1))#
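As a quick check (again just a sketch assuming NumPy, with the hypothetical names `F1`, `F2`, `F3`), multiplying these three elementary matrices in the given order recovers the original matrix:

```python
import numpy as np

# Inverses of operations (1)-(3), written as elementary matrices
# and multiplied in reverse order of the reduction.
F1 = np.array([[1, 2, 0], [0, 1, 0], [0, 0, 1]])    # R1 -> R1 + 2*R2
F2 = np.array([[1, 0, -9], [0, 1, 0], [0, 0, 1]])   # R1 -> R1 - 9*R3
F3 = np.array([[1, 0, 0], [0, 1, 3], [0, 0, 1]])    # R2 -> R2 + 3*R3

print(F1 @ F2 @ F3)  # [[1, 2, -3], [0, 1, 3], [0, 0, 1]], the original matrix
```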