# Is matrix multiplication commutative?

Dec 11, 2014

In general, matrix multiplication is not commutative. There are some exceptions, however, most notably the identity matrices: the $n \times n$ matrices $I_n$ consisting of 1s along the main diagonal and 0s for all other entries, which act as the multiplicative identity for matrices, so $A I_n = I_n A = A$ for every $n \times n$ matrix $A$.

In general, when taking the product of two matrices $A$ and $B$, where $A$ has $m$ rows and $n$ columns and $B$ has $n$ rows and $p$ columns, the resulting matrix $AB$ has $m$ rows and $p$ columns. The multiplication cannot occur at all if the number of columns in $A$ does not equal the number of rows in $B$, so if $AB$ exists, the only way for $BA$ to exist is if $p = m$, making $A$ an $m \times n$ matrix and $B$ an $n \times m$ matrix. Even then, $AB$ is $m \times m$ while $BA$ is $n \times n$, so the two products cannot be equal unless $m = n$. And even when $A$ and $B$ are both square, it is still quite possible that $AB \ne BA$.
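To make the dimension bookkeeping above concrete, here is a small sketch in plain Python (not part of the original answer; `product_shape` is a hypothetical helper, with shapes written as `(rows, columns)` tuples):

```python
def product_shape(a_shape, b_shape):
    """Return the shape of the matrix product, or None if it is undefined."""
    m, n = a_shape  # A has m rows, n columns
    p, q = b_shape  # B has p rows, q columns
    if n != p:
        return None  # product undefined: inner dimensions must match
    return (m, q)    # result has m rows and q columns

# A is 2x3 and B is 3x4: AB exists and is 2x4 ...
print(product_shape((2, 3), (3, 4)))  # (2, 4)
# ... but BA does not exist, since 4 != 2.
print(product_shape((3, 4), (2, 3)))  # None
```

This mirrors the rule in the text: $AB$ requires the inner dimensions to agree, and $BA$ additionally requires $p = m$.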

Recall how each entry in the matrix product is determined. When matrices $A$ and $B$ are multiplied to form the matrix $AB$, each entry in the new matrix is formed from the entries of the old matrices. Specifically, to find the entry $(AB)_{ij}$ (that is, the entry in row $i$ and column $j$ of the matrix $AB$), we take the dot product of row $i$ of matrix $A$ with column $j$ of matrix $B$:

$(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$
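The sum above translates directly into code. A minimal sketch in plain Python (the helper `mat_entry` is illustrative and uses 0-based indices, unlike the 1-based subscripts in the formula):

```python
def mat_entry(A, B, i, j):
    """(AB)_{ij}: dot product of row i of A with column j of B (0-based)."""
    return sum(A[i][k] * B[k][j] for k in range(len(B)))

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

# Entry in row 0, column 0 of AB: 1*5 + 2*7 = 19
print(mat_entry(A, B, 0, 0))  # 19
```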

As an example, consider the $3 \times 3$ matrices $A$ and $B$, with
$A = \left(\begin{matrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{matrix}\right)$ and $B = \left(\begin{matrix} b_{11} & b_{12} & b_{13} \\ b_{21} & b_{22} & b_{23} \\ b_{31} & b_{32} & b_{33} \end{matrix}\right)$.
Then $(AB)_{2,3}$ is simply the dot product of the second row of $A$ and the third column of $B$:
$\left( a_{21} \; a_{22} \; a_{23} \right) \cdot \left(\begin{matrix} b_{13} \\ b_{23} \\ b_{33} \end{matrix}\right) = a_{21} b_{13} + a_{22} b_{23} + a_{23} b_{33}$

However, $(BA)_{2,3}$ would be the dot product of row 2 of matrix $B$ and column 3 of matrix $A$:
$\left( b_{21} \; b_{22} \; b_{23} \right) \cdot \left(\begin{matrix} a_{13} \\ a_{23} \\ a_{33} \end{matrix}\right) = b_{21} a_{13} + b_{22} a_{23} + b_{23} a_{33}$

If matrix multiplication were commutative, we would need $(BA)_{2,3} = (AB)_{2,3}$ (among other things). Since these two expressions involve entirely different products of entries and are in general not equal, we can safely conclude that matrix multiplication is not commutative in general.
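A concrete numerical check of the conclusion above, sketched in plain Python (the `matmul` helper is illustrative, not part of the original answer):

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows, using the dot-product rule."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2],
     [3, 4]]
B = [[0, 1],   # swaps the columns of whatever it multiplies on the right
     [1, 0]]

print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]]

# The two products differ, so A and B do not commute. By contrast, the
# identity matrix commutes with everything, as noted at the start:
I = [[1, 0],
     [0, 1]]
print(matmul(A, I) == matmul(I, A) == A)  # True
```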
