How do you prove this theorem? Let #A# be an #n×n# matrix and #lambda# an eigenvalue of #A#. Show that #lambda+mu# is an eigenvalue of the matrix #M = A+muI#, where #I# is the #n × n# unit matrix.

1 Answer
Apr 24, 2018

If #lambda# is an eigenvalue of #bb(A)# with corresponding eigenvector #bb(ul v)# (which is nonzero by definition), then we know that (by definition)

# bb(A) bb(ul v) = lambda bb(ul v)# ..... [A]

Now, let us determine whether #lambda+mu# is an eigenvalue of the given matrix:

# bb(M) =bb(A)+mu bb(I) #

Consider:

# bb(M) bb(ul v) = (bb(A)+mu bb(I)) bb(ul v) #

# \ \ \ \ \ \ \ = bb(A)bb(ul v) + mu bb(I)bb(ul v) \ \ \ # (by linearity)

# \ \ \ \ \ \ \ = lambda bb(ul v) + mu bb(ul v) \ \ \ # (by [A] and the identity property)

# \ \ \ \ \ \ \ = (lambda + mu) bb(ul v) \ \ \ # (by linearity)

Since #bb(ul v) != bb(0)#, we have shown that #lambda+mu# is an eigenvalue of the matrix #bb(M)# with corresponding eigenvector #bb(ul v)#. QED.
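
As a quick numerical sanity check, here is a minimal numpy sketch (the matrix #A# and the shift #mu# below are arbitrary illustrative choices, not part of the original question): the eigenvalues of #M = A + muI# should be exactly the eigenvalues of #A# shifted by #mu#.

```python
import numpy as np

# Hypothetical example values: any square matrix A and scalar mu will do.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # upper triangular, so its eigenvalues are 2 and 3
mu = 5.0

M = A + mu * np.eye(2)       # M = A + mu*I

eig_A = np.sort(np.linalg.eigvals(A))
eig_M = np.sort(np.linalg.eigvals(M))

print(eig_A)                           # [2. 3.]
print(eig_M)                           # [7. 8.]
print(np.allclose(eig_M, eig_A + mu))  # True: each eigenvalue shifts by mu
```

Note that the eigenvectors are unchanged: #[1, 0]^T# is an eigenvector of #A# for #lambda = 2#, and the same vector is an eigenvector of #M# for #lambda + mu = 7#, just as the proof predicts.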