# MATLAB TUTORIAL, part 2.1: Matrix Roots

Square roots of matrices are defined in the customary manner:

${\bf R} = \sqrt{\bf A} = {\bf A}^{1/2} \qquad \Longrightarrow \qquad {\bf R}\, {\bf R} = {\bf A} .$
The square root of a matrix is not unique: most matrices have several different square roots, some even have infinitely many, while some nilpotent matrices have no square root at all. In fact, the $$n \times n$$ identity matrix has infinitely many square roots for $$n \ge 2.$$ Recall that an involutory matrix is a matrix that is its own inverse; that is, a matrix A is an involution if and only if $${\bf A}^2 = {\bf I}.$$ Every involutory matrix is therefore a square root of the identity matrix. A notable example of an involutory matrix is the Householder matrix (or Householder reflection) $${\bf P} = {\bf I} - 2\,{\bf v}\,{\bf v}^T ,$$ where v is an n-dimensional column vector of unit length and I is the identity matrix. It is named in honor of the American mathematician Alston S. Householder (1904--1993). In $$\mathbb{R}^2$$ the Householder matrix represents a reflection about the line through the origin that is orthogonal to v, and in $$\mathbb{R}^3$$ it represents a reflection about the plane through the origin that is orthogonal to v. We present some examples.
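The Householder construction is easy to check numerically. A minimal NumPy sketch (the vector v below is an arbitrary choice for illustration, not from the tutorial):

```python
import numpy as np

# Householder reflection P = I - 2 v v^T for a unit vector v.
# Any such P is involutory, hence a square root of the identity matrix.
v = np.array([1.0, 2.0, 2.0])
v = v / np.linalg.norm(v)            # normalize v to unit length
P = np.eye(3) - 2.0 * np.outer(v, v)

# P is its own inverse: P @ P = I
assert np.allclose(P @ P, np.eye(3))
# As a reflection, P is also symmetric and orthogonal.
assert np.allclose(P, P.T)
```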

Example: Each of the following matrices is involutory:

$\begin{bmatrix} 1&0&0 \\ 0&0&1 \\ 0&1&0 \end{bmatrix} , \qquad \begin{bmatrix} 1&0&0 \\ 0&-1&0 \\ 0&0&-1 \end{bmatrix} ,\qquad \frac{1}{3} \begin{bmatrix} 1&-2&-2 \\ -2&1&-2 \\ -2&-2&1 \end{bmatrix} .$

■
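A quick NumPy verification that each of the three matrices squares to the identity:

```python
import numpy as np

# The three involutory matrices from the example above.
M1 = np.array([[1, 0, 0], [0, 0, 1], [0, 1, 0]])
M2 = np.array([[1, 0, 0], [0, -1, 0], [0, 0, -1]])
M3 = np.array([[1, -2, -2], [-2, 1, -2], [-2, -2, 1]]) / 3

# Each satisfies M @ M = I, so each is a square root of the identity.
for M in (M1, M2, M3):
    assert np.allclose(M @ M, np.eye(3))
```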

Example: The following $$2 \times 2$$ matrices (the identity matrix and its negative) have infinitely many square roots, depending on two parameters (denoted a and b):

\begin{align*} {\bf I}^{1/2} &= \begin{bmatrix} 1&0 \\ 0& 1 \end{bmatrix}^{1/2} = \frac{1}{b} \begin{bmatrix} ab&b^2 \\ 1-a^2& -ab \end{bmatrix} , \quad b\ne 0.\; \\ {\bf I}^{1/2} &= \begin{bmatrix} 1&0 \\ 0& 1 \end{bmatrix}^{1/2} = \begin{bmatrix} 1&0 \\ a& -1 \end{bmatrix} , \\ \left( -{\bf I} \right)^{1/2} &= \begin{bmatrix} -1&0 \\ 0& -1 \end{bmatrix}^{1/2} = \frac{1}{b} \begin{bmatrix} ab&-b^2 \\ 1+a^2& -ab \end{bmatrix} , \quad b\ne 0. \end{align*}

■
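A spot-check of these parametric families in NumPy, for the sample values a = 3 and b = 2 (arbitrary choices):

```python
import numpy as np

a, b = 3.0, 2.0

# Two-parameter family of square roots of the identity matrix I.
R = np.array([[a * b, b**2], [1 - a**2, -a * b]]) / b
assert np.allclose(R @ R, np.eye(2))

# Real square root of -I: real entries, yet S @ S = -I.
S = np.array([[a * b, -b**2], [1 + a**2, -a * b]]) / b
assert np.allclose(S @ S, -np.eye(2))
```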

Example: The following $$2 \times 2$$ nilpotent matrices have no square root:

${\bf A} = \begin{bmatrix} 0&1 \\ 0& 0 \end{bmatrix} \qquad \mbox{and} \qquad {\bf B} = \begin{bmatrix} 1&1 \\ -1& -1 \end{bmatrix} .$
Both matrices are nilpotent: $${\bf A}^2 = {\bf 0} \quad\mbox{and}\quad {\bf B}^2 = {\bf 0} .$$ Each has the double eigenvalue $$\lambda =0$$ and zero trace. Nevertheless, we can define the following matrix functions:
${\bf \Phi}_A (t) = \cos \left( \sqrt{\bf A} \,t \right) , \quad {\bf \Phi}_B (t) = \cos \left( \sqrt{\bf B} \,t \right) ,\qquad \mbox{and} \qquad {\bf \Psi}_A (t) = \frac{\sin \left( \sqrt{\bf A} \,t \right)}{\sqrt{\bf A}} , \quad {\bf \Psi}_B (t) = \frac{\sin \left( \sqrt{\bf B} \,t \right)}{\sqrt{\bf B}} .$
To determine these functions, we first calculate the resolvents:
${\bf R}_{\lambda} ({\bf A}) = \left( \lambda{\bf I} - {\bf A} \right)^{-1} = \frac{1}{\lambda^2} \begin{bmatrix} \lambda&1 \\ 0 & \lambda \end{bmatrix} , \qquad {\bf R}_{\lambda} ({\bf B}) = \left( \lambda{\bf I} - {\bf B} \right)^{-1} = \frac{1}{\lambda^2} \begin{bmatrix} \lambda+1&1 \\ -1 & \lambda -1\end{bmatrix} .$
Then
\begin{align*} {\bf \Phi}_A (t) = \cos \left( \sqrt{\bf A} \,t \right) &= \left. \frac{{\text d}}{{\text d}\lambda} \left[ \cos \left( \sqrt{\lambda}\,t \right) \begin{bmatrix} \lambda & 1 \\ 0 &\lambda \end{bmatrix} \right] \right\vert_{\lambda =0} = \begin{bmatrix} 1 & -t^2 /2 \\ 0 &1 \end{bmatrix} , \\ {\bf \Phi}_B (t) = \cos \left( \sqrt{\bf B} \,t \right) &= \left. \frac{{\text d}}{{\text d}\lambda} \left[ \cos \left( \sqrt{\lambda}\,t \right) \begin{bmatrix} \lambda +1 & 1 \\ -1 &\lambda -1 \end{bmatrix} \right] \right\vert_{\lambda =0} = \begin{bmatrix} 1 - t^2 /2& -t^2 /2 \\ t^2 /2 &1 +t^2 /2 \end{bmatrix} , \\ {\bf \Psi}_A (t) = \frac{\sin \left( \sqrt{\bf A} \,t \right)}{\sqrt{\bf A}} &= \left. \frac{{\text d}}{{\text d}\lambda} \left[ \frac{\sin \left( \sqrt{\lambda}\,t \right)}{\sqrt{\lambda}} \begin{bmatrix} \lambda & 1 \\ 0 &\lambda \end{bmatrix} \right] \right\vert_{\lambda =0} = \begin{bmatrix} t & -t^3 /6 \\ 0 &t \end{bmatrix} , \\ {\bf \Psi}_B (t) = \frac{\sin \left( \sqrt{\bf B} \,t \right)}{\sqrt{\bf B}} &= \left. \frac{{\text d}}{{\text d}\lambda} \left[ \frac{\sin \left( \sqrt{\lambda}\,t \right)}{\sqrt{\lambda}} \begin{bmatrix} \lambda +1 & 1 \\ -1 &\lambda -1\end{bmatrix} \right] \right\vert_{\lambda =0} = \begin{bmatrix} t - t^3 /6 & -t^3 /6 \\ t^3 /6 &t + t^3 /6 \end{bmatrix} . \end{align*}
These matrix functions are solutions of the following initial value problems:
$\ddot{\bf \Phi}_A (t) + {\bf A} \, {\bf \Phi}_A = {\bf 0} , \quad {\bf \Phi}_A (0) = {\bf I}, \quad \dot{\bf \Phi}_A (0) = {\bf 0};$
$\ddot{\bf \Psi}_A (t) + {\bf A} \, {\bf \Psi}_A = {\bf 0} , \quad {\bf \Psi}_A (0) = {\bf 0}, \quad \dot{\bf \Psi}_A (0) = {\bf I};$
and similarly for matrix B.
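Since $${\bf B}^2 = {\bf 0},$$ the cosine series truncates to $${\bf \Phi}_B (t) = {\bf I} - \tfrac{t^2}{2}\,{\bf B},$$ which agrees with the matrix computed above. A NumPy sketch checking that this closed form solves the initial value problem:

```python
import numpy as np

B = np.array([[1.0, 1.0], [-1.0, -1.0]])
assert np.allclose(B @ B, np.zeros((2, 2)))   # B is nilpotent

def Phi(t):
    # cos(sqrt(B) t) = I - (t^2 / 2) B, since all higher powers of B vanish
    return np.eye(2) - 0.5 * t**2 * B

def Phi_dd(t):
    # second derivative of Phi with respect to t (constant in t)
    return -B

# Check the ODE  Phi'' + B Phi = 0  at a few sample times:
# B @ Phi(t) = B - (t^2/2) B^2 = B, so Phi'' + B Phi = -B + B = 0.
for t in (0.0, 0.5, 2.0):
    assert np.allclose(Phi_dd(t) + B @ Phi(t), np.zeros((2, 2)))

# Initial condition Phi(0) = I (and Phi'(0) = 0, since Phi' = -t B).
assert np.allclose(Phi(0.0), np.eye(2))
```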

Let us show, by contradiction, that matrix B has no square root over the field of real numbers. Suppose the opposite is true, so there exists a matrix $${\bf K} = \begin{bmatrix} a& b \\ c & d \end{bmatrix} ,$$ for some constants a, b, c, and d, such that $${\bf K}^2 = \begin{bmatrix} a^2 +bc& ab+bd \\ ac + cd& bc+d^2 \end{bmatrix} = {\bf B} = \begin{bmatrix} 1& 1 \\ -1 & -1 \end{bmatrix} .$$ From this matrix equation, we get four algebraic equations:

$a^2 + bc =1, \quad b(a+d) =1, \quad c(a+d) =-1, \quad bc+ d^2 =-1.$
Since $$a+d \ne 0$$ (otherwise $$b(a+d) = 0 \ne 1$$), adding the second and third equations gives $$c=-b,$$ and the matrix K becomes $${\bf K} = \begin{bmatrix} a& b \\ -b & d \end{bmatrix} .$$ So instead of four equations, we have three:
$a^2 - b^2 =1, \quad b(a+d) =1, \quad d^2 -b^2 =-1 .$
Since $$b^2 = a^2 -1 = d^2 +1 \ge 1,$$ and since $$-{\bf K}$$ is a square root of B whenever K is, we may assume $$b > 0$$; hence
$b= \sqrt{a^2 -1} , \qquad d= \pm\sqrt{a^2 -2} .$
Therefore $$a^2 \ge 2.$$ Moreover, $$a+d = 1/b > 0$$ while $$|d| < |a|,$$ so $$a \ge \sqrt{2} .$$ The equation $$b(a+d) =1$$ then becomes
$\sqrt{a^2 -1} \left( a \pm \sqrt{a^2 -2} \right) =1 ,$
which has no real solution: for $$a \ge \sqrt{2}$$ the left-hand side is at least $$\sqrt{2} > 1$$ with the plus sign, and squaring twice shows that it exceeds 1 with the minus sign as well. ■
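Numerical evidence for this final step, covering both sign choices $$d = \pm\sqrt{a^2-2}$$ (an illustrative check, not a proof):

```python
import numpy as np

# Sample a on [sqrt(2), 10] and evaluate both branches of
# f(a) = sqrt(a^2 - 1) * (a +/- sqrt(a^2 - 2)).
a = np.linspace(np.sqrt(2.0), 10.0, 100001)
plus = np.sqrt(a**2 - 1.0) * (a + np.sqrt(a**2 - 2.0))
minus = np.sqrt(a**2 - 1.0) * (a - np.sqrt(a**2 - 2.0))

# Neither branch ever takes the value 1 on this interval.
assert plus.min() > 1.0    # in fact at least sqrt(2)
assert minus.min() > 1.0   # approaches 1 from above but never reaches it
```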

Example: The following 3-by-3 (nilpotent) matrix

${\bf A} = \begin{bmatrix} 0&0&b \\ 0&0&0 \\ 0&0&0 \end{bmatrix} , \quad b\ne 0 ,$
has infinitely many square roots, depending on two parameters k and a:
${\bf A}^{1/2} = \begin{bmatrix} 0&a&k \\ 0&0&b/a \\ 0&0&0 \end{bmatrix} , \quad a\ne 0 .$
■
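A NumPy check of this family for the sample parameter values b = 5, a = 2, k = 7 (arbitrary choices):

```python
import numpy as np

b, a, k = 5.0, 2.0, 7.0

# The nilpotent matrix A and one member of its family of square roots.
A = np.array([[0.0, 0.0, b], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
R = np.array([[0.0, a, k], [0.0, 0.0, b / a], [0.0, 0.0, 0.0]])

# The only nonzero entry of R @ R is (1,3): a * (b/a) = b, so R @ R = A.
assert np.allclose(R @ R, A)
```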

Example: Consider the $$3 \times 3$$ matrix $${\bf A} = \begin{bmatrix} 1&4&16 \\ 18&20&4 \\ -12&-14&-7 \end{bmatrix}$$ that has three distinct eigenvalues:

A = {{1, 4, 16}, {18, 20, 4}, {-12, -14, -7}}
Eigenvalues[A]
Out[2]= {9, 4, 1}
Eigenvectors[A]
Out[3]= {{1, -2, 1}, {4, -5, 2}, {4, -4, 1}}
Using these eigenvectors as columns, we build the transition matrix:
${\bf S} = \begin{bmatrix} 1&4&4 \\ -2&-5&-4 \\ 1&2&1 \end{bmatrix} , \quad\mbox{with} \quad {\bf S}^{-1} = \begin{bmatrix} -3&-4&-4 \\ 2&3&4 \\ -1&-2&-3 \end{bmatrix} .$

Then we are ready to construct the eight square roots of this diagonalizable matrix:

$\sqrt{\bf A} = {\bf S} \sqrt{\Lambda} {\bf S}^{-1} = \begin{bmatrix} 1&4&4 \\ -2&-5&-4 \\ 1&2&1 \end{bmatrix} \begin{bmatrix} \pm 3&0&0 \\ 0&\pm 2&0 \\ 0&0&\pm 1 \end{bmatrix} \begin{bmatrix} -3&-4&-4 \\ 2&3&4 \\ -1&-2&-3 \end{bmatrix} ,$
with an appropriate choice of signs on the diagonal. In particular,
$\sqrt{\bf A} = \begin{bmatrix} 3&4&8 \\ 2&2&-4 \\ -2&-2&1 \end{bmatrix} , \quad \begin{bmatrix} 21&28&32 \\ -34&-46&-52 \\ 16&22&25 \end{bmatrix} , \quad \begin{bmatrix} -11&-20&-32 \\ 6&14&28 \\ 0&-2&-7 \end{bmatrix} , \quad \begin{bmatrix} 29&44&56 \\ -42&-62&-76 \\ 18&26&31 \end{bmatrix} .$
The other four square roots are the negatives of these four. ■
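The same construction in NumPy, using the S and S⁻¹ given above and iterating over all eight sign patterns:

```python
import numpy as np
from itertools import product

A = np.array([[1, 4, 16], [18, 20, 4], [-12, -14, -7]], dtype=float)
S = np.array([[1, 4, 4], [-2, -5, -4], [1, 2, 1]], dtype=float)
Sinv = np.linalg.inv(S)

# Build all eight square roots  S diag(+/-3, +/-2, +/-1) S^{-1}.
roots = []
for signs in product((1, -1), repeat=3):
    D = np.diag(np.array(signs) * np.array([3.0, 2.0, 1.0]))
    R = S @ D @ Sinv
    assert np.allclose(R @ R, A)   # every choice of signs yields a square root
    roots.append(R)

# The all-positive choice diag(3, 2, 1) reproduces the first root listed above.
assert np.allclose(roots[0], np.array([[3, 4, 8], [2, 2, -4], [-2, -2, 1]]))
```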

