Matrix Algebra

Sage is a powerful system for studying and exploring many different areas of mathematics. This page presents some topics from linear algebra that are needed for constructing solutions to systems of linear ordinary differential equations, along with some applications.

 

Special Matrices

There are many special types of matrices that are encountered frequently in engineering analysis. An important example is the identity matrix, given by

\[ {\bf I} = \left[ \begin{array}{cccc} 1&0& \cdots & 0 \\ 0&1& \cdots & 0 \\ \vdots& \vdots & \ddots & \vdots \\ 0&0& \cdots & 1 \end{array} \right] . \]
If it is necessary to indicate the number n of rows and columns in the (square) identity matrix, we attach a subscript: \( {\bf I}_n . \)
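Identity matrices are built into Sage; a minimal session (the 4-by-4 size and the matrix A below are arbitrary choices for illustration):

sage: identity_matrix(4)                  # the identity matrix I_4
[1 0 0 0]
[0 1 0 0]
[0 0 1 0]
[0 0 0 1]
sage: A = matrix(QQ, [[1, 2], [3, 4]])
sage: A * identity_matrix(2) == A         # I acts as the multiplicative identity
True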

A square matrix A is symmetric if \( {\bf A} = {\bf A}^{\mathrm T} .\) A square matrix A is self-adjoint if \( {\bf A} = {\bf A}^{\ast} ,\) where \( {\bf A}^{\ast} = \overline{\bf A}^{\mathrm T} \) is its adjoint (conjugate transpose). When all entries of the matrix A are real, \( {\bf A}^{\ast} = {\bf A}^{\mathrm T} . \) A square matrix A is skew-symmetric (also called antisymmetric) if \( {\bf A} = -{\bf A}^{\mathrm T} , \) so \( {\bf x}^{\mathrm T} {\bf A}\, {\bf x} = 0 \) for all real vectors x. If \( {\bf A} = ( a_{ij}) \) is skew-symmetric, then \( a_{ij} = -a_{ji} ; \) hence every main diagonal entry \( a_{ii} = 0 \) and the trace is zero. A skew-symmetric matrix is therefore determined by \( n(n-1)/2 \) scalars (the number of entries above the main diagonal), while a symmetric matrix is determined by \( n(n+1)/2 \) scalars (the number of entries on or above the main diagonal). A square matrix in which all the entries above the main diagonal are zero is called lower triangular, and a square matrix in which all the entries below the main diagonal are zero is called upper triangular.

Example. The following \( 3 \times 3 \) matrices are examples of symmetric and skew-symmetric matrices:

\[ \begin{bmatrix} 2&-3 &5 \\ -3& 7& 8 \\ 5&8& -3 \end{bmatrix} \qquad \mbox{and} \qquad \begin{bmatrix} 0&-3 &5 \\ 3& 0& -8 \\ -5&8&0 \end{bmatrix} . \]
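These properties are easy to confirm in Sage; one possible session with the two matrices above:

sage: S = matrix(QQ, [[2, -3, 5], [-3, 7, 8], [5, 8, -3]])
sage: V = matrix(QQ, [[0, -3, 5], [3, 0, -8], [-5, 8, 0]])
sage: S.is_symmetric()          # S equals its transpose
True
sage: V.is_skew_symmetric()     # V equals minus its transpose
True
sage: V.trace()                 # the diagonal entries of V are zero
0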

Let A be an \( n\times n \) skew-symmetric matrix. The determinant of A satisfies

\[ \det {\bf A} = \det {\bf A}^{\mathrm T} = \det \left( -{\bf A} \right) = (-1)^n \det {\bf A} . \]
In particular, if n is odd, the determinant vanishes. The nonzero eigenvalues of a real skew-symmetric matrix are purely imaginary.
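Both facts can be checked in Sage with the skew-symmetric matrix from the previous example, where n = 3 is odd:

sage: V = matrix(QQ, [[0, -3, 5], [3, 0, -8], [-5, 8, 0]])
sage: V.det()                   # odd order, so the determinant vanishes
0
sage: V.charpoly().factor()     # roots are 0 and ±7i√2: purely imaginary
x * (x^2 + 98)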

Theorem:

  • The transpose of a lower triangular matrix is upper triangular, and the transpose of an upper triangular matrix is lower triangular.
  • The product of lower triangular matrices is lower triangular, and the product of upper triangular matrices is upper triangular.
  • A triangular matrix is invertible if and only if its diagonal entries are all nonzero.
  • The inverse of an invertible lower triangular matrix is lower triangular, and the inverse of an invertible upper triangular matrix is upper triangular.
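A short Sage illustration of the last two statements (the lower triangular matrix L below is an arbitrary choice with nonzero diagonal entries):

sage: L = matrix(QQ, [[2, 0, 0], [1, 3, 0], [4, -1, 5]])
sage: L.is_invertible()
True
sage: Linv = L.inverse()
sage: all(Linv[i, j] == 0 for i in range(3) for j in range(3) if j > i)   # still lower triangular
True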

Theorem: Every square matrix A can be expressed uniquely as the sum of two matrices S and V, where \( {\bf S} = \frac{1}{2} \left( {\bf A} + {\bf A}^{\mathrm T} \right) \) is symmetric and \( {\bf V} = \frac{1}{2} \left( {\bf A} - {\bf A}^{\mathrm T} \right) \) is skew-symmetric.
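For example, in Sage (with an arbitrarily chosen matrix A):

sage: A = matrix(QQ, [[1, 2, 3], [4, 5, 6], [7, 8, 10]])
sage: S = (A + A.transpose()) / 2      # symmetric part
sage: V = (A - A.transpose()) / 2      # skew-symmetric part
sage: S.is_symmetric(), V.is_skew_symmetric(), S + V == A
(True, True, True)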

Theorem: The product of two symmetric matrices is symmetric if and only if the matrices commute.

Theorem: The product of a matrix and its transpose is symmetric.

Theorem: If A is an invertible matrix, then \( {\bf A}\, {\bf A}^{\mathrm T} \) and \( {\bf A}^{\mathrm T} {\bf A} \) are invertible symmetric matrices. ■

The following list collects further special types of matrices, with a defining property for each.

  • Band matrix: a square matrix whose nonzero entries are confined to a diagonal band.
  • Bidiagonal matrix: a band matrix with nonzero entries only on the main diagonal and either the superdiagonal or the subdiagonal.
  • Binary (Boolean, logical) matrix: a matrix whose entries are all either 0 or 1.
  • Defective matrix: a square matrix for which the geometric and algebraic multiplicities differ for at least one eigenvalue.
  • Diagonal matrix: a square matrix with all entries outside the main diagonal equal to zero.
  • Elementary matrix: a matrix obtained from an identity matrix by performing a single elementary row operation.
  • Hadamard matrix: a square matrix with entries +1 and -1 whose rows are mutually orthogonal.
  • Hermitian (self-adjoint) matrix: a square matrix that equals its conjugate transpose, \( {\bf A} = {\bf A}^{\ast} . \)
  • Hessenberg matrix: like a triangular matrix except that the entries adjacent to the main diagonal may be nonzero: \( a_{i,j} = 0 \) whenever \( i > j+1 \) (upper Hessenberg) or whenever \( i < j-1 \) (lower Hessenberg).
  • Hollow matrix: a square matrix whose main diagonal entries are all zero.
  • Idempotent (projection) matrix: \( {\bf P}^2 = {\bf P} . \)
  • Involutory matrix: \( {\bf A}^2 = {\bf I} , \) the identity matrix.
  • Markov (stochastic) matrix: a matrix of non-negative real numbers in which the entries of each row sum to 1.
  • Nilpotent matrix: \( {\bf P}^k = {\bf 0} \) for some positive integer k.
  • Normal matrix: \( {\bf A}^{\ast} {\bf A} = {\bf A}\,{\bf A}^{\ast} . \)
  • Orthogonal matrix: a real square matrix with \( {\bf A}^{\mathrm T} {\bf A} = {\bf I} , \) that is, \( {\bf A}^{-1} = {\bf A}^{\mathrm T} . \)
  • Pascal matrix: a matrix containing the entries of Pascal's triangle, \( a_{i,j} = \binom{i}{j} . \)
  • Permutation matrix: a matrix whose columns are a permutation of the columns of the identity matrix; then \( {\bf P}^{-1} = {\bf P}^{\mathrm T} . \)
  • Positive matrix: a real matrix all of whose entries are strictly positive.
  • Positive definite matrix: a symmetric matrix whose eigenvalues are all positive; equivalently, \( {\bf x}^{\mathrm T} {\bf A}\, {\bf x} > 0 \) for every nonzero real vector x.
  • Singular matrix: a square matrix that has no inverse; equivalently, \( \det {\bf A} = 0 . \)
  • Triangular matrix: a square matrix with all entries above the main diagonal equal to zero (lower triangular) or all entries below the main diagonal equal to zero (upper triangular).
  • Unitary matrix: a square matrix whose inverse equals its conjugate transpose, \( {\bf A}^{-1} = {\bf A}^{\ast} . \)
  • Vandermonde matrix: each column consists of the successive powers \( 1, a, a^2 , a^3 , \ldots \) of a scalar, with a different scalar for each column, so that \( v_{i,j} = a_j^{\,i-1} \) (the transpose of such a matrix is also called Vandermonde).
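Sage has built-in tests for several of these classes; a small sampler with arbitrarily chosen matrices:

sage: P = matrix(QQ, [[0, 1, 0], [0, 0, 1], [1, 0, 0]])   # a permutation matrix
sage: P.inverse() == P.transpose()
True
sage: N = matrix(QQ, [[0, 1], [0, 0]])
sage: N.is_nilpotent()                                    # N^2 = 0
True
sage: matrix(QQ, [[1, 2], [2, 4]]).is_singular()          # zero determinant, no inverse
True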

Example. The general \( n \times n \) Vandermonde matrix, named after Alexandre-Théophile Vandermonde (1735--1796), a French musician, mathematician, and chemist, has the form:

\[ {\bf V}_n = \left[ \begin{array}{ccccc} 1&1&1& \cdots & 1 \\ a_1 & a_2 & a_3 & \cdots & a_n \\ a_1^2 & a_2^2 & a_3^2 & \cdots & a_n^2 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_1^{n-1} & a_2^{n-1} & a_3^{n-1} & \cdots & a_n^{n-1} \end{array} \right] . \]
If \( a_1 , a_2 , \ldots , a_n \) are distinct numbers, then \( {\bf V}_n \) is invertible; its determinant satisfies the recurrence
\[ \det {\bf V}_n = \left( a_n - a_1 \right) \left( a_n - a_2 \right) \cdots \left( a_n - a_{n-1} \right) \det {\bf V}_{n-1} , \]
which leads to the closed form \( \det {\bf V}_n = \prod_{1 \le i < j \le n} \left( a_j - a_i \right) . \)
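The closed form is easy to test in Sage; below, \( {\bf V}_4 \) is built entry by entry following the display above, with arbitrarily chosen distinct nodes:

sage: a = [2, 3, 5, 7]
sage: n = len(a)
sage: V = matrix(QQ, n, n, lambda i, j: a[j]^i)    # row i holds the i-th powers
sage: V.det() == prod(a[j] - a[i] for i in range(n) for j in range(i + 1, n))
True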

Example. Let A be the 2-by-3 matrix

\[ {\bf A} = \begin{bmatrix} 1 & -2&4 \\ 3 & 0&-5 \end{bmatrix} \]
Then
\begin{align*} {\bf A} {\bf A}^{\mathrm T} &= \begin{bmatrix} 1 & -2&4 \\ 3 & 0&-5 \end{bmatrix} \begin{bmatrix} 1 & 3 \\ -2 & 0 \\ 4&-5 \end{bmatrix} = \begin{bmatrix} 21&-17 \\ -17 & 34 \end{bmatrix} , \\ {\bf A}^{\mathrm T} {\bf A} &= \begin{bmatrix} 1 & 3 \\ -2 & 0 \\ 4&-5 \end{bmatrix} \begin{bmatrix} 1 & -2&4 \\ 3 & 0&-5 \end{bmatrix} = \begin{bmatrix} 10&-2&-11 \\ -2&4&-8 \\ -11&-8&41 \end{bmatrix} . \end{align*}
Observe that \( {\bf A} {\bf A}^{\mathrm T} \) and \( {\bf A}^{\mathrm T} {\bf A} \) are symmetric matrices, as expected. Moreover, these two matrices share the same nonzero eigenvalues \( \frac{5}{2} \left( 11 \pm \sqrt{53} \right) , \) but \( {\bf A}^{\mathrm T} {\bf A} \) has the additional eigenvalue 0 because it is singular.
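Sage confirms these computations:

sage: A = matrix(QQ, [[1, -2, 4], [3, 0, -5]])
sage: B = A * A.transpose()
sage: C = A.transpose() * A
sage: B.is_symmetric() and C.is_symmetric()
True
sage: B.charpoly()                 # roots are (5/2)(11 ± √53)
x^2 - 55*x + 425
sage: C.charpoly().factor()        # same roots, plus the extra eigenvalue 0
x * (x^2 - 55*x + 425)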

Example. An example of an orthogonal matrix of second order is the following:

\[ {\bf A} = \begin{bmatrix} 0.6 & -0.8 \\ 0.8 & 0.6 \end{bmatrix} \qquad \Longrightarrow \qquad {\bf A}^{-1} = \begin{bmatrix} 0.6 & 0.8 \\ -0.8 & 0.6 \end{bmatrix} = {\bf A}^{\mathrm T} . \]

Example. Two examples of involutory matrices:

\[ \begin{bmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0&0& -1 \end{bmatrix} , \qquad \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0&1& 0 \end{bmatrix} . \]
The \( 2 \times 2 \) matrix \( \begin{bmatrix} a & b \\ c&-a \end{bmatrix} \) is involutory provided that \( a^2 +bc =1 . \)
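This condition follows from a one-line symbolic computation, which Sage carries out directly:

sage: var('a b c')
(a, b, c)
sage: M = matrix(SR, [[a, b], [c, -a]])
sage: M^2                          # equals (a^2 + bc) I, so M^2 = I iff a^2 + bc = 1
[a^2 + b*c         0]
[        0 a^2 + b*c]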

Theorem: If A is an \( n \times n \) matrix, then A is involutory if and only if \( \frac{1}{2} \left( {\bf I} + {\bf A} \right) \) is idempotent.

Example. The involutory matrix

\[ {\bf A} = \begin{bmatrix} 4 & -1 \\ 15 & -4 \end{bmatrix} \qquad \Longrightarrow \qquad {\bf B} = \frac{1}{2} \left( {\bf I} + {\bf A} \right) = \frac{1}{2} \begin{bmatrix} 5 & -1 \\ 15 & -3 \end{bmatrix} , \]
Here \( {\bf A}^2 = {\bf I} , \) and a direct computation confirms that \( {\bf B}^2 = {\bf B} , \) so B is indeed idempotent.
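Both claims are verified directly in Sage:

sage: A = matrix(QQ, [[4, -1], [15, -4]])
sage: A^2 == identity_matrix(2)              # A is involutory
True
sage: B = (identity_matrix(2) + A) / 2
sage: B^2 == B                               # B is idempotent
True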

Theorem: Let A be an \( n \times n \) orthogonal matrix, so that \( {\bf A}^{-1} = {\bf A}^{\mathrm T} . \)

  • For every \( {\bf v} \in \mathbb{R}^n \) we have \( \| {\bf v} \| = \| {\bf A}\, {\bf v} \| .\)
  • For every pair of vectors \( {\bf u}, \ {\bf v} \in \mathbb{R}^n , \) the angle between u and v equals the angle between Au and Av.
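Both statements can be tested in Sage, here with the orthogonal matrix from the earlier example (written in exact arithmetic) and two arbitrarily chosen vectors; comparing squared lengths and dot products keeps everything in rational arithmetic:

sage: A = matrix(QQ, [[3/5, -4/5], [4/5, 3/5]])
sage: u = vector(QQ, [1, 2]); v = vector(QQ, [3, -1])
sage: (A*v) * (A*v) == v * v         # squared lengths agree, so lengths are preserved
True
sage: (A*u) * (A*v) == u * v         # dot products, and hence angles, are preserved
True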

 

Applications