Preface


This tutorial was made solely for educational purposes and was designed for students taking Applied Math 0340. It is primarily for students who have some experience using Mathematica. If you have never used Mathematica before and would like to learn the basics of this computer algebra system, it is strongly recommended that you look at the APMA 0330 tutorial. As a friendly reminder, don't forget to clear the variables in use and/or the kernel.

Finally, the commands in this tutorial are all written in bold black font, while Mathematica output is in normal font. This means that you can copy and paste all commands into Mathematica, change the parameters, and run them. You, as the user, are free to use the scripts for your own needs while learning Mathematica, and you may distribute and refer to this tutorial as long as it is credited appropriately.


Special Matrices


There are many special types of matrices that are encountered frequently in engineering analysis. An important example is the identity matrix, given by

\[ {\bf I} = \left[ \begin{array}{cccc} 1&0& \cdots & 0 \\ 0&1& \cdots & 0 \\ \vdots& \vdots & \ddots & \vdots \\ 0&0& \cdots & 1 \end{array} \right] . \]
If it is necessary to identify the number n of rows (or columns) in the square identity matrix, we attach a subscript: \( {\bf I}_n . \)
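In Mathematica, an identity matrix of any size is generated with the built-in command IdentityMatrix. Note that the single letter I cannot be used as a matrix name, since Mathematica reserves it for the imaginary unit.

```mathematica
Id4 = IdentityMatrix[4];   (* the 4-by-4 identity matrix I_4 *)
Id4 // MatrixForm          (* display in traditional two-dimensional form *)
```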

A square matrix A is symmetric if \( {\bf A} = {\bf A}^{\mathrm T} . \) A square matrix A is self-adjoint if \( {\bf A} = {\bf A}^{\ast} , \) where \( {\bf A}^{\ast} = \overline{\bf A}^{\mathrm T} \) is the adjoint (conjugate transpose) matrix. When all entries of the matrix A are real, \( {\bf A}^{\ast} = {\bf A}^{\mathrm T} . \) A square matrix A is skew-symmetric (also called antisymmetric) if \( {\bf A} = -{\bf A}^{\mathrm T} , \) which implies \( {\bf x}^{\mathrm T} {\bf A}\, {\bf x} = 0 \) for all real x. If \( {\bf A} = ( a_{ij}) \) is skew-symmetric, then \( a_{ij} = -a_{ji} ; \) hence \( a_{ii} = 0 , \) so all main diagonal entries of a skew-symmetric matrix are zero and its trace is zero. A skew-symmetric matrix is determined by \( n(n - 1)/2 \) scalars (the number of entries above the main diagonal); a symmetric matrix is determined by \( n(n + 1)/2 \) scalars (the number of entries on or above the main diagonal).

Example. The following 3 by 3 matrices are examples of symmetric and skew-symmetric matrices, respectively:

\[ \begin{bmatrix} 2&-3 &5 \\ -3& 7& 8 \\ 5&8& -3 \end{bmatrix} \qquad \mbox{and} \qquad \begin{bmatrix} 0&-3 &5 \\ 3& 0& -8 \\ -5&8&0 \end{bmatrix} . \]
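These two matrices can be checked in Mathematica with the built-in predicates SymmetricMatrixQ and AntisymmetricMatrixQ:

```mathematica
S = {{2, -3, 5}, {-3, 7, 8}, {5, 8, -3}};
K = {{0, -3, 5}, {3, 0, -8}, {-5, 8, 0}};
SymmetricMatrixQ[S]       (* True *)
AntisymmetricMatrixQ[K]   (* True *)
Transpose[K] == -K        (* True: the definition of skew-symmetry *)
```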

Let A be an \( n\times n \) skew-symmetric matrix. The determinant of A satisfies

\[ \det {\bf A} = \det {\bf A}^{\mathrm T} = \det \left( -{\bf A} \right) = (-1)^n \det {\bf A} . \]
In particular, if n is odd, the determinant vanishes. The nonzero eigenvalues of a real skew-symmetric matrix are purely imaginary.
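Both facts can be confirmed in Mathematica for the 3 by 3 skew-symmetric matrix from the example above:

```mathematica
K = {{0, -3, 5}, {3, 0, -8}, {-5, 8, 0}};
Det[K]           (* 0, because n = 3 is odd *)
Eigenvalues[K]   (* 0 together with the purely imaginary pair ± I Sqrt[98] *)
```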

Theorem: Every square matrix A can be expressed uniquely as the sum \( {\bf A} = {\bf S} + {\bf V} \) of two matrices S and V, where \( {\bf S} = \frac{1}{2} \left( {\bf A} + {\bf A}^{\mathrm T} \right) \) is symmetric and \( {\bf V} = \frac{1}{2} \left( {\bf A} - {\bf A}^{\mathrm T} \right) \) is skew-symmetric.
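The decomposition in the theorem is straightforward to compute; here is a sketch in Mathematica, using an arbitrarily chosen 3 by 3 matrix purely as an illustration:

```mathematica
A = {{1, 2, 3}, {4, 5, 6}, {7, 8, 10}};
S = (A + Transpose[A])/2;   (* symmetric part *)
V = (A - Transpose[A])/2;   (* skew-symmetric part *)
SymmetricMatrixQ[S]       (* True *)
AntisymmetricMatrixQ[V]   (* True *)
S + V == A                (* True: the two parts recover A *)
```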

Some other frequently used special matrices, with brief descriptions:

Band matrix: a square matrix whose non-zero entries are confined to a diagonal band around the main diagonal.
Bidiagonal matrix: a band matrix with non-zero elements only on the main diagonal and on either the superdiagonal or the subdiagonal.
Binary (Boolean) matrix: a matrix whose entries are all either 0 or 1.
Defective matrix: a square matrix for which the geometric and algebraic multiplicities differ for at least one eigenvalue.
Diagonal matrix: a square matrix with all entries outside the main diagonal equal to zero.
Elementary matrix: a matrix obtained from an identity matrix by performing a single elementary row operation.
Hadamard matrix: a square matrix with entries +1 and −1 whose rows are mutually orthogonal.
Hermitian (self-adjoint) matrix: a square matrix that equals its conjugate transpose: \( {\bf A} = {\bf A}^{\ast} . \)
Hessenberg matrix: like a triangular matrix except that the entries adjacent to the main diagonal may be non-zero: \( a_{ij} = 0 \) whenever \( i > j+1 \) (upper Hessenberg) or whenever \( i < j-1 \) (lower Hessenberg).
Hollow matrix: a square matrix whose main diagonal consists entirely of zeros.
Idempotent (projection) matrix: \( {\bf P}^2 = {\bf P} . \)
Logical matrix: a matrix with all entries either 0 or 1.
Markov (stochastic) matrix: a matrix of non-negative real numbers whose entries in each row sum to 1.
Nilpotent matrix: \( {\bf A}^k = {\bf 0} \) for some positive integer k.
Normal matrix: \( {\bf A}^{\ast} {\bf A} = {\bf A}\,{\bf A}^{\ast} . \)
Orthogonal matrix: a real square matrix with \( {\bf A}^{\mathrm T} \, {\bf A} = {\bf I} , \) so that \( {\bf A}^{-1} = {\bf A}^{\mathrm T} . \)
Pascal matrix: a matrix containing the entries of Pascal's triangle: \( a_{i,j} = \binom{i}{j} . \)
Permutation matrix: a matrix whose columns are a permutation of the columns of the identity matrix; \( {\bf P}^{-1} = {\bf P}^{\mathrm T} . \)
Positive matrix: a real matrix all of whose entries are strictly positive.
Positive definite matrix: a symmetric matrix with all eigenvalues positive; equivalently, \( {\bf x}^{\mathrm T} \, {\bf A}\, {\bf x} > 0 \) for every nonzero real vector x.
Singular matrix: a square matrix that has no inverse; equivalently, \( \det {\bf A} = 0 . \)
Triangular matrix: a matrix with all entries above the main diagonal equal to zero (lower triangular) or with all entries below the main diagonal equal to zero (upper triangular).
Unitary matrix: a square matrix whose inverse equals its conjugate transpose: \( {\bf A}^{-1} = {\bf A}^{\ast} . \)
Vandermonde matrix: each column consists of successive powers \( 1, a, a^2 , a^3 , \ldots \) of a parameter, with a different parameter for each column: \( v_{i,j} = a_j^{\,i-1} \) (some authors use the transposed convention).

Example. The general \( n \times n \) Vandermonde matrix (named after Alexandre-Théophile Vandermonde (1735--1796) who was a French musician, mathematician, and chemist) has the form:

\[ {\bf V}_n = \left[ \begin{array}{ccccc} 1&1&1& \cdots & 1 \\ a_1 & a_2 & a_3 & \cdots & a_n \\ a_1^2 & a_2^2 & a_3^2 & \cdots & a_n^2 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_1^{n-1} & a_2^{n-1} & a_3^{n-1} & \cdots & a_n^{n-1} \end{array} \right] . \]
If \( a_1 , a_2 , \ldots , a_n \) are distinct real numbers, then its determinant satisfies the recursion
\[ \det {\bf V}_n = \left( a_n - a_1 \right) \left( a_n - a_2 \right) \cdots \left( a_n - a_{n-1} \right) \det {\bf V}_{n-1} , \qquad \det {\bf V}_1 = 1 , \]
so that \( \det {\bf V}_n = \prod_{1 \le i < j \le n} \left( a_j - a_i \right) \ne 0 . \)
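The Vandermonde determinant can be verified symbolically in Mathematica for a small case; here the parameters are the indexed symbols a[1], a[2], a[3]:

```mathematica
V = Table[a[j]^(i - 1), {i, 3}, {j, 3}];   (* 3-by-3 Vandermonde matrix *)
V // MatrixForm
Factor[Det[V]]   (* factors into a product of differences a[j] - a[i] *)
```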

Example. An example of an orthogonal matrix of second order is the following:

\[ {\bf A} = \begin{bmatrix} 0.6 & -0.8 \\ 0.8 & 0.6 \end{bmatrix} \qquad \Longrightarrow \qquad {\bf A}^{-1} = \begin{bmatrix} 0.6 & 0.8 \\ -0.8 & 0.6 \end{bmatrix} = {\bf A}^{\mathrm T} . \]
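This can be confirmed in Mathematica; using the exact rationals 3/5 = 0.6 and 4/5 = 0.8 avoids any floating-point round-off in the comparisons:

```mathematica
A = {{3/5, -4/5}, {4/5, 3/5}};
Transpose[A].A == IdentityMatrix[2]   (* True *)
Inverse[A] == Transpose[A]            (* True *)
OrthogonalMatrixQ[A]                  (* True *)
```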
