Sage is a powerful system for studying and exploring many different areas of mathematics. This page presents some topics from Linear Algebra needed for construction of solutions to systems of linear ordinary differential equations and some applications.
There are many special types of matrices that are encountered frequently in engineering analysis. An important example is the identity matrix, whose main diagonal entries are all 1 and whose remaining entries are all 0.
A square matrix A is symmetric if \( {\bf A} = {\bf A}^{\mathrm T} .\) A square matrix A is self-adjoint if \( {\bf A} = {\bf A}^{\ast} ,\) where \( {\bf A}^{\ast} = \overline{\bf A}^{\mathrm T} \) is the adjoint matrix. When all entries of the matrix A are real, \( {\bf A}^{\ast} = {\bf A}^{\mathrm T} . \) A matrix A is skew-symmetric (also called antisymmetric) if \( {\bf A} = -{\bf A}^{\mathrm T} , \) so \( {\bf x}^{\mathrm T} {\bf A}\, {\bf x} = 0 \) for all real x. All main diagonal entries of a skew-symmetric matrix must be zero, so the trace is zero: if \( {\bf A} = ( a_{ij}) \) is skew-symmetric, then \( a_{ij} = -a_{ji}, \) and hence \( a_{ii} = 0. \)

A skew-symmetric matrix is determined by \( n(n-1)/2 \) scalars (the number of entries above the main diagonal); a symmetric matrix is determined by \( n(n+1)/2 \) scalars (the number of entries on or above the main diagonal). A square matrix in which all the entries above the main diagonal are zero is called lower triangular, and a square matrix in which all the entries below the main diagonal are zero is called upper triangular.
Example. The following 3 by 3 matrices are examples of symmetric and skew-symmetric matrices:
Let A be an \( n\times n \) skew-symmetric matrix. The determinant of A satisfies \( \det {\bf A} = \det {\bf A}^{\mathrm T} = \det \left( -{\bf A} \right) = (-1)^n \det {\bf A} ;\) hence, when n is odd, \( \det {\bf A} = 0. \)
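This can be checked numerically. The sketch below, in plain Python (it also runs in Sage), uses an illustrative 3 × 3 skew-symmetric matrix with entries chosen arbitrarily; the cofactor-expansion determinant comes out to zero, as predicted for odd n.

```python
def det3(m):
    # Cofactor expansion of a 3x3 determinant along the first row.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

# A skew-symmetric matrix of odd order: A[i][j] == -A[j][i], zeros on the diagonal.
A = [[0,  2, -5],
     [-2, 0,  7],
     [5, -7,  0]]

print(det3(A))   # 0
```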
Theorem: Every square matrix A can be expressed uniquely as the sum of two matrices S and V, where \( {\bf S} = \frac{1}{2} \left( {\bf A} + {\bf A}^{\mathrm T} \right) \) is symmetric and \( {\bf V} = \frac{1}{2} \left( {\bf A} - {\bf A}^{\mathrm T} \right) \) is skew-symmetric.
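The decomposition is easy to verify computationally. The sketch below uses an arbitrary 3 × 3 matrix (entries chosen only for illustration) and `fractions.Fraction` to keep the halves exact:

```python
from fractions import Fraction

def transpose(m):
    return [list(row) for row in zip(*m)]

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]          # illustrative entries
n = len(A)
At = transpose(A)

# S = (A + A^T)/2 is the symmetric part, V = (A - A^T)/2 the skew-symmetric part.
S = [[Fraction(A[i][j] + At[i][j], 2) for j in range(n)] for i in range(n)]
V = [[Fraction(A[i][j] - At[i][j], 2) for j in range(n)] for i in range(n)]

assert S == transpose(S)                                   # S is symmetric
assert V == [[-x for x in row] for row in transpose(V)]    # V is skew-symmetric
assert all(S[i][j] + V[i][j] == A[i][j]
           for i in range(n) for j in range(n))            # A = S + V
```

Uniqueness follows because any decomposition A = S + V with S symmetric and V skew-symmetric forces these exact formulas upon transposing.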
Theorem: The product of two symmetric matrices is symmetric if and only if the matrices commute.
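Both directions of this theorem can be seen on small examples. Below, a hypothetical non-commuting symmetric pair yields a non-symmetric product, while a commuting (diagonal) pair yields a symmetric one:

```python
def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# Two symmetric matrices that do NOT commute: their product is not symmetric.
A = [[1, 2], [2, 3]]
B = [[0, 1], [1, 0]]
AB, BA = matmul(A, B), matmul(B, A)
print(AB == BA, AB == transpose(AB))            # False False

# Two symmetric matrices that DO commute (both diagonal): product is symmetric.
D1 = [[1, 0], [0, 2]]
D2 = [[3, 0], [0, 4]]
P = matmul(D1, D2)
print(matmul(D2, D1) == P, P == transpose(P))   # True True
```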
Theorem: The product of a matrix and its transpose is symmetric.
Theorem: If A is an invertible matrix, then \( {\bf A}\, {\bf A}^{\mathrm T} \) and \( {\bf A}^{\mathrm T} {\bf A} \) are invertible symmetric matrices. ■
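The symmetry of both Gram products is visible even for a rectangular matrix; the sketch below uses an illustrative 2 × 3 matrix:

```python
def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

A = [[1, 2, 3],
     [0, 1, 4]]                  # an arbitrary 2 x 3 matrix (illustrative entries)
G = matmul(A, transpose(A))      # A A^T is 2 x 2
H = matmul(transpose(A), A)      # A^T A is 3 x 3
print(G == transpose(G), H == transpose(H))   # True True
```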
Name | Description | Formula
--- | --- | ---
Band matrix | A square matrix whose non-zero entries are confined to a diagonal band. |
Bidiagonal matrix | A band matrix with elements only on the main diagonal and either the superdiagonal or the subdiagonal. |
Binary (Boolean) matrix | A matrix whose entries are all either 0 or 1. |
Defective matrix | A square matrix whose geometric and algebraic multiplicities differ for at least one eigenvalue. |
Diagonal matrix | A square matrix with all entries outside the main diagonal equal to zero. |
Elementary matrix | A matrix obtained from an identity matrix by performing a single elementary row operation. |
Hadamard matrix | A square matrix with entries +1, −1 whose rows are mutually orthogonal. |
Hermitian (self-adjoint) matrix | A square matrix that equals its conjugate transpose. | \( {\bf A} = {\bf A}^{\ast} \)
Hessenberg matrix | Like a triangular matrix, except that the elements adjacent to the main diagonal can be non-zero: \( A[i,j] = 0 \) whenever \( i > j+1 \) or \( i < j-1 . \) |
Hollow matrix | A square matrix whose main diagonal comprises only zero elements. |
Idempotent (projection) matrix | A square matrix equal to its own square. | \( {\bf P}^2 = {\bf P} \)
Involutory matrix | A square matrix whose square is the identity matrix. | \( {\bf A}^2 = {\bf I} \)
Logical matrix | A matrix with all entries either 0 or 1. |
Markov (stochastic) matrix | A matrix of non-negative real numbers such that the entries in each row sum to 1. |
Nilpotent matrix | A square matrix some power of which is the zero matrix. | \( {\bf P}^k = {\bf 0} \) for some positive integer k
Normal matrix | A square matrix that commutes with its conjugate transpose. | \( {\bf A}^{\ast} {\bf A} = {\bf A}\,{\bf A}^{\ast} \)
Orthogonal matrix | A real square matrix A with \( {\bf A}^{\mathrm T} \, {\bf A} = {\bf I} . \) | \( {\bf A}^{-1} = {\bf A}^{\mathrm T} \)
Pascal matrix | A matrix containing the entries of Pascal's triangle. | \( a_{i,j} = \binom{i}{j} \)
Permutation matrix | A square matrix whose columns are a permutation of the columns of the identity matrix. | \( {\bf P}^{-1} = {\bf P}^{\mathrm T} \)
Positive matrix | A real matrix all of whose entries are strictly positive. |
Positive definite matrix | A symmetric matrix all of whose eigenvalues are positive. | \( {\bf x}^{\mathrm T} \,{\bf A} \, {\bf x} > 0 \) for all nonzero x
Singular matrix | A square matrix that has no inverse. | \( \det {\bf A} = 0 \)
Triangular matrix | A square matrix with all entries above the main diagonal equal to zero (lower triangular) or all entries below the main diagonal equal to zero (upper triangular). |
Unitary matrix | A square matrix whose inverse equals its conjugate transpose. | \( {\bf A}^{-1} = {\bf A}^{\ast} \)
Vandermonde matrix | Each row consists of descending powers of a fixed value, with a different value for each row. | \( v_{i,j} = a_i^{n-j} \)
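As a spot-check of one row of the table, the sketch below builds a small permutation matrix (an illustrative choice) and verifies that its transpose is its inverse:

```python
def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# Columns of P are a permutation of the columns of the identity matrix.
P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
I = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]

print(matmul(transpose(P), P) == I)   # True: P^T P = I, so P^{-1} = P^T
```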
Example. The general \( n \times n \) Vandermonde matrix (named after Alexandre-Théophile Vandermonde (1735--1796) who was a French musician, mathematician, and chemist) has the form:
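A Vandermonde matrix in the descending-power convention \( v_{i,j} = a_i^{n-j} \) can be built directly; the nodes below are illustrative values, and the determinant is checked against the classical product of pairwise differences \( \prod_{i<j} (a_i - a_j) :\)

```python
def det3(m):
    # Cofactor expansion of a 3x3 determinant along the first row.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

a = [2, 3, 5]                    # illustrative, distinct nodes
n = len(a)
# Row i holds descending powers of a[i]: a[i]^(n-1), ..., a[i]^0.
V = [[a[i] ** (n - 1 - j) for j in range(n)] for i in range(n)]
# V == [[4, 2, 1], [9, 3, 1], [25, 5, 1]]

prod = 1
for i in range(n):
    for j in range(i + 1, n):
        prod *= a[i] - a[j]

print(det3(V) == prod)           # True
```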
Example. Let A be the 2-by-3 matrix
Example. An example of an orthogonal matrix of second order is the following:
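One standard second-order instance is a rotation matrix (used here as an illustrative assumption); the sketch checks \( {\bf A}^{\mathrm T} {\bf A} = {\bf I} \) numerically:

```python
import math

t = math.pi / 6                                   # an arbitrary rotation angle
A = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]                 # 2x2 rotation matrix

At = [list(row) for row in zip(*A)]               # transpose
prod = [[sum(At[i][k] * A[k][j] for k in range(2))
         for j in range(2)] for i in range(2)]

# prod should be the identity up to floating-point round-off.
print(all(abs(prod[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(2) for j in range(2)))   # True
```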
Example of involutory matrices:
Theorem: If A is an \( n \times n \) matrix, then A is involutory if and only if \( \frac{1}{2} \left( {\bf I} + {\bf A} \right) \) is idempotent.
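The forward direction of this theorem is easy to confirm on a concrete involutory matrix (the entries below are an illustrative choice); exact arithmetic with `fractions.Fraction` avoids round-off in the halving:

```python
from fractions import Fraction

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

A = [[4, -1],
     [15, -4]]                   # involutory: A*A == I
I = [[1, 0], [0, 1]]
assert matmul(A, A) == I

# P = (I + A)/2 should be idempotent: P*P == P.
P = [[Fraction(I[i][j] + A[i][j], 2) for j in range(2)] for i in range(2)]
assert matmul(P, P) == P
```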
Example. The involutory matrix
Theorem: Let A be an \( n \times n \) orthogonal matrix, so that \( {\bf A}^{-1} = {\bf A}^{\mathrm T} . \)