The numbers \( a_{ij} \) are known as the coefficients of the system. The matrix
\( {\bf A} = [\,a_{ij}\,] , \) whose \( (i,\, j) \) entry is the
coefficient \( a_{ij} \) of the system of linear equations, is called the coefficient matrix of the system.
Let \( {\bf x} =\left[ x_1 , x_2 , \ldots , x_n \right]^T \) be the vector of unknowns. Then the product
\( {\bf A}\,{\bf x} \) of the \( m \times n \) coefficient matrix A and the
\( n \times 1 \) column vector x is an \( m \times 1 \) matrix
whose entries are the left-hand sides of our system of linear equations.
If we define another column vector \( {\bf b} = \left[ b_1 , b_2 , \ldots , b_m \right]^T \)
whose components are the right-hand sides \( b_{i} ,\) the system is equivalent to the vector equation
\[
{\bf A} \, {\bf x} = {\bf b} .
\]
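As a small illustration of this matrix form, consider the following NumPy sketch (NumPy is not used in this tutorial's Sage examples; the \( 3 \times 2 \) matrix and vectors here are chosen purely for illustration):

```python
import numpy as np

# A hypothetical 3x2 system, chosen for illustration:
#   x1 + 2*x2 = 5
#   3*x1 + 4*x2 = 11
#   5*x1 + 6*x2 = 17
A = np.array([[1, 2], [3, 4], [5, 6]])   # coefficient matrix (m = 3, n = 2)
b = np.array([5, 11, 17])                # right-hand sides
x = np.array([1, 2])                     # candidate solution vector

print(A @ x)                     # [ 5 11 17] -- the left-hand sides evaluated at x
print(np.array_equal(A @ x, b))  # True: this x solves A x = b
```

The product \( {\bf A}\,{\bf x} \) reproduces exactly the left-hand sides of the equations, so testing \( {\bf A}\,{\bf x} = {\bf b} \) tests all \( m \) equations at once.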
We say that \( s_1 , \ s_2 , \ \ldots , \ s_n \) is a solution of the system if all
\( m \) equations hold true when \( x_1 = s_1 , \ x_2 = s_2 , \ \ldots , \ x_n = s_n . \)
Sometimes a system of linear equations is known as a set of simultaneous equations; such terminology emphasises that a solution is an assignment of values to each of the
\( n \) unknowns such that each and every equation holds with this assignment. It is also referred to simply as a linear system.
as you can easily verify by substituting these values into the equations: every equation is satisfied for these values of \( x_1 , \ x_2 , \ x_3 , \ x_4 , \ x_5 . \)
However, this is not the only solution to the system; there are many others.
has no solution. There are no numbers we can assign to the unknowns
\( x_1 , \ x_2 , \ x_3 , \ x_4 , \ x_5 \)
so that all four equations are satisfied. ■
In general, we say that a linear system of algebraic equations is consistent if it has at least one solution and inconsistent if it has no solution. Thus, a consistent linear system has either exactly one solution or infinitely many solutions; there are no other possibilities.
Theorem: Let A be an \( m \times n \) matrix.
(Overdetermined case) If m > n, then the linear system \( {\bf A}\,{\bf x} = {\bf b} \)
is inconsistent for at least one vector b in \( \mathbb{R}^m . \)
(Underdetermined case) If m < n, then for each vector b in \( \mathbb{R}^m \)
the linear system \( {\bf A}\,{\bf x} = {\bf b} \) is either inconsistent or has infinitely many solutions.
Theorem: If A is an \( m \times n \) matrix with
\( m < n, \) then \( {\bf A}\,{\bf x} = {\bf 0} \) has
infinitely many solutions.
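This theorem can be illustrated with a small NumPy sketch (the \( 2 \times 3 \) matrix below is chosen for illustration, not taken from the examples in this tutorial). With m = 2 < n = 3, the homogeneous system has a whole line of solutions:

```python
import numpy as np

# A 2x3 homogeneous system (m = 2 < n = 3), matrix chosen for illustration
A = np.array([[1, 2, 3], [4, 5, 6]])

# v spans the null space of A, so every scalar multiple t*v solves A x = 0
v = np.array([1, -2, 1])
for t in (0, 1, -3.5):
    print(A @ (t * v))   # each line prints the zero vector
```

Since every multiple of v is a solution, the homogeneous system indeed has infinitely many solutions.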
Theorem: A system of linear equations \( {\bf A}\,{\bf x} = {\bf b} \) is consistent if and only if b is in the column space of A.
Theorem: A linear system of algebraic equations \( {\bf A}\,{\bf x} = {\bf b} \) is consistent if and only if the vector b is orthogonal
to every solution y of the adjoint homogeneous equation \( {\bf A}^T \,{\bf y} = {\bf 0} \) or, equivalently, \( {\bf y}^T {\bf A} = {\bf 0} . \) ■
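This orthogonality test can be sketched in NumPy (a plain-Python analogue, not part of this tutorial's Sage examples; the \( 3 \times 2 \) matrix and the two right-hand sides are chosen for illustration):

```python
import numpy as np

# Illustrative 3x2 matrix: its third row is the sum of the first two,
# so the adjoint system A^T y = 0 has the nontrivial solution y = (1, 1, -1)
A = np.array([[1, 0], [0, 1], [1, 1]])
y = np.array([1, 1, -1])
print(A.T @ y)                  # [0 0] -- y solves the adjoint homogeneous equation

b_good = np.array([2, 3, 5])    # y . b_good = 0  ->  system is consistent
b_bad  = np.array([2, 3, 4])    # y . b_bad  = 1  ->  system is inconsistent
print(y @ b_good, y @ b_bad)    # 0 1

# Confirm via least squares: the residual A x - b vanishes only for b_good
for b in (b_good, b_bad):
    x, res, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.allclose(A @ x, b))   # True, then False
```

The vector b is in the column space of A exactly when it is orthogonal to the left null space of A, which is what the dot products above check.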
Sage uses standard commands to solve linear systems of algebraic equations:
\[
{\bf A} \, {\bf x} = {\bf b} .
\]
Here A is an m-by-n matrix, b is a given vector of size m, and the column vector x of size n is to be determined. An augmented matrix is a matrix
obtained by appending the columns of two given matrices, usually for the purpose of performing the same elementary row operations on each of the given matrices.
We associate with the given system of linear equations \( {\bf A}\,{\bf x} = {\bf b} \) an augmented matrix by appending the column-vector b to the matrix A.
Such a matrix is denoted by \( {\bf M} = \left( {\bf A}\,\vert \,{\bf b} \right) \) or \( {\bf M} = \left[ {\bf A}\,\vert \,{\bf b} \right] : \)
Theorem: Let A be an \( m \times n \) matrix. A linear system of equations
\( {\bf A}\,{\bf x} = {\bf b} \) is consistent if and only if the rank of the coefficient matrix A is the same as the rank of the augmented matrix \( \left[ {\bf A}\,\vert \, {\bf b} \right] . \) ■
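A hedged NumPy sketch of this rank test (NumPy rather than Sage, and with a small illustrative matrix whose third equation contradicts the first two):

```python
import numpy as np

A = np.array([[1, 0], [0, 1], [1, 1]])   # illustrative coefficient matrix
b = np.array([2, 3, 4])                  # third equation contradicts the first two

M = np.column_stack((A, b))              # augmented matrix [A | b]
rA = np.linalg.matrix_rank(A)
rM = np.linalg.matrix_rank(M)
print(rA, rM)    # 2 3 -- the ranks differ, so the system is inconsistent
```

Appending b raised the rank, which by the theorem means b is not a combination of the columns of A and the system has no solution.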
The term augmented matrix appears to have been introduced in 1907 by the American mathematician Maxime Bôcher (1867--1918).
We show how it works by examples. The system can be solved using the solve command:
However, this is somewhat cumbersome, so we use the vector approach instead:
If there is no solution, Sage returns an error:
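A comparable failure can be provoked in plain Python with NumPy (a sketch of analogous behavior, not the Sage output; the singular matrix below is chosen for illustration, and note that NumPy reports singularity of the matrix rather than inconsistency of the system):

```python
import numpy as np

A = np.array([[1, 2], [2, 4]])   # singular: second row is twice the first
b = np.array([1, 3])             # no x satisfies both equations
try:
    np.linalg.solve(A, b)
    msg = None
except np.linalg.LinAlgError as e:
    msg = str(e)                 # solver refuses the singular system
    print("solver error:", msg)
```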
If a square matrix A is nonsingular (invertible), then the solution of the vector equation \( {\bf A}\, {\bf x} = {\bf b} \)
can be obtained by applying the inverse matrix:
\( {\bf x} = {\bf A}^{-1} {\bf b} . \) We will discuss this method later in the section devoted to inverse matrices.
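The inverse-matrix method can be sketched in NumPy as follows (an illustrative \( 2 \times 2 \) system, not one of this tutorial's examples; in practice a direct solver is preferred over forming the inverse explicitly):

```python
import numpy as np

# Illustrative nonsingular 2x2 system:
#   2*x1 +   x2 = 5
#     x1 + 3*x2 = 10
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.inv(A) @ b      # x = A^{-1} b
print(x)                      # [1. 3.]
print(np.allclose(A @ x, b))  # True
```

For numerical work, np.linalg.solve(A, b) computes the same x more accurately and cheaply than forming \( {\bf A}^{-1} \) first.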