Vladimir Dobrushkin
https://math.uri.edu/~dobrush/

Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the appendix entitled GNU Free Documentation License.

Eigenvalues and Eigenvectors

Semyon Gershgorin.

The Gershgorin Circle Theorem, a very well-known result in linear algebra today, stems from the 1931 paper of Semyon Aronovich Gershgorin (Über die Abgrenzung der Eigenwerte einer Matrix, Izv. Akad. Nauk SSSR, Otd. Fiz.-Mat. Nauk (1931), 749--754), in which simple arithmetic operations on the entries of an arbitrary n×n complex matrix produce n disks in the complex plane whose union contains all eigenvalues of the given matrix.

Semyon Aronovich Gershgorin (1901--1933) was a Soviet mathematician, born in Pruzhany (now in Belarus, then part of the Russian Empire). His name appears in various transliterations besides the one we have chosen to use: his first name is often written as Semën or Semen, his patronymic as Aranovič, Aronivič, or Aronovich, and his family name as Geršgorin, Gerschgorin, or Gerszgorin. The family name is originally Yiddish and, somewhat confusingly, transliterations from the Yiddish can take the form Hirshhorn or Hirschhorn.

Semyon Gershgorin studied at the Petrograd Technological Institute from 1923 and defended an outstanding thesis submitted to the Division of Mechanics. The papers he published at this time are (all in Russian): Instrument for the integration of the Laplace equation (1925); On a method of integration of ordinary differential equations (1925); On the description of an instrument for the integration of the Laplace equation (1926); and On mechanisms for the construction of functions of a complex variable (1926). He became a professor at the Institute of Mechanical Engineering in Leningrad in 1930, and from that year he worked at the Leningrad Mechanical Engineering Institute on algebra, the theory of functions of a complex variable, numerical methods, and differential equations. He also headed the Division of Mechanics at the Turbine Institute and taught innovative courses at Leningrad State University. His most famous paper, however, is the one published in 1931 that contains the result now known as his Circle Theorem.

Let \( {\bf A} = \left[ a_{i,j} \right] \) be a square \( n\times n \) matrix with complex or real entries. Also, \( {\bf I}_n \), or simply I, denotes the n × n identity matrix, whose diagonal entries are all unity and whose off-diagonal entries are all zero. For additional notation used throughout, the spectrum σ(A) of a square matrix A is the collection of all eigenvalues of A. We call

\[ r_i \left({\bf A} \right) = \sum_{j \in [1..n] \setminus \{ i \}} \, | a_{i,j} | , \qquad i \in [1..n] , \]
the i-th deleted absolute row sum of A. Here \( [1..n] = \{ 1, 2, \ldots , n \} \) is the set of the first n positive integers. When n = 1, it is understood that \( r_1 \left({\bf A} \right) = 0 . \)

Further, we set

\[ \Gamma_i \left( {\bf A} \right) = \left\{ z \in \mathbb{C} \,:\, |z - a_{i,i} | \le r_i \left( {\bf A} \right) \right\} , \quad i \in [1..n], \qquad \Gamma \left( {\bf A} \right) = \bigcup_{i\in [1..n]} \Gamma_i \left( {\bf A} \right) . \]
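As a numerical illustration, here is a minimal sketch in Python (using NumPy), with a hypothetical 3 × 3 matrix chosen only for demonstration: it computes the deleted absolute row sums, forms the disks Γi(A) as (center, radius) pairs, and confirms that every eigenvalue of the matrix lies in their union Γ(A), as the Circle Theorem asserts.

import numpy as np

# A hypothetical 3x3 matrix, chosen only for illustration.
A = np.array([[ 4.0, 1.0,  0.5],
              [-1.0, 3.0,  2.0],
              [ 0.0, 1.5, -2.0]])

# Deleted absolute row sums: r_i(A) = sum of |a_{i,j}| over j != i.
r = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))

# Gershgorin disks Gamma_i(A): center a_{i,i}, radius r_i(A).
disks = list(zip(np.diag(A), r))

# Every eigenvalue of A must lie in at least one disk.
for lam in np.linalg.eigvals(A):
    assert any(abs(lam - center) <= radius + 1e-12
               for center, radius in disks)
print("All eigenvalues of A lie in Gamma(A).")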
If A is a square \( n\times n \) matrix (or a linear transformation), then a nonzero vector \( {\bf x} \in \mathbb{C}^n \) is called an eigenvector of A (or of the matrix operator TA) if Ax is a scalar multiple of x; that is,
\[ {\bf A}\,{\bf x} = \lambda \,{\bf x} \]
for some scalar λ. The scalar λ is called an eigenvalue of A (or of the matrix operator TA), and x is said to be an eigenvector corresponding to λ.
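For instance, the defining relation Ax = λx can be checked numerically. A minimal Python/NumPy sketch (with a hypothetical 2 × 2 matrix) verifies it for each computed eigenpair:

import numpy as np

# A hypothetical 2x2 matrix for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and, as columns of X,
# eigenvectors x satisfying A x = lambda x.
lam, X = np.linalg.eig(A)

for k in range(len(lam)):
    x = X[:, k]
    print(np.allclose(A @ x, lam[k] * x))   # True for each eigenpair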

In general, the image of a vector x under multiplication by a square matrix A differs from x in both magnitude and direction. However, in the special case where x is an eigenvector of A, the product Ax is just a scalar multiple of x, so multiplication by A leaves the direction unchanged (or exactly reverses it when the eigenvalue is negative).

Let I denote the identity operator (matrix) on a vector space V (here \( \mathbb{R}^n \)). Then \( \lambda \in \mathbb{C} \) is an eigenvalue of the linear operator T (or of the corresponding matrix operator TA) if and only if \( \lambda {\bf I} - T \) is not injective (one-to-one).
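In finite dimensions, λI − A fails to be injective precisely when the matrix λI − A is singular, that is, when det(λI − A) = 0, which gives a direct numerical test. A minimal sketch, reusing a hypothetical 2 × 2 matrix:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
I = np.eye(2)

# lambda = 3 is an eigenvalue of A: 3I - A is singular, hence not injective.
print(np.linalg.det(3 * I - A))   # 0.0 (up to rounding)

# lambda = 5 is not an eigenvalue: 5I - A is invertible, hence injective.
print(np.linalg.det(5 * I - A))   # 8.0, nonzero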

Example: The span of the empty set \( \varnothing \) consists of the single element 0. Therefore, \( \varnothing \) is linearly independent and is a basis for the trivial vector space whose only element is the zero vector. Its dimension is zero. ■

 

Example: In \( \mathbb{R}^n , \) the vectors \( e_1 =[1,0,0,\ldots , 0] , \quad e_2 =[0,1,0,\ldots , 0], \quad \ldots , \quad e_n =[0,0,\ldots , 0,1] \) form a basis for n-dimensional real space, called the standard basis. Its dimension is n. ■

 

Example: Let us consider the set of all real \( m \times n \) matrices, and let \( {\bf M}_{i,j} \) denote the matrix whose only nonzero entry is a 1 in the i-th row and j-th column. Then the set \( \left\{ {\bf M}_{i,j} \,:\, 1 \le i \le m , \ 1 \le j \le n \right\} \) is a basis for the vector space of all such real matrices. Its dimension is mn. ■

 

Example: The set of monomials \( \left\{ 1, x, x^2 , \ldots , x^n \right\} \) forms a basis for the vector space of all polynomials of degree at most n. Its dimension is n+1. ■

 

Example: The infinite set of monomials \( \left\{ 1, x, x^2 , \ldots , x^n , \ldots \right\} \) forms a basis for the vector space of all polynomials. ■

 

Theorem: Let V be a vector space and \( \beta = \left\{ {\bf u}_1 , {\bf u}_2 , \ldots , {\bf u}_n \right\} \) be a subset of V. Then β is a basis for V if and only if each vector v in V can be expressed as a linear combination of the vectors in β in exactly one way, that is, can be written in the form

\[ {\bf v} = \alpha_1 {\bf u}_1 + \alpha_2 {\bf u}_2 + \cdots + \alpha_n {\bf u}_n \]
for a unique choice of scalars \( \alpha_1 , \alpha_2 , \ldots , \alpha_n . \)
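Numerically, this uniqueness can be seen by placing the basis vectors as the columns of a matrix B: since B is invertible, the coordinate vector α is the unique solution of the linear system Bα = v. A minimal Python/NumPy sketch with a hypothetical basis of \( \mathbb{R}^3 \):

import numpy as np

# Columns of B form a hypothetical basis u1, u2, u3 of R^3.
B = np.column_stack(([1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [0.0, 1.0, 1.0]))

v = np.array([2.0, 3.0, 1.0])

# B is invertible, so the coordinate vector alpha solving
# B @ alpha = v is unique.
alpha = np.linalg.solve(B, v)
print(alpha)                       # [0. 2. 1.]
print(np.allclose(B @ alpha, v))   # True: v = 0*u1 + 2*u2 + 1*u3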