Functions are used throughout mathematics, physics, and engineering to study the structure of sets and the relationships between sets. You are familiar with the notation y = f(x), where f is a function that acts on numbers, represented by the input variable x, and produces numbers represented by the output variable y. By convention the input variable is written to the right of the function, following the left-to-right writing direction of European languages.
In general, a function f : X ↦ Y is a rule that associates with each x in the set X a unique element y = f(x) in Y. We say that f maps the set X into the set Y and maps the element x to the element y. The set X is the domain of f, and the set Y is called the codomain (or range). The set of all outputs that f actually produces from inputs in X is its image. In linear algebra, we are interested in functions that map vectors to vectors while preserving the vector operations.
A function T : ℝ^{m} ↦ ℝ^{n} is a linear transformation if it satisfies two conditions:
T(v + u) = T(v) + T(u) Preservation of addition;
T(kv) = kT(v) Preservation of scalar multiplication;
for all vectors v, u in ℝ^{m} and for all scalars k.
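As a quick numerical sketch (the matrix and vectors below are arbitrary examples, not taken from the text), any map v ↦ Av built from a fixed matrix A satisfies both conditions:

```python
import numpy as np

# A matrix transformation T(v) = A v from R^3 to R^2; the matrix A is
# an arbitrary illustrative example.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, -1.0, 3.0]])

def T(v):
    return A @ v

rng = np.random.default_rng(0)
v, u = rng.standard_normal(3), rng.standard_normal(3)
k = 2.5

# Preservation of addition and of scalar multiplication.
additive = np.allclose(T(v + u), T(v) + T(u))
homogeneous = np.allclose(T(k * v), k * T(v))
print(additive, homogeneous)
```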
As is clear from the definition above, we can similarly extend it to the complex field, ℂ^{m} ↦ ℂ^{n}, or the rational field, ℚ^{m} ↦ ℚ^{n}. In fact, we can extend this definition to arbitrary vector spaces:
Let V and U be vector spaces over a scalar field 𝔽 (which is either ℂ or ℝ or ℚ). A function T : V ↦ U is called a linear transformation (also called linear mapping or vector space homomorphism) if T preserves vector addition and scalar multiplication.
Theorem 1:
Let V be a finite-dimensional vector space of dimension n≥1, and let
β = { v_{1}, v_{2}, … , v_{n} } be a basis for V. Let U be any vector space, and let { u_{1}, u_{2}, … , u_{n} } be a list of vectors from U. The function T : V ↦ U defined by
\[
T(a_{1}v_{1} + a_{2}v_{2} + \cdots + a_{n}v_{n}) = a_{1}u_{1} + a_{2}u_{2} + \cdots + a_{n}u_{n}
\]
is a linear transformation, where a_{1}, a_{2}, … , a_{n} are the unique coordinates of a vector of V in the basis β.
Because β is a basis for V, for each vector v ∈ V there are unique scalars a_{1}, a_{2}, … , a_{n} such that v is represented as a linear combination of the basis vectors: v = a_{1}v_{1} + a_{2}v_{2} + ⋯ + a_{n}v_{n}. So there is a unique corresponding element
\[
T(v) = a_{1}u_{1} + a_{2}u_{2} + \cdots + a_{n}u_{n},
\]
and T is well defined.
To show that T is a linear transformation, take any vectors v = a_{1}v_{1} + a_{2}v_{2} + ⋯ + a_{n}v_{n} and w = b_{1}v_{1} + b_{2}v_{2} + ⋯ + b_{n}v_{n} in V and any scalar k. We have
\[
T(v + w) = (a_{1}+b_{1})u_{1} + \cdots + (a_{n}+b_{n})u_{n} = T(v) + T(w), \qquad T(kv) = (ka_{1})u_{1} + \cdots + (ka_{n})u_{n} = kT(v).
\]
For example, let T : ℝ² ↦ ℝ³ be defined by mapping the basis vectors (−1, 2) and (1, 1) of ℝ² to (−1, 1, 2) and (2, 1, 3), respectively. For a vector with coordinates x and y in this basis,
\[
T \left( x \begin{bmatrix} -1 \\ \phantom{-}2 \end{bmatrix} + y \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right) = x\begin{bmatrix} -1 \\ \phantom{-}1 \\ \phantom{-}2 \end{bmatrix} + y \begin{bmatrix} 2 \\ 1 \\ 3 \end{bmatrix} = \begin{bmatrix} -x + 2y \\ \phantom{-}x + y \\ \phantom{-}2x + 3y \end{bmatrix}.
\]
To prove that T is a linear transformation, all we need to
do is say: by Theorem 1, T is a linear transformation.
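The example can also be checked numerically. The sketch below builds T from the prescribed images of the basis vectors by solving for the coordinates of an input vector, and spot-checks both linearity conditions (the test vectors are arbitrary):

```python
import numpy as np

# Basis beta = {(-1,2), (1,1)} of R^2 and the prescribed images in R^3,
# as in the example above.
B = np.array([[-1.0, 1.0],
              [ 2.0, 1.0]])          # basis vectors as columns
U = np.array([[-1.0, 2.0],
              [ 1.0, 1.0],
              [ 2.0, 3.0]])          # images T(v_i) as columns

def T(v):
    # Coordinates (x, y) of v in the basis beta, then x*u1 + y*u2.
    coords = np.linalg.solve(B, v)
    return U @ coords

# T is linear by Theorem 1; spot-check both conditions.
v, w, k = np.array([1.0, 5.0]), np.array([-2.0, 0.5]), 3.0
ok = (np.allclose(T(v + w), T(v) + T(w))
      and np.allclose(T(k * v), k * T(v)))
print(ok)
```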
An alternative way to write T is as follows:
The differentiation operator D(p) = p′ provides a linear transformation from ℘_{≤n} into ℘_{≤n-1}, where ℘_{≤n} denotes the space of polynomials of degree at most n.
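A minimal sketch of this example using NumPy's polynomial class (the sample polynomials and scalar are arbitrary):

```python
import numpy as np
from numpy.polynomial import Polynomial

# Two polynomials of degree <= 3 (arbitrary examples).
p = Polynomial([1.0, -2.0, 0.0, 4.0])   # 1 - 2t + 4t^3
q = Polynomial([0.0, 1.0, 3.0])         # t + 3t^2
k = 5.0

def D(r):
    return r.deriv()                     # differentiation lowers the degree by one

# Differentiation preserves addition and scalar multiplication.
additive = np.allclose(D(p + q).coef, (D(p) + D(q)).coef)
homogeneous = np.allclose(D(k * p).coef, (k * D(p)).coef)
print(additive, homogeneous)
```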
Let ℳ_{m×n} be the set of all m × n matrices with entries from the field 𝔽. Then transposition, T(A) = A^{T}, gives a linear transformation from ℳ_{m×n} into ℳ_{n×m}.
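A quick check with arbitrary sample matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
A, B = rng.standard_normal((2, 3)), rng.standard_normal((2, 3))
k = -1.5

# Transposition maps a 2x3 matrix to a 3x2 matrix and preserves both operations.
transpose_linear = (np.allclose((A + B).T, A.T + B.T)
                    and np.allclose((k * A).T, k * A.T))
print(transpose_linear)
```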
Let ℭ^{∞}[𝕋] be the set of infinitely differentiable periodic functions on the unit circle 𝕋 (the one-dimensional torus). Then the expansion of a function f from ℭ^{∞}[𝕋] into the Fourier series
\[
f(t) = \sum_{k=-\infty}^{\infty} c_{k} e^{jkt}, \qquad c_{k} = \frac{1}{2\pi} \int_{0}^{2\pi} f(t)\, e^{-jkt}\, dt,
\]
provides a linear transformation from ℭ^{∞}[𝕋] into the set of infinite sequences (c_{k}).
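The linearity of the coefficient map can be illustrated numerically, with the discrete Fourier transform standing in for the Fourier series on a finite sample grid (the sample functions are arbitrary):

```python
import numpy as np

# Sample two smooth periodic functions on the circle.
N = 256
t = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
f = np.cos(3 * t)
g = np.sin(t) + 0.5 * np.cos(2 * t)
k = 2.0

def coeffs(h):
    # Discrete approximation of the Fourier coefficients c_k.
    return np.fft.fft(h) / N

# The map f -> (c_k) preserves addition and scalar multiplication.
lin = (np.allclose(coeffs(f + g), coeffs(f) + coeffs(g))
       and np.allclose(coeffs(k * f), k * coeffs(f)))
print(lin)
```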
Isometric transformations
A transformation A is isometric if ∥Ax∥ = ∥x∥ for all vectors x.
An isometric transformation also preserves the inner product: ⟨ Ax , Ay ⟩ = ⟨ x , y ⟩. This implies that the eigenvalues of an isometric transformation have unit modulus and are given by λ = exp(jφ).
When W is an invariant subspace of the isometric transformation A with dim(W) < ∞, then W^{⊥} is also an invariant subspace.
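A numerical sketch with a plane rotation, which is isometric (the angle and test vectors are arbitrary):

```python
import numpy as np

phi = 0.7
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])   # an isometric transformation of R^2

rng = np.random.default_rng(2)
x, y = rng.standard_normal(2), rng.standard_normal(2)

norm_preserved = np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x))
inner_preserved = np.isclose((A @ x) @ (A @ y), x @ y)
# All eigenvalues have unit modulus, i.e. lambda = exp(j*phi).
unit_eigs = np.allclose(np.abs(np.linalg.eigvals(A)), 1.0)
print(norm_preserved, inner_preserved, unit_eigs)
```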
Orthogonal transformations
A transformation A is orthogonal if A is isometric and its inverse exists.
For an orthogonal transformation O, the identity O^{T}O = I holds, so O^{T} = O^{−1}. If A and B are orthogonal, then AB and A^{−1} are also orthogonal.
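These closure properties can be spot-checked numerically (the rotation and reflection below are arbitrary examples):

```python
import numpy as np

def is_orthogonal(M):
    # M is orthogonal when M^T M equals the identity.
    return np.allclose(M.T @ M, np.eye(M.shape[0]))

a, b = 0.4, 1.1
A = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])        # a rotation
B = np.array([[np.cos(b),  np.sin(b)],
              [np.sin(b), -np.cos(b)]])        # a reflection

checks = (is_orthogonal(A), is_orthogonal(B),
          is_orthogonal(A @ B),                # products stay orthogonal
          is_orthogonal(np.linalg.inv(A)))     # inverses stay orthogonal
print(checks)
```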
Let A : V → V be orthogonal with dim(V) < ∞. Then A is called direct orthogonal if det(A) = +1; such a matrix A describes a rotation. In particular, if A provides a rotation of ℝ² through the angle φ, it is given by
\[
A = \begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \phantom{-}\cos\varphi \end{bmatrix}.
\]
So the rotation angle φ is determined by the trace, tr(A) = 2cos(φ), with 0 ≤ φ ≤ π. Let λ₁ and λ₂ be the roots of the characteristic equation. Then Re(λ₁) = Re(λ₂) = cos(φ), with λ₁ = exp(jφ) and λ₂ = exp(−jφ).
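A quick numerical confirmation of the trace and eigenvalue formulas (the angle is arbitrary):

```python
import numpy as np

phi = 1.2
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])

# tr(A) = 2 cos(phi), and the eigenvalues are exp(+-j*phi).
trace_ok = np.isclose(np.trace(A), 2 * np.cos(phi))
eigs = np.linalg.eigvals(A)
eig_ok = np.allclose(sorted(eigs, key=lambda z: z.imag),
                     [np.exp(-1j * phi), np.exp(1j * phi)])
print(trace_ok, eig_ok)
```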
In ℝ³, λ₁ = 1 and λ₂ = λ₃* = exp(jφ). A rotation about the eigenspace corresponding to λ₁ is given, in an orthonormal basis whose third vector spans that eigenspace, by the matrix
\[
A = \begin{bmatrix} \cos\varphi & -\sin\varphi & 0 \\ \sin\varphi & \phantom{-}\cos\varphi & 0 \\ 0 & 0 & 1 \end{bmatrix}.
\]
A transformation A is called mirrored orthogonal if det(A) = −1. Vectors from E_{−1} are mirrored by A with respect to the invariant subspace
E^{⊥}_{−1}.
A mirroring in ℝ² across the line \( \left\langle \left( \cos \left( \frac{1}{2}\,\varphi \right) , \sin \left( \frac{1}{2}\,\varphi \right) \right) \right\rangle \) is given by
\[
A = \begin{bmatrix} \cos\varphi & \phantom{-}\sin\varphi \\ \sin\varphi & -\cos\varphi \end{bmatrix}.
\]
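A sketch verifying this mirroring matrix: it fixes the mirror axis, flips the perpendicular direction, and has determinant −1 (the angle is arbitrary):

```python
import numpy as np

phi = 0.9
A = np.array([[np.cos(phi),  np.sin(phi)],
              [np.sin(phi), -np.cos(phi)]])   # mirroring across <(cos(phi/2), sin(phi/2))>

axis = np.array([np.cos(phi / 2), np.sin(phi / 2)])
perp = np.array([-np.sin(phi / 2), np.cos(phi / 2)])

mirrored = (np.isclose(np.linalg.det(A), -1.0)
            and np.allclose(A @ axis, axis)       # the mirror axis is fixed
            and np.allclose(A @ perp, -perp))     # the perpendicular direction flips
print(mirrored)
```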
Mirrored orthogonal transformations in ℝ³ are rotational mirrorings: rotations about an axis < a > through the angle φ combined with a reflection in the mirror plane
< a >^{⊥}. In an orthonormal basis whose third vector spans < a >, the matrix of such a transformation is given by
\[
A = \begin{bmatrix} \cos\varphi & -\sin\varphi & 0 \\ \sin\varphi & \phantom{-}\cos\varphi & 0 \\ 0 & 0 & -1 \end{bmatrix}.
\]
For direct orthogonal transformations in ℝ³, O(x)×O(y) = O(x×y); for a general orthogonal transformation, O(x)×O(y) = det(O) · O(x×y).
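A numerical sketch (the orthogonal matrix is generated at random via a QR factorization, purely for illustration): for a direct orthogonal transformation the cross-product identity holds exactly, while for a mirrored one the sign det(O) = −1 appears:

```python
import numpy as np

rng = np.random.default_rng(3)
# A random orthogonal matrix via QR; flip one column if needed so det(O) = +1.
O, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(O) < 0:
    O[:, 0] = -O[:, 0]

x, y = rng.standard_normal(3), rng.standard_normal(3)
direct_ok = np.allclose(np.cross(O @ x, O @ y), O @ np.cross(x, y))

# For a mirrored orthogonal transformation (det = -1) the sign flips.
M = O.copy()
M[:, 0] = -M[:, 0]
mirrored_ok = np.allclose(np.cross(M @ x, M @ y), -(M @ np.cross(x, y)))
print(direct_ok, mirrored_ok)
```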
For each orthogonal transformation, ℝ^{n} (n < ∞) can be decomposed into invariant subspaces of dimension 1 or 2.
Unitary transformations
Let V be a complex vector space with an inner product. A linear transformation U of V is called unitary if it is isometric and its inverse exists.
An n × n matrix U is unitary if U*U = I, the identity matrix. Its determinant satisfies |det(U)| = 1. Every isometric transformation of a finite-dimensional complex vector space is unitary.
Theorem 2:
For an n × n matrix A, the following statements are equivalent:
A is unitary.
The columns of A form an orthonormal set.
The rows of matrix A form an orthonormal set.
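A minimal numerical instance of these equivalent statements (the unitary matrix is an arbitrary example):

```python
import numpy as np

# A unitary 2x2 matrix (an arbitrary example).
U = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                 [1j, 1]])

cols_orthonormal = np.allclose(U.conj().T @ U, np.eye(2))   # U*U = I: columns orthonormal
rows_orthonormal = np.allclose(U @ U.conj().T, np.eye(2))   # rows orthonormal
unit_modulus_det = np.isclose(abs(np.linalg.det(U)), 1.0)   # |det(U)| = 1
print(cols_orthonormal, rows_orthonormal, unit_modulus_det)
```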
Symmetric transformations
A transformation of ℝ^{n} is called symmetric if ⟨ Ax , y ⟩ = ⟨ x , Ay ⟩ for any vectors x and y from the vector space.
A square matrix A is symmetric if A^{T} = A. A linear transformation is symmetric if its matrix with respect to an orthonormal basis is symmetric. All eigenvalues of a symmetric transformation are real, and eigenvectors corresponding to distinct eigenvalues are orthogonal. If A is a real symmetric matrix, then A^{T} = A = A*. The product A^{T}A is symmetric for any matrix A.
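These properties can be spot-checked numerically (the matrices are arbitrary examples; `eigh` is NumPy's eigensolver for symmetric/Hermitian input):

```python
import numpy as np

S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])        # a symmetric matrix (arbitrary example)

vals, vecs = np.linalg.eigh(S)

eigs_real = np.isrealobj(vals)                              # eigenvalues are real
eigvecs_orthonormal = np.allclose(vecs.T @ vecs, np.eye(3)) # eigenvectors orthonormal

M = np.array([[1.0, 2.0],
              [0.0, -1.0]])            # not symmetric itself
ata_symmetric = np.allclose(M.T @ M, (M.T @ M).T)           # yet M^T M is symmetric
print(eigs_real, eigvecs_orthonormal, ata_symmetric)
```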
Self-adjoint transformations
A transformation H : ℂ^{n} → ℂ^{n} is called self-adjoint or Hermitian if
⟨ Hx , y ⟩ = ⟨ x , Hy ⟩ for any vectors x and y from the vector space.
A product AB of two self-adjoint matrices A and B is self-adjoint if its commutator is zero, [A, B] = AB − BA = 0.
Eigenvalues of any self-adjoint matrix are real numbers.
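A quick numerical check with an arbitrary Hermitian matrix; the commuting partner B = H² is chosen so that [H, B] = 0:

```python
import numpy as np

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])            # Hermitian: H equals its conjugate transpose

hermitian = np.allclose(H, H.conj().T)
real_eigs = np.allclose(np.linalg.eigvals(H).imag, 0.0)   # eigenvalues are real

# B = H^2 is Hermitian and commutes with H, so the product H B is Hermitian.
B = H @ H
comm_zero = np.allclose(H @ B - B @ H, 0.0)
product_hermitian = np.allclose(H @ B, (H @ B).conj().T)
print(hermitian, real_eigs, comm_zero, product_hermitian)
```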
Normal transformations
A linear transformation A is called normal if A*A = AA*.
Let the distinct roots of the characteristic equation of a normal matrix A be β_{i} with multiplicities n_{i}.
Then the dimension of each eigenspace V_{i} equals n_{i}. These eigenspaces are mutually perpendicular, and each vector x ∈ V can be written in exactly one way as
\[
x = x_{1} + x_{2} + \cdots + x_{k}, \qquad x_{i} \in V_{i}.
\]
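A numerical sketch with a rotation matrix, which is normal but not symmetric (the angle and test vector are arbitrary): its eigenvectors form an orthonormal basis, and any vector decomposes uniquely along the eigenspaces.

```python
import numpy as np

phi = 0.5
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])   # normal: A*A = AA*

normal = np.allclose(A.conj().T @ A, A @ A.conj().T)

# The eigenvalues exp(+-j*phi) are distinct, so the eigenvectors of this
# normal matrix are orthonormal (up to phase).
vals, V = np.linalg.eig(A)
orthonormal = np.allclose(V.conj().T @ V, np.eye(2))

# Decompose an arbitrary vector along the eigenspaces and recompose it.
x = np.array([1.0, 2.0], dtype=complex)
coords = V.conj().T @ x                 # coordinates of x in the eigenbasis
decomposed = np.allclose(V @ coords, x)
print(normal, orthonormal, decomposed)
```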