basis for a subspace:
A basis for a subspace W is a set of
vectors v1, ..., vk in W such that:
1. v1, ..., vk are linearly independent; and
2. v1, ..., vk span W.
characteristic
polynomial of a matrix:
The characteristic polynomial of an n by n matrix
A is the polynomial in t given by the formula
det(A - t*I).
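As a quick illustration, SymPy can compute this polynomial symbolically (a minimal sketch with an example 2 by 2 matrix; note that SymPy's charpoly uses the convention det(t*I - A), which differs from det(A - t*I) only by a factor of (-1)^n):

    from sympy import Matrix, symbols

    t = symbols('t')
    A = Matrix([[2, 1],
                [1, 2]])
    # charpoly returns det(t*I - A); here that is t**2 - 4*t + 3
    print(A.charpoly(t).as_expr())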
column space of a matrix:
The column space of a matrix is the subspace
spanned by the columns of the matrix considered as
vectors. See also: row space.
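As an illustration, SymPy's columnspace returns a basis for the column space (a minimal sketch; the matrix is just an example):

    from sympy import Matrix

    A = Matrix([[1, 2, 3],
                [2, 4, 6]])
    # Every column is a multiple of the first, so one vector spans the column space.
    print(A.columnspace())   # [Matrix([[1], [2]])]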
consistent linear system:
A system of linear equations is consistent if it has at least one
solution. See also: inconsistent.
defective matrix:
A matrix A is defective if A has an eigenvalue whose
geometric multiplicity is less than its
algebraic multiplicity.
diagonalizable matrix:
A matrix is diagonalizable if it is similar to a diagonal matrix.
dimension of a subspace:
The dimension of a subspace W is the
number of vectors in any basis of W.
(If W is the subspace {0}, we say that its dimension is 0.)
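For the diagonalizable case above, SymPy's diagonalize returns matrices S and D with A = S*D*S^(-1), exhibiting the similarity to a diagonal matrix (a minimal sketch; the symmetric matrix below is just an example):

    from sympy import Matrix

    A = Matrix([[2, 1],
                [1, 2]])
    S, D = A.diagonalize()   # A = S*D*S**(-1)
    print(D)                 # diagonal matrix of the eigenvalues 1 and 3
    print(S.inv() * A * S)   # equals D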
row echelon form
of a matrix:
A matrix is in row echelon form if:
1. all rows that consist entirely of zeros are grouped together at the
bottom of the matrix; and
2. the first (counting left to right) nonzero entry in each nonzero row
appears in a column to the right of the first nonzero entry in the row
above it.
reduced row echelon form of a matrix:
A matrix is in reduced row echelon form if:
1. it is in row echelon form;
2. the first nonzero entry in each nonzero row is 1; and
3. that leading 1 is the only nonzero entry in its column.
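SymPy's rref method produces the reduced row echelon form together with the pivot columns (a minimal sketch; the matrix is just an example):

    from sympy import Matrix

    A = Matrix([[1, 2, 1],
                [2, 4, 0],
                [3, 6, 1]])
    R, pivots = A.rref()
    print(R)        # the reduced row echelon form of A
    print(pivots)   # indices of the pivot columns, here (0, 2)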
eigenspace of a matrix:
The eigenspace associated with the eigenvalue c of a matrix A is the
null space of A - c*I.
eigenvalue of a matrix:
An eigenvalue of an n by n matrix A is a scalar
c such that A*x = c*x holds for some nonzero vector
x (where x is an n-tuple). See also: eigenvector.
eigenvector of a matrix:
An eigenvector of an n by n matrix A is a nonzero
vector x such that A*x = c*x holds
for some scalar c. See also: eigenvalue.
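Numerically, eigenvalues and eigenvectors can be found with NumPy (a minimal sketch; eig returns the eigenvectors as the columns of its second output):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)        # the eigenvalues 3 and 1 (in some order)
    x = eigenvectors[:, 0]    # the eigenvector paired with eigenvalues[0]
    print(np.allclose(A @ x, eigenvalues[0] * x))   # True: A*x = c*x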
equivalent linear systems:
Two systems of linear equations in n unknowns are equivalent if
they have the same set of solutions.
homogeneous linear system:
A system of linear equations A*x = b is homogeneous if b = 0.
inconsistent linear system:
A system of linear equations is inconsistent if it has no solutions.
See also: consistent.
inverse of a matrix:
The matrix B is an inverse for the matrix A if A*B =
B*A = I.
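With NumPy, inv computes the inverse of a square matrix, and the defining products can be checked directly (a minimal sketch; inv raises an error when the matrix is singular):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    B = np.linalg.inv(A)
    print(np.allclose(A @ B, np.eye(2)))   # True: A*B = I
    print(np.allclose(B @ A, np.eye(2)))   # True: B*A = I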
invertible matrix:
A matrix is invertible if it has an inverse.
least-squares
solution of a linear system:
A least-squares solution to a system of linear equations
A*x = b is a vector x that minimizes the
length of the vector A*x - b.
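NumPy's lstsq returns such a minimizing vector (a minimal sketch; the overdetermined 3 by 2 system below has no exact solution):

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])
    x, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
    print(x)          # the least-squares solution, approximately [0.667, 0.5]
    print(residual)   # the squared length of A @ x - b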
linear combination of vectors:
A vector v is a linear combination of the vectors v1, ..., vk if there
exist scalars a1, ..., ak such that v = a1*v1 + ... + ak*vk.
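Whether a given vector is such a combination amounts to solving a linear system for the scalars (a minimal sketch using NumPy; the vectors are just an example):

    import numpy as np

    v1 = np.array([1.0, 0.0, 1.0])
    v2 = np.array([0.0, 1.0, 1.0])
    v  = np.array([2.0, 3.0, 5.0])
    A = np.column_stack([v1, v2])             # columns are v1 and v2
    a, *_ = np.linalg.lstsq(A, v, rcond=None)
    print(a)                       # [2. 3.]
    print(np.allclose(A @ a, v))   # True, so v = 2*v1 + 3*v2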
linearly dependent vectors:
The vectors v1, ..., vk are linearly
dependent if the equation a1*v1 + ... + ak*vk = 0 has a solution where not all the scalars a1, ...,
ak are zero.
linearly
independent vectors:
The vectors v1, ..., vk are linearly
independent if the only solution to the equation
a1*v1 + ... + ak*vk = 0 is the
solution where all the scalars a1, ..., ak are zero.
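Independence can be tested by checking whether the matrix whose columns are the vectors has rank equal to the number of vectors (a minimal sketch; the vectors are just an example):

    import numpy as np

    v1 = np.array([1.0, 2.0, 3.0])
    v2 = np.array([0.0, 1.0, 1.0])
    v3 = np.array([1.0, 3.0, 4.0])    # v3 = v1 + v2
    A = np.column_stack([v1, v2, v3])
    print(np.linalg.matrix_rank(A))   # 2 < 3, so the vectors are linearly dependent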
linear transformation:
A linear transformation from V to W is a function
T from V to W such that:
1. T(u + v) = T(u) + T(v) for all vectors u and v in V; and
2. T(a*v) = a*T(v) for every vector v in V and every scalar a.
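Multiplication by a fixed matrix is the standard example of a linear transformation between n-spaces; the two defining properties can be spot-checked numerically (a minimal sketch; the matrix and vectors are just examples):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [0.0, 1.0]])
    T = lambda v: A @ v     # T(v) = A*v
    u = np.array([1.0, 2.0])
    v = np.array([3.0, -1.0])
    print(np.allclose(T(u + v), T(u) + T(v)))    # True
    print(np.allclose(T(5.0 * v), 5.0 * T(v)))   # True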
algebraic multiplicity
of an eigenvalue:
The algebraic multiplicity of an eigenvalue c of a matrix
A is the number of times the factor (t-c) occurs in the
characteristic polynomial of A.
geometric multiplicity of an eigenvalue:
The geometric multiplicity of an eigenvalue
c of a matrix A is the dimension
of the eigenspace of c.
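For example, the matrix below has the single eigenvalue 1 with algebraic multiplicity 2 but geometric multiplicity 1, so it is defective (a minimal sketch using SymPy):

    from sympy import Matrix, eye, symbols

    t = symbols('t')
    A = Matrix([[1, 1],
                [0, 1]])
    print(A.charpoly(t).as_expr())             # t**2 - 2*t + 1, i.e. (t - 1)**2
    print(len((A - 1 * eye(2)).nullspace()))   # 1: the eigenspace of 1 is one-dimensional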
nonsingular matrix:
An n by n matrix A is nonsingular if the only
solution to the equation A*x = 0 (where x
is an n-tuple) is x = 0. See also:
singular.
null space of a matrix:
The null space of an m by n matrix A is the set of
all n-tuples x such that A*x = 0.
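SymPy's nullspace returns a basis for the null space; its length is the nullity, and rank + nullity equals the number of columns (a minimal sketch; the matrix is just an example):

    from sympy import Matrix

    A = Matrix([[1, 2, 3],
                [2, 4, 6]])
    basis = A.nullspace()
    print(basis)                  # two basis vectors for the null space
    print(A.rank(), len(basis))   # 1 2, and 1 + 2 = 3 columns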
null
space of a linear transformation:
The null space of a linear
transformation T is the set of vectors v in its
domain such that T(v) = 0.
nullity of a matrix:
The nullity of a matrix is the dimension of
its null space.
nullity of a linear transformation:
The nullity of a linear transformation is the dimension of its
null space.
orthogonal set of
vectors:
A set of n-tuples is orthogonal if the dot product of any two
of them is 0.
orthogonal matrix:
A matrix A is orthogonal if A is invertible and its inverse equals its
transpose, that is, A^(-1) = A^T.
orthogonal linear transformation:
A linear transformation T
from V to W is orthogonal if T(v)
has the same length as v for all vectors v in V.
orthonormal set of vectors:
A set of n-tuples is orthonormal if it is
orthogonal and each vector has length 1.
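A rotation matrix ties these notions together: its columns form an orthonormal set, its inverse is its transpose, and multiplication by it preserves length (a minimal sketch using NumPy):

    import numpy as np

    theta = np.pi / 3
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    print(np.allclose(Q.T @ Q, np.eye(2)))            # True: Q is orthogonal
    v = np.array([3.0, 4.0])
    print(np.linalg.norm(Q @ v), np.linalg.norm(v))   # both 5.0: length is preserved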
range of a matrix:
The range of an m by n matrix A is the set of all
m-tuples A*x, where x is any n-tuple.
range
of a linear transformation:
The range of a linear
transformation T is the set of all vectors
T(v), where v is any vector in its domain.
rank of a matrix:
The rank of a matrix is the number of nonzero rows in any
row equivalent matrix that is in
row echelon form.
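NumPy computes the rank numerically from the singular values, which agrees with counting the nonzero rows of a row echelon form (a minimal sketch; the matrix is just an example):

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [1.0, 1.0, 1.0]])
    print(np.linalg.matrix_rank(A))   # 2: the second row is twice the first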
rank of
a linear transformation:
The rank of a linear transformation (and hence of any matrix regarded as a
linear transformation) is the
dimension of its range. Note: A theorem tells us that the two
definitions of rank of a matrix are equivalent.
row equivalent matrices:
Two matrices are row equivalent if one can be obtained from the other by a
sequence of elementary row operations. The elementary row operations
performed on a matrix are:
1. interchange two rows;
2. multiply a row by a nonzero scalar; and
3. add a constant multiple of one row to another row.
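The three operations are easy to express with NumPy row assignments, and each step yields a row equivalent matrix (a minimal sketch; the matrix is just an example):

    import numpy as np

    A = np.array([[0.0, 2.0, 4.0],
                  [1.0, 3.0, 5.0]])
    A[[0, 1]] = A[[1, 0]]      # 1. interchange two rows
    A[1] = 0.5 * A[1]          # 2. multiply a row by the nonzero scalar 1/2
    A[0] = A[0] - 3.0 * A[1]   # 3. add a multiple of one row to another
    print(A)                   # [[1. 0. -1.], [0. 1. 2.]], the reduced row echelon form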
row space of a matrix:
The row space of a matrix is the subspace
spanned by the rows of the matrix considered as vectors.
See also: column space.
similar matrices:
Matrices A and B are similar if there is a square
nonsingular matrix S such that
S^(-1)*A*S = B.
singular matrix:
An n by n matrix A is singular if the equation
A*x = 0 (where x is an
n-tuple) has a nonzero solution for x. See also: nonsingular.
span of a set of vectors:
The span of the vectors v1, ..., vk is the
subspace V consisting of all linear combinations of v1,
..., vk. One also says that the subspace V is
spanned by the vectors v1, ..., vk and
that these vectors span V.
subspace:
A subset W of n-space is a subspace if:
1. the zero vector 0 is in W;
2. u + v is in W whenever u and v are in W; and
3. a*v is in W whenever v is in W and a is any scalar.
symmetric matrix:
A matrix is symmetric if it equals its transpose.