Current project pages
Public trees
Abstract vector spaces
A consistent system with more variables than equations has infinitely many solutions.
Adding a multiple of one row to another row does not change the determinant.
Addition
A determinant is a multilinear function
A diagonalizable matrix is diagonalized by a matrix having the eigenvectors as columns.
Adjoining an element not in the span of a linearly independent set gives another linearly independent set.
A homogeneous system has a nontrivial solution if and only if it has a free variable.
A homogeneous system with more variables than equations has infinitely many solutions.
Algebraic properties of R^n (or C^n)
Algorithm for computing an LU decomposition
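As a pointer to the computation named above, a minimal sketch of the Doolittle recurrences (an assumption about the method; it presumes no row exchanges are needed and writes A = LU with L unit lower triangular):
$$u_{kj} = a_{kj} - \sum_{m=1}^{k-1} l_{km}\,u_{mj} \quad (j \ge k), \qquad l_{ik} = \frac{1}{u_{kk}}\Big(a_{ik} - \sum_{m=1}^{k-1} l_{im}\,u_{mk}\Big) \quad (i > k),$$
computed for k = 1, ..., n with l_{kk} = 1.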
A linear system is equivalent to a matrix equation.
A linear system is equivalent to a vector equation.
A linear transformation has a representation as an upper triangular matrix.
A linear transformation has the same eigenvalues and eigenvectors as any matrix representation.
A linear transformation is determined by its action on a basis.
A linear transformation is diagonalizable if there is a basis such that each element is an eigenvector of the transformation.
A linear transformation is given by a matrix whose columns are the images of the standard basis vectors
A linear transformation is given by a matrix with respect to a given basis.
A linear transformation is given by multiplying by its matrix representation with respect to bases of the spaces
A linear transformation is injective on its generalized range space.
A linear transformation is invertible if and only if it is injective and surjective
A linear transformation is onto if and only if its rank equals the number of rows in any matrix representation.
A linear transformation is surjective if and only if the columns of its matrix span the codomain.
A linear transformation is surjective if and only if the image of a basis is a spanning set
A linear transformation is surjective if and only if the rank equals the dimension of the codomain.
A linear transformation maps 0 to 0.
A linear transformation of a linear combination is the linear combination of the images of the vectors
A linear transformation on a nontrivial finite dimensional complex vector space has at least one eigenvalue.
All echelon forms of a linear system have the same free variables
A matrix and its transpose have the same determinant.
A matrix and its transpose have the same eigenvalues/characteristic polynomial.
A square matrix A with real entries has orthonormal columns if and only if A inverse equals A transpose.
A matrix equation is equivalent to a linear system
A matrix is called ill-conditioned if it is nearly singular
A matrix is nilpotent if and only if its only eigenvalue is 0.
A matrix is unitarily diagonalizable if and only if it is normal.
A matrix with real entries is orthogonally diagonalizable if and only if it is symmetric (the principal axis theorem).
A matrix of rank k is equivalent to a matrix with 1 in the first k diagonal entries and 0 elsewhere.
A matrix turns into its adjoint when moved to the other side of the standard inner product on C^n.
A matrix with a 0 row/column has determinant 0
A matrix with real entries and orthonormal columns preserves dot products.
A matrix with real entries and orthonormal columns preserves norms.
A matrix with real entries has its non-real eigenvalues occurring in conjugate pairs.
A matrix with two equal rows/columns has determinant 0
An eigenspace of a matrix is a nontrivial subspace.
An eigenspace of a matrix is the null space of a related matrix.
An n-by-n matrix is diagonalizable if and only if it has n linearly independent eigenvectors.
An n-by-n matrix is diagonalizable if and only if the characteristic polynomial factors completely and the dimension of each eigenspace equals the multiplicity of the corresponding eigenvalue
An n-by-n matrix is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals n.
An n-by-n matrix is diagonalizable if and only if the union of the basis vectors for the eigenspaces is a basis for R^n (or C^n).
An n-by-n matrix has n (complex) eigenvalues, counted with multiplicity
An n-by-n matrix with n distinct eigenvalues is diagonalizable.
A nonempty subset of a vector space is a subspace if and only if it is closed under linear combinations
A nonsingular matrix can be written as a product of elementary matrices.
An orthogonal set of nonzero vectors is linearly independent.
Any linearly independent set can be expanded to a basis for the (sub)space
Any matrix times the 0 matrix equals the 0 matrix.
Any vector space is the direct sum of the generalized kernel and generalized range of a linear transformation on that space.
Application Leontief input-output analysis
Applications
Applications of band matrices
Applications to cubic spline
Applications to differential equations
Applications to error-correcting code
Applications to Markov chains
Applications to voting and social choice
A scalar multiple of a linear transformation is a linear transformation
A set is a basis if each vector can be written uniquely as a linear combination.
A set is linearly independent if and only if the set of coordinate vectors with respect to any basis is linearly independent.
A set of nonzero vectors contains (as a subset) a basis for its span.
A set of two vectors is linearly dependent if and only if one of the vectors is a scalar multiple of the other.
A set of vectors containing fewer elements than the dimension of the space cannot span
A set of vectors containing more elements than the dimension of the space must be linearly dependent
A set of vectors is linearly independent if and only if the homogeneous linear system corresponding to the matrix of column vectors has only the trivial solution.
A set of vectors is linearly independent if and only if the matrix of column vectors in reduced row-echelon form has every column as a pivot column.
A subset of a linearly independent set is linearly independent.
A vector can be written uniquely as a linear combination of vectors from independent subspaces.
A vector can be written uniquely as a sum of a vector in a subspace and a vector orthogonal to the subspace.
A vector is in the orthogonal complement of a subspace if and only if it is orthogonal to every vector in a basis of the subspace.
Axioms of a vector space
Bases
Basic properties
Basic properties of linear transformations
Basic terminology
Basic terminology and notation
Block matrices
Canonical forms of matrices
Change of coordinates matrices are invertible
Characteristic and minimal polynomials
C^n is a vector space.
Cofactors
Composition
Conjugating by a change of coordinates matrix converts matrix representations with respect to different bases.
Conjugation
Container for Linear Algebra
Coordinates
Coordinate vector spaces
Cramer's rule
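For reference, the standard statement: if A is a nonsingular n-by-n matrix, the system Ax = b has the unique solution
$$x_i = \frac{\det A_i(b)}{\det A}, \qquad i = 1, \dots, n,$$
where A_i(b) denotes A with its i-th column replaced by b.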
Definition and terminology
Definition of 0 matrix
Definition of 0/trivial subspace
Definition of 0 vector
Definition of adjoint (conjugate transpose)
Definition of adjugate/classical adjoint of a matrix
Definition of (algebraic) multiplicity of an eigenvalue
Definition of a lower triangular matrix
Definition of angle between vectors
Definition of an upper triangular matrix
Definition of applying a polynomial to a linear transformation
Definition of applying a polynomial to a square matrix
Definition of augmented matrix (of a linear system)
Definition of automorphism of a vector space
Definition of a vector being orthogonal to a subspace
Definition of band matrix
Definition of basic/dependent/leading variable in a linear system
Definition of basis of a vector space (or subspace)
Definition of block diagonal matrix
Definition of block/partitioned matrix
Definition of change of coordinates matrix between two bases
Definition of change-of-coordinates matrix relative to a given basis of R^n (or C^n)
Definition of characteristic equation of a matrix
Definition of characteristic polynomial of a linear transformation
Definition of characteristic polynomial of a matrix
Definition of Cholesky decomposition
Definition of codomain of a linear transformation
Definition of coefficient matrix of a linear system
Definition of coefficients of a linear equation
Definition of cofactor/submatrix of a matrix
Definition of column rank of a matrix
Definition of column space of a matrix
Definition of column vector
Definition of complement of a subspace
Definition of composition of linear transformations
Definition of conjugate of a matrix
Definition of conjugate of a vector in C^n
Definition of consistent linear system
Definition of constant vector of a linear system
Definition of coordinates relative to a given basis
Definition of coordinate vector/mapping/representation relative to a given basis
Definition of cross product
Definition of determinant of a matrix as a cofactor expansion across the first row
Definition of determinant of a matrix as a product of the diagonal entries in a non-scaled echelon form.
Definition of diagonalizable linear transformation
Definition of diagonalizable matrix
Definition of diagonal matrix
Definition of dimension of a vector space (or subspace)
Definition of dimension of a vector space (or subspace) being finite or infinite
Definition of direct sum of subspaces
Definition of distance
Definition of distance between vectors
Definition of domain of a linear transformation
Definition of echelon form of a linear system
Definition of (echelon matrix/matrix in (row) echelon form)
Definition of eigenspace of a linear transformation
Definition of eigenspace of a matrix
Definition of eigenvalue/characteristic value of a linear transformation
Definition of eigenvalue of a matrix
Definition of eigenvector/characteristic vector of a linear transformation
Definition of eigenvector of a matrix
Definition of elementary matrix
Definition of entry/component of a vector
Definition of equality of matrices
Definition of equality of vectors
Definition of equation operations on a linear system
Definition of equivalent matrices
Definition of equivalent systems of linear equations
Definition of extended reduced row echelon form of a matrix
Definition of free/independent variable in a linear system
Definition of generalized inverse of a matrix
Definition of generalized kernel/null space of linear transformation
Definition of generalized range space of a linear transformation
Definition of geometric multiplicity of an eigenvalue
Definition of Gram-Schmidt process
Definition of Hermitian/self-adjoint matrix
Definition of Hessenberg form
Definition of homogeneous linear system of equations
Definition of how the action of a linear transformation on a basis extends to the whole space
Definition of identity linear transformation
Definition of identity matrix
Definition of ill-conditioned linear system
Definition of image (of a point) under a linear transformation
Definition of inconsistent linear system
Definition of independent subspaces
Definition of index of nilpotency
Definition of inner/dot product on C^n
Definition of inner/dot product on R^n
Definition of inner product
Definition of inner product space
Definition of intersection of subspaces
Definition of invariant subspace of a linear transformation.
Definition of inverse of a linear transformation
Definition of inverse of a matrix
Definition of invertible linear transformation
Definition of invertible matrix
Definition of invertible/nonsingular linear transformation
Definition of isomorphic/isomorphism between vector spaces
Definition of Jordan form
Definition of kernel/null space of linear transformation
Definition of kernel of linear transformation
Definition of leading entry in a row of a matrix
Definition of least-squares error of a linear system
Definition of least-squares solution to a linear system
Definition of left inverse of a matrix
Definition of length/norm of a vector
Definition of linear combination of vectors
Definition of linear dependence relation on a set of vectors
Definition of linear equation
Definition of linearly dependent set of vectors: one of the vectors can be written as a linear combination of the other vectors
Definition of linearly independent set of vectors: if a linear combination is 0, then all of the coefficients are 0
Definition of linear transformation/homomorphism
Definition of LU decomposition
Definition of Markov matrix
Definition of matrix
Definition of matrix diagonalization
Definition of matrix equation
Definition of matrix in reduced row echelon form
Definition of matrix multiplication
Definition of matrix multiplication in terms of column vectors
Definition of matrix null space (left)
Definition of matrix null space (right)
Definition of matrix representation of a linear system
Definition of matrix representation of a linear transformation
Definition of matrix representation of a linear transformation from a vector space to itself
Definition of matrix representation of a linear transformation with respect to bases of the spaces
Definition of matrix-scalar multiplication
Definition of matrix-vector product
Definition of m by n matrix
Definition of minimal polynomial of a linear transformation
Definition of minimal polynomial of a matrix
Definition of multilinear function
Definition of nilpotent linear transformation
Definition of nilpotent matrix
Definition of nonsingular matrix: matrix is invertible
Definition of nonsingular matrix: the associated homogeneous linear system has only the trivial solution
Definition of nontrivial solution to a homogeneous linear system of equations
Definition of normal matrix
Definition of norm/length of a vector
Definition of (not necessarily orthogonal) projection onto a component of a direct sum
Definition of nullity of a linear transformation
Definition of nullity of a matrix
Definition of one-to-one/injective linear transformation
Definition of onto/surjective linear transformation
Definition of orthogonal basis of a (sub)space
Definition of orthogonal complement of a subspace
Definition of orthogonally diagonalizable matrix
Definition of orthogonal matrix
Definition of (orthogonal) projection of one vector onto another vector
Definition of (orthogonal) projection onto a subspace
Definition of orthogonal set of vectors
Definition of orthogonal subspaces
Definition of orthogonal vectors
Definition of orthonormal basis of a (sub)space
Definition of orthonormal set of vectors
Definition of parallel vectors
Definition of permutation matrix
Definition of pivot
Definition of pivot column
Definition of pivot position
Definition of positive-definite matrix
Definition of pre-image (of a point) under a linear transformation
Definition of pre-image of linear transformation
Definition of QR decomposition
Definition of quadratic form
Definition of range of a linear transformation
Definition of range of linear transformation
Definition of rank factorization of a matrix
Definition of rank of a linear transformation
Definition of rank of a matrix
Definition of rational form
Definition of reduced LU decomposition
Definition of reduced row echelon form of a matrix
Definition of right inverse of a matrix
Definition of R^n (or C^n)
Definition of (row) echelon form of a matrix
Definition of row equivalent matrices
Definition of row operations on a matrix
Definition of row reduce a matrix
Definition of row space of a matrix
Definition of scalar
Definition of scalar multiple of a linear transformation
Definition of Schur triangulation
Definition of similarity transform
Definition of similar matrices
Definition of singular matrix (not nonsingular)
Definition of singular value decomposition (SVD)
Definition of size of a matrix
Definition of size of a vector
Definition of skew-symmetric matrix
Definition of solution set of a system of linear equations
Definition of solution to a linear equation
Definition of solution to a system of linear equations
Definition of solution vector of a linear system
Definition of spanning/generating set for a space or subspace
Definition of span of a set of vectors
Definition of square matrix
Definition of subspace
Definition of subspace spanned by a set of vectors
Definition of sum of linear transformations
Definition of sum of matrices
Definition of sum of subspaces
Definition of symmetric matrix
Definition of system of linear equations
Definition of the 0/trivial subspace
Definition of the determinant in terms of the effect of elementary row operations
Definition of the imaginary part of a vector in C^n
Definition of the least-squares linear fit to 2-dimensional data
Definition of the (main) diagonal of a matrix
Definition of the real part of a vector in C^n
Definition of the standard basis of the m by n matrices
Definition of the standard basis of the polynomials of degree at most n
Definition of the standard matrix for a linear transformation
Definition of the standard/natural basis of R^n (or C^n)
Definition of trace of a matrix
Definition of transpose of a matrix
Definition of trivial linear dependence relation on a set of vectors
Definition of trivial solution to a homogeneous linear system of equations
Definition of unitary matrix
Definition of unit matrix
Definition of unit vector
Definition of Vandermonde matrix
Definition of vector
Definition of vector addition
Definition of vector-scalar multiplication
Definition of vector sum/addition
Definition of weights in a linear combination of vectors
Description of a basis for the null space of a matrix from the reduced row-echelon form.
Description of a spanning set for the null space of a matrix from the reduced row-echelon form.
Description of the Gram-Schmidt process
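A sketch of the usual recurrence (conventional notation, which may differ from the page's own): given linearly independent v_1, ..., v_k, set
$$w_1 = v_1, \qquad w_j = v_j - \sum_{i=1}^{j-1} \frac{\langle v_j, w_i\rangle}{\langle w_i, w_i\rangle}\, w_i,$$
so that w_1, ..., w_k is an orthogonal set with the same span; normalizing each w_j gives an orthonormal set.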
Determinants
Determinants and operations on matrices
Determinants axiomatically
Determine if a particular set of vectors in R^3 is linearly independent
Determine if a particular set of vectors spans R^3
Determine if a particular vector is in the span of a set of vectors
Determine if a particular vector is in the span of a set of vectors in R^2
Determine if a particular vector is in the span of a set of vectors in R^3
Dimension
Distinct eigenvalues of a Hermitian matrix have orthogonal eigenvectors.
Each vector can be written uniquely as a linear combination of vectors from a given basis.
Echelon matrices
Eigenspaces
Eigenvalues and eigenvectors
Eigenvalues and operations on matrices
Eigenvectors of a symmetric matrix with different eigenvalues are orthogonal.
Eigenvectors with distinct eigenvalues are linearly independent.
Elementary matrices
Elementary matrices are invertible/nonsingular.
Equation operations on a linear system give an equivalent system.
Equivalence theorem for injective linear transformations: The columns of the matrix of T are linearly independent.
Equivalence theorem for injective linear transformations: The image of a basis for V is a basis for the range of T.
Equivalence theorem for injective linear transformations: The inverse of T is a linear transformation on its range.
Equivalence theorem for injective linear transformations: The kernel of T is 0.
Equivalence theorem for injective linear transformations: The nullity of T is 0.
Equivalence theorem for injective linear transformations: The null space of T is 0.
Equivalence theorem for injective linear transformations: The rank of T equals the number of columns in any matrix representation.
Equivalence theorem for injective linear transformations: The rank of T is n.
Equivalence theorem for injective linear transformations: T(x)=0 only for x=0.
Equivalence theorem for nonsingular matrices: the columns of A are a basis for R^n (or C^n).
Equivalence theorem for nonsingular matrices: the columns of A are linearly independent.
Equivalence theorem for nonsingular matrices: the columns of A span R^n (or C^n).
Equivalence theorem for nonsingular matrices: the determinant of A is nonzero.
Equivalence theorem for nonsingular matrices: the dimension of the column space of A is n.
Equivalence theorem for nonsingular matrices: the equation Ax=0 has only the trivial solution.
Equivalence theorem for nonsingular matrices: the equation Ax=b has a solution for all b.
Equivalence theorem for nonsingular matrices: the equation Ax=b has a unique solution for all b.
Equivalence theorem for nonsingular matrices: the linear transformation given by T(x)=Ax has an inverse.
Equivalence theorem for nonsingular matrices: the linear transformation given by T(x)=Ax is an isomorphism.
Equivalence theorem for nonsingular matrices: the linear transformation given by T(x)=Ax is one-to-one/injective.
Equivalence theorem for nonsingular matrices: the linear transformation given by T(x)=Ax is onto/surjective.
Equivalence theorem for nonsingular matrices: the matrix A does not have 0 as an eigenvalue.
Equivalence theorem for nonsingular matrices: the matrix A has a left inverse.
Equivalence theorem for nonsingular matrices: the matrix A has an inverse.
Equivalence theorem for nonsingular matrices: the matrix A has a right inverse.
Equivalence theorem for nonsingular matrices: the matrix A has rank n.
Equivalence theorem for nonsingular matrices: the matrix A is a change-of-basis matrix.
Equivalence theorem for nonsingular matrices: the matrix A represents the identity map with respect to some pair of bases.
Equivalence theorem for nonsingular matrices: the matrix A row-reduces to the identity matrix.
Equivalence theorem for nonsingular matrices: the nullity of the matrix A is 0.
Equivalence theorem for nonsingular matrices: the null space of the matrix A is {0}.
Equivalence theorem for nonsingular matrices: there is a pivot position in every row of A.
Equivalence theorem for nonsingular matrices: the rows of A are a basis for R^n (or C^n).
Equivalence theorem for nonsingular matrices: the rows of A are linearly independent.
Equivalence theorem for nonsingular matrices: the rows of A span R^n (or C^n).
Equivalence theorem for nonsingular matrices: the transpose of the matrix A has an inverse.
Equivalence theorems for injective transformations
Equivalent matrices represent the same linear transformation with respect to appropriate bases.
Every basis for a vector space contains the same number of elements
Every finite dimensional vector space over R (or C) is isomorphic to R^n (or C^n) for some n.
Every matrix has an eigenvalue over the complex numbers.
Every matrix is row-equivalent to a matrix in reduced row echelon form.
Every matrix is row-equivalent to only one matrix in reduced row echelon form.
Every nilpotent matrix is similar to one with 1 on subdiagonal blocks and all other entries 0.
Every square matrix is conjugate
Every square matrix is similar to the sum of a diagonal and a nilpotent matrix.
Every square matrix is similar to one in Jordan form.
Example of a linear transformation on R^2: generic
Example of a linear transformation on R^2: projection
Example of a linear transformation on R^2: rotation
Example of a linear transformation on R^2: shear
Example of a linear transformation on R^3: rotation
Example of a sum of vectors interpreted geometrically in R^2
Example of (echelon matrix/matrix in (row) echelon form)
Example of finding the inverse of a 2-by-2 matrix by row reducing the augmented matrix
Example of finding the inverse of a 2-by-2 matrix by using a formula
Example of finding the inverse of a 3-by-3 matrix by row reducing the augmented matrix
Example of finding the inverse of a 3-by-3 matrix by using Cramer's rule
Example of linear combination of vectors in R^2
Example of matrix-vector product
Example of multiplying 2x2 matrices
Example of multiplying 3x3 matrices
Example of multiplying matrices
Example of multiplying nonsquare matrices
Example of putting a matrix in echelon form
Example of putting a matrix in echelon form and identifying the pivot columns
Example of row reducing a 3-by-3 matrix
Example of row reducing a 4-by-4 matrix
Example of solving a 3-by-3 homogeneous matrix equation
Example of solving a 3-by-3 homogeneous system of linear equations by row-reducing the augmented matrix
Example of solving a 3-by-3 matrix equation
Example of solving a 3-by-3 system of linear equations by row-reducing the augmented matrix
Example of using the echelon form to determine if a linear system is consistent.
Example of vector-scalar multiplication in R^2
Example of writing a given vector in R^3 as a linear combination of given vectors
Examples
Examples of vector spaces
Factorization of matrices
F^n is a vector space.
For invertible linear transformations A and B, the inverse of the composition is the composition of the inverses in reverse order
For matrices
Formula for computing the least squares solution to a linear system
Formula for diagonalizing a real 2-by-2 matrix with a complex eigenvalue.
Formula for the coordinates of a vector with respect to an orthogonal/orthonormal basis.
Formula for the coordinates of the projection of a vector onto a subspace
Formula for the determinant of a 2-by-2 matrix.
Formula for the determinant of a 3-by-3 matrix.
Formula for the inverse of a 2-by-2 matrix.
Formula for the least-squares linear fit to 2-dimensional data
Formula for the (orthogonal) projection of one vector onto another vector
Formula for the spectral decomposition for a symmetric matrix
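For reference, standard forms of several of the formulas listed above, in conventional notation (these are sketches and may differ from the pages' own conventions):
$$\hat{x} = (A^{\mathsf T}A)^{-1}A^{\mathsf T}b \quad \text{(least squares solution, when the columns of } A \text{ are linearly independent)}$$
$$v = \sum_i \frac{\langle v, b_i\rangle}{\langle b_i, b_i\rangle}\, b_i \quad \text{(coordinates with respect to an orthogonal basis } \{b_i\}\text{; the partial sum over a basis of a subspace gives the projection onto it)}$$
$$\det\begin{pmatrix} a & b\\ c & d \end{pmatrix} = ad - bc, \qquad \begin{pmatrix} a & b\\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b\\ -c & a \end{pmatrix}$$
$$\operatorname{proj}_v u = \frac{\langle u, v\rangle}{\langle v, v\rangle}\, v \quad \text{(projection of } u \text{ onto } v\text{)}$$
$$A = \sum_i \lambda_i\, u_i u_i^{\mathsf T} \quad \text{(spectral decomposition of a symmetric } A \text{ with orthonormal eigenvectors } u_i\text{)}$$
For a real 2-by-2 matrix A with eigenvalue a - bi (b nonzero) and eigenvector v, one has A = PCP^{-1} with P = [Re v, Im v] and C = \begin{pmatrix} a & -b\\ b & a \end{pmatrix}; the least-squares line through data (x_i, y_i) solves the normal equations X^{\mathsf T}X\beta = X^{\mathsf T}y with design matrix X = [\mathbf{1}\;\, x].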
For n-by-n invertible matrices A and B, the inverse of AB is B inverse times A inverse
Gaussian elimination as a method to solve a linear system
Gauss-Jordan procedure to put a matrix into reduced row echelon form
Geometric description of span of a set of vectors in R^n (or C^n)
Geometric picture of a 2-by-2 linear system
Geometric picture of a 3-by-3 linear system
Geometric picture of the solution set of a linear equation in 3 unknowns
Geometric properties of linear transformations
Geometric properties of linear transformations on R^2
Geometric properties of R^n (or C^n)
Hermitian matrices
Hermitian matrices have real eigenvalues.
Homogeneous linear systems are consistent.
If A and B are n-by-n matrices
If A is a matrix
If a matrix has both a left and a right inverse, then they are equal and the matrix is invertible
If a set of vectors contains the 0 vector, then the set is linearly dependent
If a set of vectors in R^n (or C^n) contains more than n elements, then the set is linearly dependent
If a space is the direct sum of invariant subspaces, then the transformation has a block diagonal matrix representation
If a square matrix has a one-sided inverse, then that inverse is a two-sided inverse
If a vector space has dimension n, then any linearly independent set of n vectors is a basis, as is any spanning set of n vectors
If B is a basis containing b and the b coordinate of c is nonzero, then exchanging c for b yields another basis
If the product of a vector and a scalar is 0, then the scalar is 0 or the vector is the 0 vector
If two finite dimensional subspaces have the same dimension and one is contained in the other, then the subspaces are equal
If two matrices have equal products with all vectors, then the matrices are equal
Inner products
Inner products in coordinate spaces
Inverse
Isomorphic vector spaces have the same dimension.
Isomorphism
Least squares
Linear algebra
Linear combinations
Linear (in)dependence
Linear systems and echelon matrices
Linear systems and matrices
Linear systems have 0, 1, or infinitely many solutions
Linear systems of equations
Linear transformations
LU decomposition
Matrices
Matrices act as a transformation by multiplying vectors
Matrices as linear transformations
Matrix addition is commutative and associative.
Matrix adjoint is an involution.
Matrix conjugation is an involution.
Matrix describing a rotation of the plane
Matrix diagonalization
Matrix equations
Matrix equivalence
Matrix inverse is an involution.
Matrix inverses are unique: if A and B are square matrices
Matrix multiplication can be viewed as the dot product of a row vector of column vectors with a column vector of row vectors
Matrix multiplication is associative.
Matrix multiplication is distributive over matrix addition.
Matrix multiplication is not commutative in general.
Matrix representation of a composition of linear transformations is given by a matrix product
Matrix-scalar multiplication is commutative
Matrix-scalar product is commutative
Matrix transpose commutes with matrix inverse.
Matrix transpose is an involution.
Matrix-vector multiplication is a linear transformation.
Matrix-vector product is associative
Matrix-vector products
Multiplication
Multiplication by a change of coordinates matrix converts representations for different bases.
Multiplication by a Hermitian matrix commutes with the standard inner product on C^n.
Multiplication of block/partitioned matrices
Multiplicity
Multiplying a row by a scalar multiplies the determinant by that scalar.
Nilpotent matrices
Non-example of a linear transformation
Nonsingular matrices and equivalences
Normal matrices
Norm and length
Notation for entry of matrix
Notation for the set of m by n matrices
Operations on matrices
Orthogonality
Orthogonality and projection
Parametric form of the solution set of a system of linear equations
Parametric vector form of the solution set of a system of linear equations
Particular types of matrices
Projection
Proof of several equivalences for nonsingular matrix
QR decomposition
Rank and nullity
Removing a linearly dependent vector from a set does not change the span of the set.
R^n is a vector space.
Row equivalence is an equivalence relation
Row equivalent matrices have the same row space.
Row equivalent matrices represent equivalent linear systems
Row operations
Row operations are given by multiplication by elementary matrices.
Row operations do not necessarily preserve the column space.
Scalar multiplication
Similarity of matrices
Similarity of matrices is an equivalence relation.
Similar matrices have the same eigenvalues and the same characteristic polynomials.
Spans
Subspaces
Subspaces associated to a linear transformation
Subspaces associated to a matrix
Switching two rows multiplies the determinant by -1.
Symmetric matrices
Symmetric matrices are square.
Terminology
The 0 scalar multiplied by any vector equals the 0 vector.
The 0 vector is unique.
The 0 vector multiplied by any scalar equals the 0 vector.
The additive inverse of a vector equals the vector multiplied by -1.
The additive inverse of a vector is called the negative of the vector.
The additive inverse of a vector is unique.
The adjoint of a matrix-scalar product is the product of the adjoint and the conjugate.
The adjoint of a product of matrices is the product of the adjoints in reverse order.
The adjoint of a sum is the sum of the adjoints.
The Cauchy-Schwarz inequality
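For reference: $$|\langle u, v\rangle| \le \|u\|\,\|v\|,$$ with equality exactly when u and v are linearly dependent.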
The Cayley-Hamilton theorem for a linear transformation
The Cayley-Hamilton theorem for a matrix.
The change of coordinates matrix between two bases exists and is unique
The characteristic polynomial applied to the matrix gives the 0 matrix.
The column space of a matrix is a vector space
The column space of an m-by-n matrix is a subspace of R^m (or C^m)
The composition of injective linear transformations is injective
The composition of invertible linear transformations is invertible
The composition of linear transformations is a linear transformation
The composition of surjective linear transformations is surjective
The condition number of a matrix measures how close it is to being singular
The conjugate of a matrix-scalar product is the product of the conjugates.
The conjugate of a product of matrices is the product of the conjugates.
The conjugate of a sum of vectors in C^n is the sum of the conjugates
The conjugate of the sum of matrices is the sum of the conjugates.
The conjugate of the transpose is the transpose of the conjugate.
The conjugate of vector-scalar multiplication in C^n is the product of the conjugates.
The coordinate vector relative to a given basis is a linear mapping to R^n (or C^n).
The coordinate vector relative to a given basis is an injective linear mapping to R^n (or C^n).
The coordinate vector relative to a given basis is a surjective linear mapping to R^n (or C^n).
The crazy vector space is a vector space.
The determinant function exists.
The determinant function is unique.
The determinant of a block diagonal matrix is the product of the determinants of the blocks.
The determinant of a matrix can be computed as a cofactor expansion across any row.
The determinant of a matrix can be computed as a cofactor expansion down any column.
The determinant of a matrix can be expressed as a product of the diagonal entries in a non-scaled echelon form.
The determinant of a matrix measures the area/volume of the parallelogram/parallelepiped determined by its columns.
The determinant of a triangular matrix is the product of the entries on the diagonal.
The determinant of the inverse of A is the reciprocal of the determinant of A.
The determinant of the matrix of a linear transformation is the factor by which the area/volume changes.
The dimension of a direct sum of subspaces is the sum of the dimensions of the subspaces.
The dimension of an eigenspace is less than or equal to the (algebraic) multiplicity of the eigenvalue.
The dimension of a subspace is less than or equal to the dimension of the whole space
The dimension of the domain of an injective linear transformation is at most the dimension of the codomain.
The dimension of the domain of a surjective linear transformation is at least the dimension of the codomain.
The direct sum of a subspace and its orthogonal complement is the whole space.
The echelon form can be used to determine if a linear system is consistent.
The eigenspace of a linear transformation is a nontrivial subspace.
The eigenvalues of a matrix are the roots/solutions of its characteristic polynomial/equation.
The eigenvalues of a polynomial of a matrix are the polynomial of the eigenvalues.
The eigenvalues of a power of a matrix are the powers of the eigenvalues.
The eigenvalues of a scalar multiple of a matrix are the scalar multiples of the eigenvalues.
The eigenvalues of a triangular matrix are the entries on the main diagonal.
The eigenvalues of the inverse of a nonsingular matrix are the reciprocals of the eigenvalues.
A normal matrix has an orthonormal basis of eigenvectors.
The geometry of linear systems
The Gram-Schmidt process converts a linearly independent set into an orthogonal set.
The identity matrix is the identity for matrix multiplication.
The image of a linearly dependent set under a linear transformation is linearly dependent.
The image of a linearly independent set under an injective linear transformation is linearly independent.
The inner product of a vector with itself is the square of its norm/length.
The intersection of subspaces is a subspace
The inverse image of a subspace under a linear transformation is a subspace.
The inverse of a linear transformation is a linear transformation
The inverse of a matrix can be expressed in terms of its matrix of cofactors.
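The formula in question is the standard adjugate identity (conventional notation):
$$A^{-1} = \frac{1}{\det A}\,\operatorname{adj}(A), \qquad (\operatorname{adj} A)_{ij} = C_{ji} = (-1)^{i+j}\det A_{ji},$$
where A_{ji} is the submatrix obtained by deleting row j and column i.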
The inverse of a matrix can be used to solve a linear system.
The inverse of a matrix (if it exists) can be found by row reducing the matrix augmented by the identity matrix.
The inverse of an invertible upper/lower triangular matrix is upper/lower triangular.
The inverse of an isomorphism is an isomorphism.
The inverse of a scalar multiple is the reciprocal times the inverse.
The inverse of the inverse of a linear transformation is the original linear transformation
The kernel/null space of a linear transformation is a subspace
The kernels of powers of a linear transformation form an ascending chain
The least squares solution to a linear system is unique if and only if the columns of the coefficient matrix are linearly independent.
The left null space of a matrix is a subspace of R^m (or C^m).
The matrix equation Ax=b has a solution if and only if b is a linear combination of the columns of A.
The matrix representation of a composition of linear transformations is the product of the matrices.
The matrix representation of a scalar multiple of a linear transformation is the scalar multiple of the matrix.
The matrix representation of a sum of linear transformations is the sum of the matrices.
The matrix representation of the inverse of a linear transformation is the inverse of the matrix.
The minimal polynomial of a linear transformation exists and is unique.
The minimal polynomial of a square matrix exists and is unique.
The nonzero rows of an echelon form of a matrix are linearly independent.
The nonzero rows of the reduced row-echelon form of a matrix are a basis for the row space.
The null space of a matrix is a subspace of R^n (or C^n).
The null space of a matrix is the orthogonal complement of the row space.
The number of pivots in the reduced row echelon form of a consistent system determines the number of free variables in the solution set.
The number of pivots in the reduced row echelon form of a consistent system determines whether there is one or infinitely many solutions.
The number of solutions to a linear system
Theorem: a set of vectors is linearly dependent if and only if one of the vectors can be written as a linear combination of the other vectors
Theorem: a set of vectors is linearly independent if and only if whenever a linear combination is 0, every coefficient is 0
Theorem characterizing when a space is the direct sum of two subspaces
Theorem describing matrix multiplication
Theorem describing properties of the block matrices of the extended reduced row echelon form of a matrix
Theorem describing spaces associated to the block matrices of the extended reduced row echelon form of a matrix
Theorem describing the determinants of elementary matrices.
Theorem describing the dimension of spaces associated to the block matrices of the extended reduced row echelon form of a matrix
Theorem describing the vector form of solutions to a linear system.
The orthogonal complement of a subspace is a subspace.
The (orthogonal) projection of a vector onto a subspace is the point in the subspace closest to the vector.
The permutation expansion for determinants
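For reference, the expansion reads
$$\det A = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma)\, a_{1\,\sigma(1)} a_{2\,\sigma(2)} \cdots a_{n\,\sigma(n)},$$
a sum over all permutations of {1, ..., n}.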
The pivot columns of a matrix are a basis for the column space.
The preimage of a vector is a translation of the kernel of the linear transformation
The product of square matrices is nonsingular if and only if each factor is nonsingular.
The product of upper/lower triangular matrices is upper/lower triangular.
The projection of a vector which is in a subspace is the vector itself.
The QR decomposition of a nonsingular matrix exists.
The range/image of a linear transformation is a subspace.
The range of a linear transformation is a subspace
The range spaces of powers of a linear transformation form a descending chain
The rank of a matrix equals the number of pivots in its reduced row echelon form.
The rank of a matrix equals the rank of the linear transformation it represents.
The rank plus the nullity of a linear transformation equals the dimension of the domain.
The reduced row-echelon form of a matrix determines which subset of a spanning set is a basis.
The row space and the column space of a matrix have the same dimension.
The row space of a matrix is a vector space
The set containing only 0 is a vector space.
The set of all functions on a set is a vector space.
The set of all polynomials is a vector space.
The set of all polynomials of degree at most n is a vector space.
The set of all sequences is a vector space.
The set of linear transformations between two vector spaces is a vector space.
The set of m by n matrices is a vector space.
The solutions of a homogeneous system are the pre-image (of 0) of a linear transformation.
The set of solutions to a homogeneous linear differential equation is a vector space.
The set of solutions to a homogeneous system of linear equations is a vector space.
The solutions to a nonhomogeneous system are given by a particular solution plus the solutions to the homogeneous system.
The span of a set of vectors is a subspace
The spectral theorem for symmetric matrices
The standard inner product of a vector with itself is 0 only for the 0 vector
The standard inner product of a vector with itself is non-negative
The standard inner product on C^n can be written as the product of a vector and the adjoint of a vector.
The standard inner product on C^n commutes/anticommutes with scalar multiplication.
The standard inner product on C^n is conjugate-symmetric: swapping the arguments conjugates the value.
The standard inner product on R^n can be written as the product of a vector and the transpose of a vector.
The standard inner product on R^n commutes with (real) scalar multiplication.
The standard inner product on R^n is commutative.
The standard inner product on R^n (or C^n) distributes over addition.
The standard/natural basis of R^n (or C^n) is a basis.
The sum of linear transformations is a linear transformation
The sum of subspaces is a subspace
The image of a spanning set is a spanning set for the range space
The transpose of a product of matrices is the product of the transposes in reverse order.
The transpose of a sum of matrices is the sum of the transposes.
The triangle inequality
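For reference: $$\|u + v\| \le \|u\| + \|v\|$$ for all vectors u and v in an inner product space.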
The union of bases from independent subspaces is a basis for the space.
Transpose and adjoint
Transpose commutes with scalar multiplication.
Triangular matrices
Two matrices of the same size are equivalent if and only if they have the same rank.
Two vectors are orthogonal if and only if the Pythagorean Theorem holds.
Unitary matrices
Unitary matrices are invertible.
Unitary matrices have orthogonal (orthonormal) rows/columns.
Unitary matrices preserve inner products.
Unitary matrices preserve orthogonal (orthonormal) bases.
Using matrices to solve linear systems
Vector space isomorphism is an equivalence relation.
Vector spaces
Vector spaces with the same dimension are isomorphic.
Vector sum/addition interpreted geometrically in R^n (or C^n)
Vector sum/addition is commutative and associative
Visualise a linear transformation on R^2 by looking at the image of a region
Physics
Lecture Notes
Finance
Misc
Knowen blog
Media Coverage
Group Theory Through Problems
Workshops
Personal Knowledgebases
Products of Knowen
Test
Testing
Recycling Bin
Lecture notes test
Math
Security and Strategic Studies
CTSI Resources
How Long Do The Baked Potatoes Take on the Grill
User Suggestions and Comments
Paid Search
my test article
TECHNOLOGY and MARKETS
Dapp and Cubego partnership and It’s Giveaway Time!
Dapp.com Ranking
Dapp.com AMA Recap
Dapp.com where the blockchain comes alive
Dapp.com and the Yummies
Dapp.com and the Cubegoes
The 1st Largest dedicated Platform
Imperial Thrones Dapp.com
Imperial Thrones & Yummie
CryptoWars in Dapp.com
All about DApps and Dapp.com
Chainbreakers at Dapp.com
Giveaways at Dapp.com
Holiday Gifts from Dapp.com
Philosophy
test jhjhk