Linear Algebra
Vector Spaces¶
Subspace¶
- Definition: a subset $H$ of a vector space $V$ is a subspace of $V$ if $\mathbf{0} \in H$ and $H$ is closed under vector addition and scalar multiplication.
That is to say, for a vector space $V$ and a subspace $H \subseteq V$:
for all $\mathbf{u}, \mathbf{v} \in H$ and scalars $c$, we have $\mathbf{u} + \mathbf{v} \in H$ and $c\mathbf{u} \in H$.
Thus it's intuitive that a subspace is itself a vector space.
Null Space¶
- Definition
$$\operatorname{Nul} A = \{\mathbf{x} \in \mathbb{R}^n : A\mathbf{x} = \mathbf{0}\}$$
i.e. the set of all solutions of the homogeneous equation $A\mathbf{x} = \mathbf{0}$.
Example:
$$A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}, \qquad A\mathbf{x} = \mathbf{0} \iff x_1 + 2x_2 = 0.$$
Thus $\operatorname{Nul} A = \operatorname{span}\left\{\begin{bmatrix} -2 \\ 1 \end{bmatrix}\right\}$.
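As a quick numerical check (a minimal sketch; the matrix is the illustrative example above), `scipy.linalg.null_space` returns an orthonormal basis of $\operatorname{Nul} A$:

```python
import numpy as np
from scipy.linalg import null_space

# The example matrix from above: rows are dependent, so Nul A is nontrivial.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

N = null_space(A)                 # columns form an orthonormal basis of Nul A
print(N.shape)                    # (2, 1): Nul A is one-dimensional
print(np.allclose(A @ N, 0))      # True: each basis vector solves Ax = 0
```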
Properties¶
- $\operatorname{Nul} A$ is namely the solution space of $A\mathbf{x} = \mathbf{0}$.
- For an $m \times n$ matrix $A$, $\operatorname{Nul} A$ is a subspace of $\mathbb{R}^n$.
Since $A(\mathbf{u} + \mathbf{v}) = A\mathbf{u} + A\mathbf{v} = \mathbf{0}$ and $A(c\mathbf{u}) = cA\mathbf{u} = \mathbf{0}$ for any $\mathbf{u}, \mathbf{v} \in \operatorname{Nul} A$, the null space is closed under addition and scalar multiplication,
so this property is intuitive.
Column Space¶
- Definition: For an $m \times n$ matrix $A = [\mathbf{a}_1 \ \cdots \ \mathbf{a}_n]$,
$$\operatorname{Col} A = \operatorname{span}\{\mathbf{a}_1, \dots, \mathbf{a}_n\}$$
or equivalently, $\operatorname{Col} A = \{\mathbf{b} : \mathbf{b} = A\mathbf{x} \text{ for some } \mathbf{x} \in \mathbb{R}^n\}$.
Properties¶
- $\operatorname{Col} A$ is a subspace of $\mathbb{R}^m$.
- $\operatorname{Col} A$ is the range of the linear transformation $\mathbf{x} \mapsto A\mathbf{x}$.
- $\dim \operatorname{Col} A + \dim \operatorname{Nul} A = n$ (the rank theorem, verified numerically below).
This is quite intuitive: when we lose a dimension in $\operatorname{Col} A$, we gain a dimension in the solution space.
e.g. $n$ independent equations in $\mathbb{R}^n$ can yield a unique solution. However, $n - 1$ equations can only pin the solution down to a line (i.e. one free dimension) at best.
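A quick sketch of the rank theorem with numpy/scipy (the matrix is an arbitrary illustrative one): the column-space dimension and the null-space dimension always add up to the number of columns.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])        # m = 2, n = 3

rank = np.linalg.matrix_rank(A)        # dim Col A
nullity = null_space(A).shape[1]       # dim Nul A
print(rank, nullity, rank + nullity)   # 2 1 3 -> rank + nullity = n
```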
Characteristic Equation¶
Eigenvector & Eigenvalue¶
- Definition
An eigenvector of an $n \times n$ matrix $A$ is a nonzero vector $\mathbf{x}$ such that $A\mathbf{x} = \lambda\mathbf{x}$ for some scalar $\lambda$; such a $\lambda$ is an eigenvalue of $A$. The eigenvalues are exactly the roots of the characteristic equation $\det(A - \lambda I) = 0$.
note:
- $\mathbf{x} = \mathbf{0}$ cannot be an eigenvector (since $A\mathbf{0} = \mathbf{0}$ is trivial, and eigenvectors are nonzero by definition).
- $0$ can be an eigenvalue (exactly when $A$ is not invertible).
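A small numpy sketch (the symmetric matrix is an arbitrary illustration) verifying both the definition $A\mathbf{x} = \lambda\mathbf{x}$ and that each eigenvalue satisfies the characteristic equation:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, V = np.linalg.eig(A)          # eigenvalues, eigenvectors (as columns)
for i in range(len(lam)):
    # Definition check: A x = lambda x for each eigenpair.
    print(np.allclose(A @ V[:, i], lam[i] * V[:, i]))   # True

# Characteristic equation check: det(A - lambda I) = 0 at each eigenvalue.
print(np.isclose(np.linalg.det(A - lam[0] * np.eye(2)), 0.0))  # True
```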
Orthogonality¶
Orthogonal Complements¶
- A vector $\mathbf{x}$ is in $W^\perp$ iff $\mathbf{x}$ is orthogonal to every vector in $W$, i.e. $W^\perp = \{\mathbf{x} : \mathbf{x} \cdot \mathbf{w} = 0 \text{ for all } \mathbf{w} \in W\}$.
- $(\operatorname{Row} A)^\perp = \operatorname{Nul} A$ and $(\operatorname{Col} A)^\perp = \operatorname{Nul} A^\mathsf{T}$ (used later for least squares).
Orthogonal Projection¶
- Given a vector $\mathbf{y}$ and a subspace $W$ in $\mathbb{R}^n$:
Let's say $\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$ is an orthogonal basis of $W$,
then the orthogonal projection of $\mathbf{y}$ onto $W$ is
$$\hat{\mathbf{y}} = \operatorname{proj}_W \mathbf{y} = \sum_{i=1}^{p} \frac{\mathbf{y} \cdot \mathbf{u}_i}{\mathbf{u}_i \cdot \mathbf{u}_i}\, \mathbf{u}_i$$
Assume the basis is orthonormal and let $U = [\mathbf{u}_1 \ \cdots \ \mathbf{u}_p]$; then $\hat{\mathbf{y}} = UU^\mathsf{T}\mathbf{y}$.
Thus, we can conclude that for an orthonormal basis of $W$, the matrix $UU^\mathsf{T}$ projects any vector onto $W$, and the residual $\mathbf{y} - \hat{\mathbf{y}}$ is orthogonal to $W$.
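A minimal numpy sketch of the projection formula (the subspace $W$ and the vector $\mathbf{y}$ below are arbitrary choices for illustration): the term-by-term sum and the matrix form $UU^\mathsf{T}\mathbf{y}$ agree, and the residual lands in $W^\perp$.

```python
import numpy as np

# Orthonormal basis of a 2-D subspace W in R^3 (chosen for illustration).
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])
U = np.column_stack([u1, u2])

y = np.array([3.0, 1.0, 2.0])

y_hat_sum = (y @ u1) * u1 + (y @ u2) * u2   # term-by-term projection formula
y_hat_mat = U @ U.T @ y                     # matrix form for orthonormal basis
print(np.allclose(y_hat_sum, y_hat_mat))      # True
print(np.allclose(U.T @ (y - y_hat_mat), 0))  # residual is orthogonal to W
```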
Diagonalization¶
Diagonal Matrix¶
A diagonal matrix $D = \operatorname{diag}(d_1, \dots, d_n)$ has nonzero entries only on its main diagonal; thus $D^k = \operatorname{diag}(d_1^k, \dots, d_n^k)$, i.e. powers of $D$ are computed entrywise on the diagonal.
Diagonalization¶
A square matrix $A$ is diagonalizable if $A = PDP^{-1}$,
- in which
$D$ is a diagonal matrix and $P$ is an invertible matrix.
The Diagonalization Theorem¶
An $n \times n$ matrix $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors.
Easy Proof
Let's say an invertible matrix $P = [\mathbf{v}_1 \ \cdots \ \mathbf{v}_n]$ and a diagonal matrix $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$. Then $A = PDP^{-1}$ is equivalent to $AP = PD$, i.e. $A\mathbf{v}_i = \lambda_i \mathbf{v}_i$ for each column;
by the definition of the characteristic equation, the columns of $P$ are exactly $n$ linearly independent eigenvectors of $A$, and the diagonal entries of $D$ are the corresponding eigenvalues.
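A numpy sketch of the theorem (the matrix is an illustrative one with distinct eigenvalues, hence diagonalizable): `eig` hands us $P$ and $D$, and powers of $A$ reduce to entrywise powers of the diagonal.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # eigenvalues 5 and 2 (distinct)

lam, P = np.linalg.eig(A)             # columns of P are eigenvectors
D = np.diag(lam)

print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # A = P D P^{-1}
print(np.allclose(np.linalg.matrix_power(A, 5),
                  P @ np.diag(lam**5) @ np.linalg.inv(P)))  # A^5 via D^5
```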
Orthogonally Diagonalizable¶
- Definition
A matrix $A$ is said to be orthogonally diagonalizable if $A = PDP^\mathsf{T} = PDP^{-1}$ for an orthogonal matrix $P$ (i.e. $P^\mathsf{T} = P^{-1}$) and a diagonal matrix $D$.
- And $A$ is orthogonally diagonalizable iff $A$
is a symmetric matrix.
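A quick numerical illustration (arbitrary symmetric matrix): `numpy.linalg.eigh` is the eigensolver for symmetric matrices, and it returns an orthogonal $P$, so $A = PDP^\mathsf{T}$ holds.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])       # symmetric

lam, P = np.linalg.eigh(A)            # eigensolver for symmetric matrices
print(np.allclose(P.T @ P, np.eye(3)))         # P is orthogonal
print(np.allclose(A, P @ np.diag(lam) @ P.T))  # A = P D P^T
```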
SVD¶
- Singular Value Decomposition
Singular Values¶
Let $A$ be an $m \times n$ matrix. Then $A^\mathsf{T}A$ is symmetric, since $(A^\mathsf{T}A)^\mathsf{T} = A^\mathsf{T}A$.
Thus, $A^\mathsf{T}A$ is orthogonally diagonalizable.
Then, let's say $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ is an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A^\mathsf{T}A$, with corresponding eigenvalues $\lambda_1, \dots, \lambda_n$.
And also, say the eigenvalues are arranged so that $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 0$
(make them in decreasing order; they are nonnegative because $\lambda_i = \mathbf{v}_i^\mathsf{T}A^\mathsf{T}A\mathbf{v}_i = \lVert A\mathbf{v}_i \rVert^2 \ge 0$).
The singular values of $A$ are $\sigma_i = \sqrt{\lambda_i} = \lVert A\mathbf{v}_i \rVert$.
Thinking
Singular values are the "eigenvalues" of a non-square matrix: a non-square $A$ has no eigenvalues of its own, so we use the square roots of the eigenvalues of $A^\mathsf{T}A$ instead.
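To see this numerically (a sketch; the matrix is an arbitrary textbook-style example), compute the eigenvalues of $A^\mathsf{T}A$ and compare their square roots with the singular values reported by `numpy.linalg.svd`:

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])       # 2x3: A itself has no eigenvalues

lam = np.linalg.eigvalsh(A.T @ A)               # eigenvalues of A^T A, ascending
sigma = np.sqrt(np.clip(lam[::-1], 0.0, None))  # clip guards round-off below 0
print(sigma)                                    # [18.97...  9.48...  0.]
print(np.linalg.svd(A, compute_uv=False))       # [18.97...  9.48...]
```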
Properties¶
- $\{A\mathbf{v}_1, \dots, A\mathbf{v}_n\}$ is an orthogonal set,
since $(A\mathbf{v}_i) \cdot (A\mathbf{v}_j) = \mathbf{v}_i^\mathsf{T}A^\mathsf{T}A\mathbf{v}_j = \lambda_j (\mathbf{v}_i \cdot \mathbf{v}_j) = 0$ for $i \neq j$.
- For any $A$ with $r$ nonzero singular values, $\{A\mathbf{v}_1, \dots, A\mathbf{v}_r\}$ is an orthogonal basis of $\operatorname{Col} A$, and $\operatorname{rank} A = r$.
Say $\mathbf{y} = A\mathbf{x}$ in $\operatorname{Col} A$ and write $\mathbf{x} = c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n$; then $\mathbf{y} = c_1 A\mathbf{v}_1 + \cdots + c_r A\mathbf{v}_r$, because $\lVert A\mathbf{v}_i \rVert = \sigma_i = 0$ for $i > r$.
Singular Value Decomposition¶
"diagonal" matrix
e.g.
- singular value decomposition
- Left singular vectors of
: The columns of - Right singular vectors of
: The columns of
Easy Proof
First, following property 2. of singular values, we can obtain an orthogonal basis $\{A\mathbf{v}_1, \dots, A\mathbf{v}_r\}$ of $\operatorname{Col} A$; normalizing $\mathbf{u}_i = A\mathbf{v}_i / \sigma_i$ and extending to an orthonormal basis $\{\mathbf{u}_1, \dots, \mathbf{u}_m\}$ of $\mathbb{R}^m$ gives $U = [\mathbf{u}_1 \ \cdots \ \mathbf{u}_m]$.
Then $AV = [A\mathbf{v}_1 \ \cdots \ A\mathbf{v}_r \ \mathbf{0} \ \cdots \ \mathbf{0}] = [\sigma_1\mathbf{u}_1 \ \cdots \ \sigma_r\mathbf{u}_r \ \mathbf{0} \ \cdots \ \mathbf{0}] = U\Sigma$, and since $V$ is orthogonal, $A = U\Sigma V^\mathsf{T}$.
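A numpy check of the decomposition (same illustrative matrix as above): `svd` returns $U$, the singular values, and $V^\mathsf{T}$; rebuilding $\Sigma$ in the right shape recovers $A$, and $A\mathbf{v}_i = \sigma_i\mathbf{u}_i$ as in the proof.

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])

U, s, Vt = np.linalg.svd(A)              # A = U Sigma V^T
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)     # embed the diagonal block D

print(np.allclose(A, U @ Sigma @ Vt))    # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(2)))   # U orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3))) # V orthogonal
print(np.allclose(A @ Vt[0], s[0] * U[:, 0]))  # A v_1 = sigma_1 u_1
```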
Least-Squares Solution¶
- Target
for an $m \times n$ matrix $A$, solve $A\mathbf{x} = \mathbf{b}$;
say $m > n$ (more equations than unknowns).
Most of the time we cannot find a perfect solution, since $\mathbf{b} \notin \operatorname{Col} A$. Thus, we try to find the least-squares solution:
Find $\hat{\mathbf{x}}$ such that $\lVert \mathbf{b} - A\hat{\mathbf{x}} \rVert$ is minimized.
First, let's define $\hat{\mathbf{b}} = \operatorname{proj}_{\operatorname{Col} A} \mathbf{b}$.
We know that the least-squares solution happens at $A\hat{\mathbf{x}} = \hat{\mathbf{b}}$, since $\hat{\mathbf{b}}$ is the closest point of $\operatorname{Col} A$ to $\mathbf{b}$.
Thus, say the projection matrix is $P$, so that $\hat{\mathbf{b}} = P\mathbf{b}$; we will obtain $P$ explicitly below.
Back to the condition $A\hat{\mathbf{x}} = \hat{\mathbf{b}}$: the residual $\mathbf{b} - A\hat{\mathbf{x}} = \mathbf{b} - \hat{\mathbf{b}}$ is orthogonal to $\operatorname{Col} A$;
from the definition of null space and the equation $(\operatorname{Col} A)^\perp = \operatorname{Nul} A^\mathsf{T}$, we get $A^\mathsf{T}(\mathbf{b} - A\hat{\mathbf{x}}) = \mathbf{0}$, i.e. the normal equations $A^\mathsf{T}A\hat{\mathbf{x}} = A^\mathsf{T}\mathbf{b}$.
Besides, we now further have $\hat{\mathbf{x}} = (A^\mathsf{T}A)^{-1}A^\mathsf{T}\mathbf{b}$, when $A^\mathsf{T}A$ is invertible (i.e. the columns of $A$ are linearly independent).
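A short numpy sketch of the derivation (the overdetermined system below is made up for illustration): solving the normal equations matches `numpy.linalg.lstsq`, and the residual is orthogonal to $\operatorname{Col} A$.

```python
import numpy as np

# Overdetermined: 3 equations, 2 unknowns, b not in Col A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_lstsq))             # True: same solution
print(np.allclose(A.T @ (b - A @ x_hat), 0))   # residual is in (Col A)^perp
```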
Pseudoinverse Aspect¶
Compare the equation $\hat{\mathbf{x}} = (A^\mathsf{T}A)^{-1}A^\mathsf{T}\mathbf{b}$ with $\mathbf{x} = A^{-1}\mathbf{b}$: the matrix $A^{+} = (A^\mathsf{T}A)^{-1}A^\mathsf{T}$ acts as a (left) pseudoinverse of $A$.
By the theorem of Orthogonal Projection, we know that $\hat{\mathbf{b}} = A\hat{\mathbf{x}} = A(A^\mathsf{T}A)^{-1}A^\mathsf{T}\mathbf{b}$, so the projection matrix onto $\operatorname{Col} A$ is $P = A(A^\mathsf{T}A)^{-1}A^\mathsf{T} = AA^{+}$.
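Numerically (same illustrative system as above), $(A^\mathsf{T}A)^{-1}A^\mathsf{T}$ agrees with `numpy.linalg.pinv` when $A$ has full column rank, and $AA^{+}$ projects onto $\operatorname{Col} A$:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

A_pinv = np.linalg.inv(A.T @ A) @ A.T            # full-column-rank formula
print(np.allclose(A_pinv, np.linalg.pinv(A)))    # True: matches pinv
b_hat = A @ A_pinv @ b                           # projection of b onto Col A
print(np.allclose(A.T @ (b - b_hat), 0))         # True: residual in (Col A)^perp
```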