
Linear Algebra

Vector Spaces

Subspace

  • Definition: skipped here due to its complexity; the example below gives the intuition

Let's say a vector space $V = \mathrm{Span}\{v_1, v_2, v_3\}$;

then $H = \mathrm{Span}\{v_1, v_2\}$ and $H = \mathrm{Span}\{v_2, v_3\}$ are subspaces of $V$.

Thus it's intuitive that a subspace $H$ has the property $\{0\} \subseteq H \subseteq V$.


Null Space

  • Definition
    for the homogeneous equation $A\mathbf{x} = \mathbf{0}$,

$$\mathrm{Nul}\,A = \{\mathbf{x} \mid \mathbf{x} \in \mathbb{R}^n,\ A\mathbf{x} = \mathbf{0}\}$$
example

$$A = \begin{bmatrix} 1 & -3 & -2 \\ -5 & 9 & 1 \end{bmatrix}, \quad u = \begin{bmatrix} 5 \\ 3 \\ -2 \end{bmatrix}, \quad Au = \begin{bmatrix} 1 & -3 & -2 \\ -5 & 9 & 1 \end{bmatrix}\begin{bmatrix} 5 \\ 3 \\ -2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$

Thus u is in Nul A.
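As a quick numerical check, here is a minimal NumPy sketch of the example above (the code is illustrative only):

```python
import numpy as np

# the matrix and vector from the example above
A = np.array([[1, -3, -2],
              [-5, 9, 1]])
u = np.array([5, 3, -2])

# u is in Nul A exactly when A @ u is the zero vector
print(A @ u)  # [0 0]
```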

Properties

  1. $\mathrm{Nul}\,A$ is exactly the solution set of $A\mathbf{x} = \mathbf{0}$

  2. for an $m \times n$ matrix $A$, $\mathrm{Nul}\,A$ is a subspace of $\mathbb{R}^n$
    since the equation $A\mathbf{x} = \mathbf{0}$ has $n$ unknowns, every solution can be written as

$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} \in \mathbb{R}^n$$

and since $A(\mathbf{u} + \mathbf{v}) = \mathbf{0}$ and $A(c\mathbf{u}) = \mathbf{0}$ whenever $A\mathbf{u} = A\mathbf{v} = \mathbf{0}$, the solution set is closed under addition and scaling, so this property is intuitive.


Column Space

  • Definition: For an $m \times n$ matrix $A = [a_1\ a_2\ \cdots\ a_n]$,

$$\mathrm{Col}\,A = \mathrm{Span}\{a_1, a_2, \ldots, a_n\}, \quad a_i \in \mathbb{R}^m$$

or equivalently,

$$\mathrm{Col}\,A = \{A\mathbf{x} \mid \mathbf{x} \in \mathbb{R}^n\}$$

Properties

  • $\mathrm{Col}\,A$ is a subspace of $\mathbb{R}^m$
  • $\mathrm{Col}\,A$ is the range of the linear transformation $\mathbf{x} \mapsto A\mathbf{x}$
  • $\mathrm{rank}\,A = \dim(\mathrm{Col}\,A)$
  • $\mathrm{rank}\,A + \dim(\mathrm{Nul}\,A) = n$ (the Rank Theorem)
    this is quite intuitive: every dimension "lost" by $\mathrm{Col}\,A$ reappears as a free dimension in the solution space (see the sketch below).
    e.g. 3 independent equations in $\mathbb{R}^3$ can yield a unique solution, while 2 equations leave at least a line of solutions (one free dimension).
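A minimal NumPy sketch of the Rank Theorem, reusing the example matrix from the Null Space section (any matrix would do):

```python
import numpy as np

A = np.array([[1., -3., -2.],
              [-5., 9., 1.]])  # m = 2, n = 3
m, n = A.shape

rank = np.linalg.matrix_rank(A)          # dim(Col A) = 2

# basis of Nul A: rows of V^T beyond the nonzero singular values
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[len(s[s > 1e-12]):]      # here: 1 vector
print(np.allclose(A @ null_basis.T, 0))  # True: it solves Ax = 0
print(rank + null_basis.shape[0] == n)   # True: the Rank Theorem
```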

Characteristic Equation

Eigenvector & Eigenvalue

  • Definition
$$A\mathbf{x} = \lambda\mathbf{x} \iff (A - \lambda I)\mathbf{x} = \mathbf{0}$$

for an $n \times n$ matrix $A$, with eigenvector $\mathbf{x} \neq \mathbf{0}$ and eigenvalue $\lambda$.

note:

  1. $\mathbf{0}$ cannot be an eigenvector (it would satisfy the equation trivially, so it is excluded by definition)
  2. $0$ can be an eigenvalue (see the sketch below).
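A minimal NumPy sketch (arbitrary singular matrix) showing that $0$ can be an eigenvalue while every eigenvector is nonzero:

```python
import numpy as np

A = np.array([[2., 1.],
              [2., 1.]])  # singular, so 0 is an eigenvalue

eigvals, eigvecs = np.linalg.eig(A)  # eigenvectors are the columns
print(eigvals)                       # one eigenvalue is 3, the other 0 (order may vary)
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))  # True: A v = lambda v, with v != 0
```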

Orthogonality

Orthogonal Complements

  1. A vector $\mathbf{x}$ is in $W^\perp$ iff $\mathbf{x}$ is orthogonal to every vector in $W$
  2. $(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A$
  3. $(\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T$ (checked numerically below)
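A minimal NumPy sketch of property 3 (arbitrary matrix for illustration): a vector of $\mathrm{Nul}\,A^T$ is orthogonal to every column of $A$.

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])  # 3x2, rank 2

# the left-singular vectors beyond rank A span (Col A)^perp = Nul A^T
U, s, Vt = np.linalg.svd(A)
w = U[:, -1]                     # here: the single basis vector of Nul A^T

print(np.allclose(A.T @ w, 0))   # True: w is in Nul A^T
print(A[:, 0] @ w, A[:, 1] @ w)  # both ~0: w is orthogonal to Col A
```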

Orthogonal Projection

  • Given a vector $\mathbf{y}$ and a subspace $W$ in $\mathbb{R}^n$, expand $\mathbf{y}$ in a basis $\{u_1, \ldots, u_n\}$ of $\mathbb{R}^n$:

$$\mathbf{y} = c_1 u_1 + c_2 u_2 + \cdots + c_n u_n = [u_1\ u_2\ \cdots\ u_n]\begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}$$

Let's say $W = \mathrm{Span}\{u_1, u_2, \ldots, u_p\}$ ($p \le n$)

then the orthogonal projection of $\mathbf{y}$ onto $W$ is $\hat{\mathbf{y}}$ (aka $\mathrm{proj}_W \mathbf{y}$):

$$\hat{\mathbf{y}} = \frac{\mathbf{y} \cdot u_1}{\|u_1\|^2} u_1 + \frac{\mathbf{y} \cdot u_2}{\|u_2\|^2} u_2 + \cdots + \frac{\mathbf{y} \cdot u_p}{\|u_p\|^2} u_p$$

(this formula requires $\{u_1, \ldots, u_p\}$ to be an orthogonal basis of $W$)

assume $\{u_1, u_2, \ldots, u_p\}$ is an orthonormal basis, so each $\|u_i\| = 1$; we can thus further write

$$\hat{\mathbf{y}} = \frac{\mathbf{y} \cdot u_1}{\|u_1\|^2} u_1 + \cdots + \frac{\mathbf{y} \cdot u_p}{\|u_p\|^2} u_p = (\mathbf{y} \cdot u_1) u_1 + (\mathbf{y} \cdot u_2) u_2 + \cdots + (\mathbf{y} \cdot u_p) u_p = [u_1\ u_2\ \cdots\ u_p]\begin{bmatrix} \mathbf{y} \cdot u_1 \\ \mathbf{y} \cdot u_2 \\ \vdots \\ \mathbf{y} \cdot u_p \end{bmatrix} = [u_1\ u_2\ \cdots\ u_p]\begin{bmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_p^T \end{bmatrix}\mathbf{y} = UU^T\mathbf{y}$$

Thus, we can conclude that for an $n \times p$ matrix $U$ whose columns form an orthonormal basis of a $p$-dimensional subspace $W$ of $\mathbb{R}^n$, the matrix $UU^T$ projects any $\mathbf{y} \in \mathbb{R}^n$ onto $W$:

$$UU^T\mathbf{y} = \mathrm{proj}_W \mathbf{y}$$
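A minimal NumPy sketch (subspace and vector chosen arbitrarily): build an orthonormal basis $U$ of $W$ via QR, then project with $UU^T$.

```python
import numpy as np

# W: a 2-dimensional subspace of R^3 spanned by two arbitrary vectors
W = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
U, _ = np.linalg.qr(W)   # columns of U: orthonormal basis of W

y = np.array([3., 1., 2.])
y_hat = U @ U.T @ y      # proj_W y

# the residual y - y_hat is orthogonal to every spanning vector of W
print(np.allclose(W.T @ (y - y_hat), 0))  # True
```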

Diagonalization

Diagonal Matrix

$D$ is said to be a diagonal matrix if $D$ has the form

$$D = \begin{bmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{bmatrix}$$

thus $D$ has the property

$$D^n = \begin{bmatrix} a^n & 0 & 0 \\ 0 & b^n & 0 \\ 0 & 0 & c^n \end{bmatrix}$$
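A quick NumPy check of the power property (entries are arbitrary):

```python
import numpy as np

d = np.array([2., 3., 5.])
D = np.diag(d)

# D^4 is just the diagonal matrix of entrywise 4th powers
print(np.allclose(np.linalg.matrix_power(D, 4), np.diag(d**4)))  # True
```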

Diagonalization

A square matrix $A$ is said to be diagonalizable if

  1. $A = PDP^{-1}$,
  2. in which $D$ is a diagonal matrix
  3. and $P$ is an invertible matrix.

The Diagonalization Theorem

An $n \times n$ matrix $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors (the corresponding eigenvalues may be repeated roots).

Easy Proof

Let's say we have an invertible matrix $P_{n \times n}$ whose columns are eigenvectors of $A$, and a diagonal matrix $D$ of the corresponding eigenvalues:

$$P = [v_1, v_2, \ldots, v_n], \qquad D = \begin{bmatrix} \lambda_1 & & 0 \\ & \lambda_2 & \\ 0 & & \ddots \end{bmatrix}$$

$$AP = [Av_1, Av_2, \ldots], \qquad PD = [v_1, v_2, \ldots, v_n]\begin{bmatrix} \lambda_1 & & 0 \\ & \lambda_2 & \\ 0 & & \ddots \end{bmatrix} = [\lambda_1 v_1, \lambda_2 v_2, \ldots]$$

since each $v_i$ is an eigenvector, $Av_i = \lambda_i v_i$, so

$$AP = PD \implies A = PDP^{-1}$$
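A minimal NumPy sketch of the factorization (arbitrary matrix with distinct eigenvalues): np.linalg.eig returns the diagonal of $D$ and the columns of $P$.

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])

eigvals, P = np.linalg.eig(A)  # columns of P are eigenvectors of A
D = np.diag(eigvals)

# A = P D P^{-1}
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```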

Orthogonally Diagonalizable

  • Definition
    A matrix $A$ is said to be orthogonally diagonalizable if
$$A = PDP^T = PDP^{-1} \quad (\text{so } P^T = P^{-1}),$$
    which forces $A$ to be symmetric:
$$A^T = (PDP^T)^T = P^{TT}D^TP^T = PDP^T = A$$
  • And this is possible iff $A$ is a symmetric matrix (the Spectral Theorem); a quick numerical check follows.
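A minimal NumPy check (arbitrary symmetric matrix): for symmetric input, np.linalg.eigh returns an orthogonal $P$, so $A = PDP^T$.

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])  # symmetric

eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

print(np.allclose(P.T @ P, np.eye(2)))  # True: P^T = P^{-1}
print(np.allclose(A, P @ D @ P.T))      # True: A = P D P^T
```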

SVD

  • Singular Value Decomposition

Singular Values

Let $A$ be an $m \times n$ matrix. Then

$$(A^TA)^T = A^T(A^T)^T = A^TA$$

Thus, $A^TA$ is symmetric and orthogonally diagonalizable.

Then, let's say $\{v_1, v_2, \ldots, v_n\}$ is an orthonormal basis for $\mathbb{R}^n$ consisting of eigenvectors of $A^TA$,

and also say $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the corresponding eigenvalues of $A^TA$
(ordered so that $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 0$).

The singular values $\sigma_i$ of $A$ are

$$\sigma_i = \sqrt{\lambda_i}$$
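A minimal NumPy check of this definition (reusing the earlier example matrix): the singular values of $A$ equal the square roots of the eigenvalues of $A^TA$.

```python
import numpy as np

A = np.array([[1., -3., -2.],
              [-5., 9., 1.]])

sigma = np.linalg.svd(A, compute_uv=False)        # singular values of A
lam = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # eigenvalues, descending

print(np.allclose(sigma, np.sqrt(lam[:len(sigma)])))  # True
```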

Thinking


Singular values are the "eigenvalues" of a non-square matrix $A$.

Properties
  1. $\{Av_1, Av_2, \ldots, Av_n\}$ is an orthogonal set.
    for any $i \neq j$, $(Av_i)^T(Av_j) = v_i^T A^TA v_j = v_i^T \lambda_j v_j = \lambda_j (v_i^T v_j) = 0$
  2. $\{Av_1, Av_2, \ldots, Av_r\}$ is an orthogonal basis of $\mathrm{Col}\,A$, where $r = \mathrm{rank}\,A = \dim \mathrm{Col}\,A$.
    note that $\|Av_i\|^2 = v_i^T A^TA v_i = \lambda_i$, so $Av_i = \mathbf{0}$ exactly when $\lambda_i = 0$, i.e. for $i > r$. Now say $\mathbf{y} = A\mathbf{x}$ is in $\mathrm{Col}\,A$; then (see the sketch after this list)

$$\mathbf{x} = c_1 v_1 + c_2 v_2 + \cdots + c_r v_r + \cdots + c_n v_n \implies \mathbf{y} = A\mathbf{x} = c_1 Av_1 + c_2 Av_2 + \cdots + c_r Av_r$$
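A minimal NumPy sketch of property 1 (same example matrix): the images $Av_i$ of the eigenvectors of $A^TA$ are mutually orthogonal.

```python
import numpy as np

A = np.array([[1., -3., -2.],
              [-5., 9., 1.]])

# columns of V: orthonormal eigenvectors of A^T A
_, V = np.linalg.eigh(A.T @ A)
AV = A @ V                   # columns are the A v_i

# the Gram matrix of {A v_i} is diagonal, so the set is orthogonal
G = AV.T @ AV
print(np.allclose(G, np.diag(np.diag(G))))  # True
```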

Singular Value Decomposition

  • $m \times n$ "diagonal" matrix

$$\Sigma_{m \times n} = \begin{bmatrix} D_{r \times r} & 0_{r \times (n-r)} \\ 0_{(m-r) \times r} & 0_{(m-r) \times (n-r)} \end{bmatrix}$$

e.g. $A_{2 \times 3}$ with $r = 2$:

$$\Sigma = \begin{bmatrix} \sigma_1 & 0 & 0 \\ 0 & \sigma_2 & 0 \end{bmatrix}$$

  • singular value decomposition (a NumPy sketch follows this list)

$$A_{m \times n} = U_{m \times m}\,\Sigma_{m \times n}\,V_{n \times n}^T$$
  • Left singular vectors of A : The columns of U
  • Right singular vectors of A : The columns of V
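A minimal NumPy sketch (the earlier 2×3 example): np.linalg.svd returns $U$, the singular values, and $V^T$; rebuilding $\Sigma$ recovers $A$.

```python
import numpy as np

A = np.array([[1., -3., -2.],
              [-5., 9., 1.]])
m, n = A.shape

U, s, Vt = np.linalg.svd(A)  # U: m x m, Vt: n x n
Sigma = np.zeros((m, n))
Sigma[:len(s), :len(s)] = np.diag(s)  # embed D_{r x r} in the m x n Sigma

print(np.allclose(A, U @ Sigma @ Vt))  # True: A = U Sigma V^T
```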

Easy Proof

First, following property 2 of the singular values, we can obtain an orthonormal set $\{u_1, u_2, \ldots, u_r\}$ by

$$u_i = \frac{Av_i}{\|Av_i\|} = \frac{Av_i}{\sigma_i}$$

and extend it to an orthonormal basis $\{u_1, \ldots, u_m\}$ of $\mathbb{R}^m$.

then

$$U\Sigma = [u_1\ u_2\ \cdots\ u_m]\begin{bmatrix} \sigma_1 & & \\ & \sigma_2 & \\ & & \ddots \end{bmatrix}_{m \times n} = [\sigma_1 u_1\ \sigma_2 u_2\ \cdots\ \sigma_r u_r\ 0\ \cdots\ 0]$$

$$AV = [Av_1\ Av_2\ \cdots\ Av_n] = [\sigma_1 u_1\ \sigma_2 u_2\ \cdots\ \sigma_r u_r\ 0\ \cdots\ 0]$$

$$U\Sigma = AV \implies U\Sigma V^T = A$$

Least-Squares Solution

  • Target
    for an $m \times n$ matrix $A$,
$$A = [a_1\ a_2\ \cdots\ a_n], \qquad A\mathbf{x} = \mathbf{b}$$

a solution $\mathbf{x}$ exists exactly when $\mathbf{b}$ is a linear combination of the columns of $A$.

Most of the time we cannot find a perfect solution. Thus, we try to find the least-squares solution $\hat{\mathbf{x}}$.

Find

$$\hat{\mathbf{x}} = \arg\min_{\mathbf{x}} \|\mathbf{b} - A\mathbf{x}\|^2$$

First, let's define

$$\mathbf{p} = A\hat{\mathbf{x}} \in \mathrm{Col}\,A$$

We know that the least-squares minimum occurs when $(\mathbf{b} - \mathbf{p})$ is orthogonal to all of $\mathrm{Col}\,A$.

Thus say the projection matrix $P$ is such that

$$P\mathbf{b} = \mathbf{p}$$

Back to the orthogonality condition $(\mathbf{b} - \mathbf{p}) \perp \mathrm{Col}\,A$. That is,

$$\mathbf{p} \in \mathrm{Col}\,A, \qquad (\mathbf{b} - \mathbf{p}) \in (\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T \quad \text{(a)}$$

from the definition of the null space and equation (a), we have

$$A^T(\mathbf{b} - \mathbf{p}) = \mathbf{0} \implies A^T(\mathbf{b} - A\hat{\mathbf{x}}) = \mathbf{0} \implies A^TA\hat{\mathbf{x}} = A^T\mathbf{b} \implies \hat{\mathbf{x}} = (A^TA)^{-1}A^T\mathbf{b}$$

(the last step assumes the columns of $A$ are linearly independent, so that $A^TA$ is invertible)

Besides, we now further have

$$\mathbf{p} = A\hat{\mathbf{x}} = A(A^TA)^{-1}A^T\mathbf{b} \implies P = A(A^TA)^{-1}A^T \quad \text{(b)}$$
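A minimal NumPy sketch of the normal equations (an arbitrary overdetermined system, with np.linalg.lstsq as reference):

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])  # 3 equations, 2 unknowns
b = np.array([1., 2., 2.])

# normal equations: A^T A x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_ref))  # True

# projection matrix P = A (A^T A)^{-1} A^T; residual b - Pb lies in Nul A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(A.T @ (b - P @ b), 0))  # True
```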

Pseudoinverse Aspect

$$A\hat{\mathbf{x}} = \mathbf{b} \implies \hat{\mathbf{x}} = A^+\mathbf{b}, \qquad \mathbf{p} = A\hat{\mathbf{x}} = AA^+\mathbf{b} \quad \text{(c)}$$

compare equations (b) and (c): with the reduced SVD $A = U_r D V_r^T$ and $A^+ = V_r D^{-1} U_r^T$,

$$\mathbf{p} = A\hat{\mathbf{x}} = AA^+\mathbf{b} = (U_r D V_r^T)(V_r D^{-1} U_r^T)\mathbf{b} = U_r U_r^T\mathbf{b}$$

by the theorem of Orthogonal Projection, we know that

$$P = AA^+ = U_r U_r^T$$
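A closing NumPy sketch (same arbitrary system as above) checking $P = AA^+ = U_r U_r^T$:

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])

P_normal = A @ np.linalg.inv(A.T @ A) @ A.T  # (b): A (A^T A)^{-1} A^T
P_pinv = A @ np.linalg.pinv(A)               # (c): A A^+

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-12))
Ur = U[:, :r]
P_svd = Ur @ Ur.T                            # U_r U_r^T

print(np.allclose(P_normal, P_pinv), np.allclose(P_pinv, P_svd))  # True True
```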