Linear Algebra 3: Inner products#

Projection, orthogonality and norm#

  • Dot product $\mathbf{a} \cdot \mathbf{b}$ quantifies the projection of vector $\mathbf{a}$ on $\mathbf{b}$ and vice versa. That is, how much $\mathbf{a}$ and $\mathbf{b}$ have in common with each other in terms of direction in space.

$$\mathbf{e}_i \cdot \mathbf{e}_j = \delta_{ij}$$
  • Norm of a vector $\mathbf{a}$ is the square root of the projection of the vector onto itself and quantifies the length of the vector. When the norm is $|\mathbf{a}| = 1$, we say that the vector is normalized.

$$\mathbf{a} \cdot \mathbf{a} = a_1^2 + a_2^2$$

$$|\mathbf{a}| = \sqrt{a_1^2 + a_2^2}$$
  • Orthogonality If the projection of vector $\mathbf{a}$ on $\mathbf{b}$ is zero, we say that the vectors are orthogonal. Examples of orthogonal vectors are the unit vectors of the Cartesian coordinate system.

$$(1, 0) \cdot (0, 1) = 1 \cdot 0 + 0 \cdot 1 = 0$$
  • Orthonormal vectors are both normalized and orthogonal. We denote the orthonormality condition with the convenient Kronecker symbol: $\delta_{ij} = 0$ when $i \neq j$ and $1$ when $i = j$.

To normalize a vector is to divide the vector by its norm. $\mathbf{E}_1 = (4, 0, 0, 0)$ is not normalized since $|\mathbf{E}_1| = \sqrt{\mathbf{E}_1 \cdot \mathbf{E}_1} = 4$; hence we divide by the norm and obtain a normalized vector $\mathbf{e}_1 = \frac{1}{4}\mathbf{E}_1 = (1, 0, 0, 0)$. And now $\mathbf{e}_1 \cdot \mathbf{e}_1 = 1$.
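These definitions translate directly into code. Here is a minimal NumPy sketch (the example vectors are arbitrary choices for illustration):

```python
import numpy as np

a = np.array([2.0, 3.0])
b = np.array([1.0, 0.0])

# Dot product: the projection of a on b
print(np.dot(a, b))  # 2.0

# Norm: square root of the vector's projection onto itself
print(np.sqrt(np.dot(a, a)))  # 3.6056..., same as np.linalg.norm(a)

# Orthogonality: Cartesian unit vectors have zero dot product
x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
print(np.dot(x, y))  # 0.0

# Normalization: divide E1 = (4, 0, 0, 0) by its norm |E1| = 4
E1 = np.array([4.0, 0.0, 0.0, 0.0])
e1 = E1 / np.linalg.norm(E1)
print(e1, np.dot(e1, e1))  # [1. 0. 0. 0.] 1.0
```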

Basis set and linear independence#

1. Every $N$-dimensional vector can be uniquely represented as a linear combination of $N$ orthogonal vectors. And vice versa: if a vector can be represented by $N$ orthogonal vectors, it means that the vector is $N$-dimensional. A set of vectors in terms of which an arbitrary $N$-dimensional vector is expressed is called a basis set.

  • $\mathbf{v} = \sum_{i=1}^{N} v_i \mathbf{e}_i$
  • $\mathbf{a} = \begin{pmatrix} 2 \\ 3 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix} + 3 \begin{pmatrix} 0 \\ 1 \end{pmatrix}$
  • $\mathbf{a} = \begin{pmatrix} 1 \\ 5 \\ 8 \end{pmatrix} = 1 \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} + 5 \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} + 8 \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$

2. Orthogonal vectors are linearly independent. This means that no member of a set of vectors can be expressed in terms of the others. Linear independence is expressed mathematically by requiring that the coefficients of the linear combination of 3D (4D, ND, etc.) vectors all be zero, $\alpha_1 = \alpha_2 = \alpha_3 = 0$, as the only way to satisfy the zero-vector equality:

$$\alpha_1 \mathbf{e}_1 + \alpha_2 \mathbf{e}_2 + \alpha_3 \mathbf{e}_3 = 0$$

Conversely, when one of the coefficients $\alpha_i$ can be non-zero, linear dependence immediately follows, because one can divide by that coefficient $\alpha_i$ and express the unit vector $\mathbf{e}_i$ in terms of the others. The sketch below illustrates both points.
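A minimal NumPy sketch, using the 2D example from above for decomposition and an arbitrary dependent triple of 3D vectors for the independence test:

```python
import numpy as np

# Point 1: decompose a = (2, 3) in the Cartesian basis. The expansion
# coefficients are the projections of a onto each orthonormal basis vector.
a = np.array([2.0, 3.0])
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
coeffs = [np.dot(e, a) for e in basis]
print(coeffs)  # [2.0, 3.0]

# Rebuilding the vector from its projections recovers a exactly
a_rebuilt = sum(c * e for c, e in zip(coeffs, basis))
print(np.allclose(a, a_rebuilt))  # True

# Point 2: linear independence. Stack the vectors as rows of a matrix;
# full rank means alpha_1 = alpha_2 = alpha_3 = 0 is the only solution
# of the zero-vector equality.
dependent = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0],
                      [1.0, 1.0, 0.0]])  # third row = first + second
print(np.linalg.matrix_rank(dependent))  # 2 < 3, so linearly dependent
```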

Decomposition of functions into orthogonal components#

  • Writing a vector in terms of its orthogonal unit vectors is a powerful mathematical technique which permeates much of quantum mechanics. The role of finite-dimensional vectors in QM is played by infinite-dimensional functions. In analogy with sequence vectors, which can live in 2D, 3D or ND spaces, the infinite-dimensional space of functions in quantum mechanics is known as a Hilbert space, named after the famous mathematician David Hilbert. We will not go too much in depth about functional spaces other than listing some powerful analogies with simple sequence vectors.

| | Vectors | Functions |
|---|---|---|
| Orthonormality | $\mathbf{x} \cdot \mathbf{y} = \sum_{i=1}^{N} x_i y_i = \delta_{xy}$ | $\langle \phi_i \vert \phi_j \rangle = \int_{-\infty}^{+\infty} \phi_i^*(x)\, \phi_j(x)\, dx = \delta_{ij}$ |
| Linear superposition | $\mathbf{A} = A_x \mathbf{x} + A_y \mathbf{y}$ | $f = c_1 \phi_1 + c_2 \phi_2$ |
| Projections | $\mathbf{x} \cdot \mathbf{A} = A_x\, \mathbf{x} \cdot \mathbf{x} + A_y\, \mathbf{x} \cdot \mathbf{y} = A_x$ | $\langle \phi_1 \vert \Psi \rangle = c_1 \langle \phi_1 \vert \phi_1 \rangle + c_2 \langle \phi_1 \vert \phi_2 \rangle = c_1$ |

In the first column we decompose a vector in terms of two orthogonal components $A_i$, or projections of vector $\mathbf{A}$ along the orthonormal vectors $\mathbf{x}$ and $\mathbf{y}$. In the second column we make a similar decomposition, where the dot product, due to the infinite dimension, is given by an integral!
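To make the analogy concrete, here is a minimal sketch, assuming the particle-in-a-box eigenfunctions $\phi_n(x) = \sqrt{2/L}\, \sin(n \pi x / L)$ on $[0, L]$ as the orthonormal set (these functions are not defined above; they are just a convenient real-valued example). SciPy's quad evaluates the overlap integrals numerically:

```python
import numpy as np
from scipy.integrate import quad

L = 1.0  # interval length (arbitrary choice for this sketch)

def phi(n, x):
    """Orthonormal sine functions on [0, L] (real-valued, so no conjugate needed)."""
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)

# Orthonormality: the overlap integral reproduces the Kronecker delta
for i in (1, 2):
    for j in (1, 2):
        overlap, _ = quad(lambda x: phi(i, x) * phi(j, x), 0.0, L)
        print(f"<phi_{i}|phi_{j}> = {overlap:.4f}")  # ~1 if i == j, ~0 otherwise

# Linear superposition Psi = c1*phi_1 + c2*phi_2 ...
c1, c2 = 0.6, 0.8
Psi = lambda x: c1 * phi(1, x) + c2 * phi(2, x)

# ... and the projection <phi_1|Psi> recovers the coefficient c1,
# exactly as x . A recovers A_x for vectors
proj, _ = quad(lambda x: phi(1, x) * Psi(x), 0.0, L)
print(f"<phi_1|Psi> = {proj:.4f}")  # ~0.6 = c1
```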