# Linear Algebra 3: Inner products
## Projection, orthogonality and norm
**Dot product** $\mathbf{a} \cdot \mathbf{b} = \sum_i a_i b_i$ quantifies the projection of vector $\mathbf{a}$ on $\mathbf{b}$ and vice versa. That is, how much $\mathbf{a}$ and $\mathbf{b}$ have in common with each other in terms of direction in space.
**Norm of a vector** $|\mathbf{a}| = \sqrt{\mathbf{a} \cdot \mathbf{a}}$ is the square root of the projection of the vector onto itself and quantifies the length of the vector. When the norm is $1$, we say that the vector is normalized.
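The dot product and norm can be checked numerically. A minimal sketch with NumPy (the vectors here are made-up examples):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])

dot = np.dot(a, b)              # a . b = sum_i a_i b_i
norm_a = np.sqrt(np.dot(a, a))  # |a| = sqrt(a . a)

print(dot)     # 3.0
print(norm_a)  # 5.0, same as np.linalg.norm(a)
```

Since $|\mathbf{a}| = 5 \neq 1$, this `a` is not normalized.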
**Orthogonality** If the projection of vector $\mathbf{a}$ on $\mathbf{b}$ is zero, $\mathbf{a} \cdot \mathbf{b} = 0$, we say that the vectors are orthogonal. Examples of orthogonal vectors are the unit vectors of the Cartesian coordinate system.
**Orthonormal vectors** are both normalized and orthogonal. We denote the orthonormality condition with the convenient Kronecker symbol: $\mathbf{e}_i \cdot \mathbf{e}_j = \delta_{ij}$, where $\delta_{ij} = 1$ when $i = j$ and $\delta_{ij} = 0$ when $i \neq j$.
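The Kronecker-delta condition can be verified for any orthonormal set, e.g. a rotated Cartesian basis (a sketch; the rotation angle is made up):

```python
import numpy as np

# Two orthonormal vectors in 2D: a Cartesian basis rotated by angle t
t = 0.3
e1 = np.array([np.cos(t), np.sin(t)])
e2 = np.array([-np.sin(t), np.cos(t)])

# Matrix of all pairwise dot products e_i . e_j
E = np.vstack([e1, e2])
gram = E @ E.T

# Orthonormality means the result is the identity: delta_ij
print(np.allclose(gram, np.eye(2)))  # True
```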
To normalize a vector is to divide the vector by its norm. A vector $\mathbf{v}$ is not normalized when $|\mathbf{v}| \neq 1$, hence we divide by the norm and obtain a normalized vector $\hat{\mathbf{v}} = \mathbf{v} / |\mathbf{v}|$. And now $|\hat{\mathbf{v}}| = 1$.
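A quick numerical sketch of normalization (the vector is a made-up example):

```python
import numpy as np

v = np.array([1.0, 1.0])       # |v| = sqrt(2), so v is not normalized
norm = np.linalg.norm(v)
v_hat = v / norm               # divide by the norm to normalize

print(np.isclose(np.linalg.norm(v_hat), 1.0))  # True
```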
## Basis set and linear independence
1. Every vector can be expressed as a linear combination of basis vectors.
2. Orthogonal vectors are linearly independent. This means that no member of a set of vectors can be expressed in terms of the others. Linear independence is expressed mathematically by requiring that the linear combination of 3D (4D, ND, etc.) vectors can equal zero, $c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \dots + c_n \mathbf{v}_n = 0$, only when all coefficients $c_i = 0$.

The converse: when one of the coefficients $c_i$ can be nonzero, the vectors are linearly dependent.
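Linear independence can be tested numerically via the matrix rank: a set of $n$ vectors is independent exactly when the matrix stacking them has rank $n$. A sketch with made-up vectors:

```python
import numpy as np

# Three 3D vectors stacked as rows; the third is row0 + row1,
# so a nonzero linear combination gives zero -> dependent set
vecs = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [1.0, 1.0, 0.0]])

rank = np.linalg.matrix_rank(vecs)
print(rank)              # 2, fewer than 3 vectors
print(rank < len(vecs))  # True: linearly dependent
```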
## Decomposition of functions into orthogonal components
Writing a vector in terms of its orthogonal unit vectors is a powerful mathematical technique which permeates much of quantum mechanics. The role that finite-dimensional vectors play in linear algebra is played in QM by infinite-dimensional functions. In analogy with sequence vectors, which can live in 2D, 3D or ND spaces, the infinite-dimensional space of functions in quantum mechanics is known as a Hilbert space, named after the famous mathematician David Hilbert. We will not go too much in depth about functional spaces other than listing some powerful analogies with simple sequence vectors.
| Vectors | Functions |
|---|---|
| Orthonormality: $\mathbf{e}_i \cdot \mathbf{e}_j = \delta_{ij}$ | Orthonormality: $\int \psi_i^*(x)\, \psi_j(x)\, dx = \delta_{ij}$ |
| Linear superposition: $\mathbf{v} = \sum_i c_i \mathbf{e}_i$ | Linear superposition: $f(x) = \sum_i c_i \psi_i(x)$ |
| Projections: $c_i = \mathbf{e}_i \cdot \mathbf{v}$ | Projections: $c_i = \int \psi_i^*(x)\, f(x)\, dx$ |
In the first column we decompose a vector in terms of its orthogonal components
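The function-space side of the analogy can be checked numerically. A minimal sketch, assuming an orthonormal sine basis $\psi_n(x) = \sqrt{2/\pi}\,\sin(nx)$ on $[0, \pi]$ and a made-up test function; the projection integrals are approximated by the trapezoidal rule:

```python
import numpy as np

def integrate(y, x):
    """Trapezoidal-rule approximation of the integral of y(x)."""
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

x = np.linspace(0.0, np.pi, 2001)
f = x * (np.pi - x)  # example function vanishing at the endpoints

# Orthonormal basis functions psi_n(x) = sqrt(2/pi) sin(n x)
def psi(n, x):
    return np.sqrt(2.0 / np.pi) * np.sin(n * x)

# Projections: c_n = integral of psi_n(x) f(x) dx
ns = range(1, 20)
c = np.array([integrate(psi(n, x) * f, x) for n in ns])

# Linear superposition: rebuild f from its orthogonal components
f_approx = sum(cn * psi(n, x) for n, cn in zip(ns, c))
print(np.max(np.abs(f - f_approx)) < 1e-2)  # True: few terms suffice
```

This mirrors the table exactly: computing the coefficients is the "Projections" row, and summing them back up is the "Linear superposition" row.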