Linear Algebra 3: Inner products#

Projection, orthogonality and norm#

  • Dot product \(\langle a\mid b \rangle\) quantifies the projection of vector \(a\) on \(b\) and vice-versa; that is, how much \(a\) and \(b\) have in common in terms of direction in space. In component form,

\[\langle a \mid b \rangle = \sum^{i=N}_{i=1} a_i b_i\]
  • Norm of a vector \(\mid a\mid\) is the square root of the projection of the vector onto itself and quantifies the length of the vector. When the norm is \(\mid a \mid=1\), we say that the vector is normalized.

\[\langle a \mid a\rangle= a_1^2+a_2^2\]
\[\mid a \mid =\sqrt{a_1^2+a_2^2}\]
  • Orthogonality If the projection of vector \(a\) on \(b\) is zero, we say that the vectors are orthogonal. An example of orthogonal vectors is the pair of unit vectors of the Cartesian coordinate system.

\[\begin{split} (1,0)\begin{pmatrix} 0\\ 1\\ \end{pmatrix}=1\cdot 0+0\cdot 1=0 \end{split}\]
  • Orthonormal vectors are both normalized and orthogonal. We denote the orthonormality condition with the convenient Kronecker symbol, \(\delta_{ij}=0\) when \(i\neq j\) and \(1\) when \(i=j\):

\[\langle e_i \mid e_j \rangle =\delta_{ij}\]
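
A quick numerical check of the orthonormality condition, as a minimal sketch using NumPy (the notes themselves show no code, so the library choice is an assumption):

```python
import numpy as np

# unit vectors of the Cartesian coordinate system
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# <e_i | e_j> = delta_ij: 1 on the diagonal, 0 off the diagonal
print(e1 @ e1, e2 @ e2)  # 1.0 1.0
print(e1 @ e2)           # 0.0
```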

To normalize a vector is to divide the vector by its norm. \(\mid E_1\rangle = (4,0,0,0)\) is not normalized since \(\langle E_1\mid E_1\rangle = 16\), hence we divide by the norm \(\mid E_1 \mid = 4\) and obtain a normalized vector \(\mid e_1\rangle=\frac{1}{4}\mid E_1\rangle=(1,0,0,0)\). And now \(\langle e_1 \mid e_1\rangle=1\).
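
The same normalization step in code, again as a sketch assuming NumPy:

```python
import numpy as np

E1 = np.array([4.0, 0.0, 0.0, 0.0])
norm = np.sqrt(E1 @ E1)   # <E1|E1> = 16, so the norm is 4
e1 = E1 / norm            # dividing by the norm normalizes the vector

print(e1)       # [1. 0. 0. 0.]
print(e1 @ e1)  # 1.0, i.e. <e1|e1> = 1
```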

Basis set and linear independence#

1. Every \(N\)-dimensional vector can be uniquely represented as a linear combination of \(N\) orthogonal vectors. And vice-versa: if a vector can be represented by \(N\) orthogonal vectors, the vector is \(N\)-dimensional. A set of vectors in terms of which an arbitrary \(N\)-dimensional vector is expressed is called a basis set (see the numerical check after the examples below).

  • \[\mid v\rangle = \sum^{i=N}_{i=1} v_i \mid e_i\rangle\]
  • \[\begin{split}a= \begin{pmatrix} 2\\ 3\\ \end{pmatrix} = 2\begin{pmatrix} 1\\ 0\\ \end{pmatrix}+3 \begin{pmatrix} 0\\ 1\\ \end{pmatrix}\end{split}\]
  • \[\begin{split}a= \begin{pmatrix} -1\\ 5\\ 8\\ \end{pmatrix} = -1\begin{pmatrix} 1\\ 0\\ 0\\ \end{pmatrix}+5 \begin{pmatrix} 0\\ 1\\ 0\\ \end{pmatrix}+8 \begin{pmatrix} 0\\ 0\\ 1\\ \end{pmatrix}\end{split}\]
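
The decomposition above can be verified numerically; a minimal sketch assuming NumPy, using the example vector \(a=(2,3)\) from the list:

```python
import numpy as np

a = np.array([2.0, 3.0])
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# the expansion coefficients are projections onto the basis vectors
c1 = e1 @ a   # 2.0
c2 = e2 @ a   # 3.0

# rebuilding the vector from its components reproduces a exactly
print(np.allclose(c1 * e1 + c2 * e2, a))  # True
```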

2. Orthogonal vectors are linearly independent. This means that no member of a set of vectors can be expressed in terms of the others. Linear independence is expressed mathematically by requiring that all coefficients of a linear combination of 3D (4D, ND, etc.) vectors be zero, \(\alpha_1=\alpha_2=\alpha_3=0\), as the only way to satisfy the zero-vector equality:

\[\alpha_1 \mid e_1\rangle +\alpha_2 \mid e_2\rangle+\alpha_3 \mid e_3\rangle=0\]

Conversely, if one of the coefficients \(\alpha_i\) can be non-zero, linear dependence immediately follows, because one can divide by that coefficient \(\alpha_i\) and express the vector \(\mid e_i\rangle\) in terms of the others.
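
In practice, linear independence of a set of vectors can be checked numerically via the rank of the matrix whose columns are those vectors; a sketch assuming NumPy (the rank criterion is a standard test, not something introduced in these notes):

```python
import numpy as np

# stack the vectors as columns of a matrix
vectors = np.column_stack([[1.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0],
                           [0.0, 0.0, 1.0]])

# full column rank <=> the only solution of the zero-vector
# equality is alpha_1 = alpha_2 = alpha_3 = 0
print(np.linalg.matrix_rank(vectors) == vectors.shape[1])  # True
```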

Decomposition of functions into orthogonal components#

  • Writing a vector in terms of its orthogonal unit vectors is a powerful mathematical technique which permeates much of quantum mechanics. The role that finite-dimensional vectors play in QM is taken over by infinite-dimensional functions. In analogy with sequence vectors, which can live in 2D, 3D or ND spaces, the infinite-dimensional space of functions in quantum mechanics is known as a Hilbert space, named after the famous mathematician David Hilbert. We will not go too much in depth about function spaces other than listing some powerful analogies with simple sequence vectors.

| Vectors | Functions |
| --- | --- |
| Orthonormality \(\langle x\mid y \rangle = \sum^{i=N}_{i=1} x_i y_i=\delta_{xy}\) | Orthonormality \(\langle \phi_i \mid \phi_j \rangle = \int^{+\infty}_{-\infty} \phi_i(x) \phi_j(x)dx=\delta_{ij}\) |
| Linear superposition \(\mid A \rangle = A_x \mid x\rangle+A_y\mid y\rangle\) | Linear superposition \(\mid f\rangle = c_1 \mid\phi_1\rangle+c_2\mid\phi_2\rangle\) |
| Projections \(\langle x\mid A\rangle=A_x \langle x\mid x \rangle +A_y \langle x\mid y \rangle=A_x\) | Projections \(\langle \phi_1\mid f\rangle=c_1 \langle \phi_1 \mid\phi_1 \rangle +c_2 \langle \phi_1 \mid\phi_2 \rangle=c_1\) |

In the first column we decompose a vector in terms of two orthogonal components \(A_i\), the projections of vector \(A\) along the orthonormal vectors \(x\) and \(y\). In the second column a similar decomposition holds, where the dot product, due to the infinite dimension, is given by an integral!
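
A numerical illustration of the function column, as a sketch assuming NumPy; the sine basis \(\phi_n(x)=\sqrt{2}\sin(n\pi x)\) on \([0,1]\) and the coefficients are illustrative choices, not taken from the notes:

```python
import numpy as np

# phi_n(x) = sqrt(2) sin(n*pi*x) are orthonormal on [0, 1]
x = np.linspace(0.0, 1.0, 10001)
dx = x[1] - x[0]
phi1 = np.sqrt(2.0) * np.sin(np.pi * x)
phi2 = np.sqrt(2.0) * np.sin(2.0 * np.pi * x)

# build f = c1*phi1 + c2*phi2 with known coefficients
c1, c2 = 0.6, -0.3
f = c1 * phi1 + c2 * phi2

# projections <phi_n | f> recover the coefficients;
# the integral is approximated by a Riemann sum
print(np.sum(phi1 * f) * dx)  # ~ 0.6
print(np.sum(phi2 * f) * dx)  # ~ -0.3
```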