Inner Product Spaces


Introduction

Inner products and notions of orthogonality are ubiquitous throughout mathematics. As such, we will spend time investigating inner product spaces.

Inner Product Spaces

[definition] [deftitle]Definition %counter% (Inner product space)[/deftitle]

Let $V$ be a vector space over $F$ where $F = \RR$ or $F = \CC$. An inner product on $V$ is a mapping $\innerprod{-}{-}:V \times V \to F$ such that

  1. (Positive definiteness) For every $v\in V$, $\innerprod{v}{v} \ge 0$ and $\innerprod{v}{v} = 0$ if and only if $v = 0$.
  2. (Conjugate symmetry) For all $u,v \in V$, $\innerprod{u}{v} = \overline{\innerprod{v}{u}}$.
  3. (Linearity in first argument) For every $u, v, w\in V$ and $r, s\in F$, $\innerprod{ru + sv}{w} = r\innerprod{u}{w} + s\innerprod{v}{w}$.

A vector space $V$ paired with an inner product is called an inner product space. [/definition]

[theorem] [thmtitle]Proposition (Real inner products are $\RR$-bilinear)[/thmtitle]

Let $V$ be a real inner product space. Then the inner product is bilinear. [/theorem]

[proof] Inner products are linear in the first argument so it remains to show that a real inner product is linear in the second argument. Let $u, v, w\in V$ and $a, b \in \RR$. Then

\[\begin{align*} \innerprod{u}{av + bw} &= \innerprod{av + bw}{u} \\ &= a\innerprod{v}{u} + b\innerprod{w}{u} \\ &= a\innerprod{u}{v} + b\innerprod{u}{w} \end{align*}\]

as desired. [/proof]

[theorem] [thmtitle]Proposition (Complex inner products are sesquilinear)[/thmtitle]

Let $V$ be a complex inner product space. Then the inner product is sesquilinear. [/theorem]

[proof] Inner products are linear in the first argument so it remains to show that a complex inner product is conjugate linear in the second argument. Let $u, v, w\in V$ and $a, b \in \CC$. Then

\[\begin{align*} \innerprod{u}{av + bw} &= \overline{\innerprod{av + bw}{u}} \\ &= \overline{a}\overline{\innerprod{v}{u}} + \overline{b}\overline{\innerprod{w}{u}} \\ &= \overline{a}\innerprod{u}{v} + \overline{b}\innerprod{u}{w} \end{align*}\]

as desired. [/proof]

There are some easy-to-see examples of inner product spaces.

[example] [extitle]Example %counter% (Dot product on $\RR^n$) [/extitle]

The vector space $\RR^n$ has a canonical inner product space structure via the dot product:

\[\innerprod{\mqty[x_1 \\ \vdots \\ x_n]}{\mqty[y_1 \\ \vdots \\ y_n]} = \sum_{j=1}^n x_jy_j.\]

Because this example is so important, we often refer to the dot product as the standard inner product. [/example]

[example] [extitle]Example %counter% (Dot product on $\CC^n$) [/extitle]

The vector space $\CC^n$ has a canonical inner product space structure via the dot product:

\[\innerprod{\mqty[z_1 \\ \vdots \\ z_n]}{\mqty[w_1 \\ \vdots \\ w_n]} = \sum_{j=1}^n z_j\conj{w_j}.\]

As in the real case, we refer to this as the standard inner product. [/example]
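Both computations are easy to carry out directly. Here is a minimal numpy sketch (the helper `inner` is our own name, not a library function); note that numpy's built-in `np.vdot` conjugates its *first* argument, whereas our convention conjugates the second, so the arguments must be swapped when comparing the two.

```python
import numpy as np

# Sketch of the standard inner products above. Our convention conjugates
# the SECOND argument; numpy's np.vdot conjugates its FIRST, so the
# arguments are swapped when comparing the two.

def inner(z, w):
    """Standard inner product <z, w> = sum_j z_j * conj(w_j)."""
    return np.sum(np.asarray(z) * np.conj(w))

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
print(inner(x, y))            # real case: the usual dot product, 32.0

z = np.array([1 + 1j, 2 - 1j])
w = np.array([3j, 1 + 0j])
print(inner(z, w))            # complex case
print(np.vdot(w, z))          # agrees with inner(z, w)
```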

[example] [extitle]Example %counter% ($\ell^2$)[/extitle]

A noteworthy inner product space (Hilbert space, in fact) is $\ell^2(F)$ where $F = \RR$ or $F = \CC$. We define $\ell^2(F)$ to be the vector space of sequences $(z_n)$ such that

\[\sum_{n} |z_n|^2 < \infty.\]

The choice of inner product on this space is

\[\innerprod{(z_n)}{(w_n)} \coloneqq \sum_{n} z_n\conj{w_n}.\]

Notice that the right side converges since

\[|z_n\conj{w_n}| \le |z_n||w_n| \le \frac{1}{2}(|z_n|^2 + |w_n|^2)\]

which is easily seen via the inequality $(|z_n| - |w_n|)^2 \ge 0$. [/example]
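For a concrete instance (a sketch assuming numpy), the sequence $z_n = 1/n$ is square-summable with $\innerprod{(z_n)}{(z_n)} = \sum_n 1/n^2 = \pi^2/6$ by the Basel problem, and partial sums of the inner product series approach this value:

```python
import numpy as np

# z_n = 1/n lies in ell^2 and <z, z> = sum 1/n^2 = pi^2/6 (Basel problem).
# Partial sums of the inner product series approach pi^2/6.
n = np.arange(1, 100_001)
z = 1.0 / n
print(np.sum(z * z))     # ~1.644924 (partial sum)
print(np.pi**2 / 6)      # 1.644934...
```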

[example] [extitle]Example %counter% ($L^2$)[/extitle]

Another important example of an inner product space is $L^2(X, \mathcal{A}, \mu)$ where $L^2(X, \mathcal{A}, \mu)$ is the space of square integrable functions on $X$ with respect to the measure $\mu$ and $\sigma$-algebra $\mathcal{A}$. The inner product structure is given by

\[\innerprod{f}{g} = \int_X f\conj{g} \dd{\mu}.\]

[/example]

[theorem] [thmtitle]Lemma %counter%[/thmtitle]

Let $V$ be an inner product space and suppose $u, v \in V$ satisfy

\[\innerprod{u}{w} = \innerprod{v}{w}\]

for every $w \in V$. Then $u = v$. [/theorem]

[proof] The equation given is equivalent to stating

\[\innerprod{u - v}{w} = 0\]

for all $w \in V$. In particular, if we choose $w = u - v$, then we obtain

\[\innerprod{u - v}{u - v} = 0\]

and this occurs if and only if $u - v = 0$. [/proof]

Now, real and complex spaces have vastly different structures. This can be seen from the fact that multiplication by complex numbers allows for both scaling and rotation (in the naive geometric sense). As we will soon see, this is reflected in their inner product structures as well.

[theorem] [thmtitle]Theorem %counter%[/thmtitle]

Let $T$ be a linear operator on the inner product space $V$. If $\innerprod{Tv}{u} = 0$ for every $v, u\in V$, then $T = 0$. [/theorem]

[proof] The hypothesis is equivalent to the statement

\[\innerprod{Tv}{u} = 0 = \innerprod{0}{u}\]

for every $v, u \in V$ (the second equality follows by linearity in the first argument). The lemma then implies that $Tv = 0$ for all $v \in V$, which means that $T = 0$. [/proof]

In the complex inner product space setting, we can make a significantly stronger claim because multiplication by $e^{i\theta}$ is rotation by $\theta$ radians. Accordingly, it suffices to look at the condition $\innerprod{Tv}{v} = 0$ instead.

[theorem] [thmtitle]Theorem %counter%[/thmtitle]

Let $T$ be a linear operator on the complex inner product space $V$. If $\innerprod{Tv}{v} = 0$ for every $v\in V$, then $T = 0$. [/theorem]

[proof] Let $r \in \CC$ and let $x, y\in V$. Applying the hypothesis to the vector $v = rx + y$ gives

\[\begin{align*} 0 &= \innerprod{T(rx + y)}{rx + y} \\ &= \innerprod{rTx + Ty}{rx + y} \\ &= |r|^2\innerprod{Tx}{x} + \innerprod{Ty}{y} + r\innerprod{Tx}{y} + \conj{r}\innerprod{Ty}{x} \\ &= r\innerprod{Tx}{y} + \conj{r}\innerprod{Ty}{x}, \end{align*}\]

where the last step uses the hypothesis on the vectors $x$ and $y$.

Setting $r = 1$ and $r = i$ (and dividing the second equation by $i$) gives the equations

\[\begin{align*} \innerprod{Tx}{y} + \innerprod{Ty}{x} &= 0 \\ \innerprod{Tx}{y} - \innerprod{Ty}{x} &= 0. \end{align*}\]

So it follows that $\innerprod{Tx}{y} = 0$ for all $x, y \in V$, and the previous theorem implies that $T = 0$. [/proof]

[example] [extitle]Remark %counter%[/extitle]

This theorem fails if the vector space is real. Take $V = \RR^2$ and let $T$ be the operator that rotates about the origin by $\pi/2$ radians. Then $\innerprod{Tv}{v} = 0$ for every $v$ under the standard inner product, yet $T \neq 0$. [/example]
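A quick numerical illustration of the remark (a sketch assuming numpy; `np.vdot` conjugates its first argument, so `np.vdot(v, J @ v)` computes $\innerprod{Jv}{v}$ in our convention):

```python
import numpy as np

# J rotates R^2 by pi/2, so <Jv, v> = 0 for every real v even though J != 0.
# Over C^2, a genuinely complex vector "catches" J, as the theorem predicts.
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

v = np.array([2.0, 5.0])
print(np.vdot(v, J @ v))     # <Jv, v> = 0 for real vectors

u = np.array([1.0, 1j])
print(np.vdot(u, J @ u))     # -2j != 0, so J fails the complex test
```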

Normed Spaces

We recall the definition of a norm and some basic facts about normed spaces.

[definition] [deftitle]Definition %counter% (Normed space)[/deftitle]

Let $V$ be a vector space over $F$. A function $\norm{-}:V \to \RR$ is called a norm provided that:

  1. (Positive definiteness) For every $v \in V$, $\norm{v} \ge 0$ and $\norm{v} = 0$ if and only if $v = 0$.
  2. (Homogeneity) For every $c\in F$ and $v \in V$, $\norm{cv} = |c|\norm{v}$.
  3. (Triangle inequality) For every $v, u\in V$, $\norm{v+u} \le \norm{v} + \norm{u}$.

A vector space $V$ paired with a norm is called a normed space. [/definition]

[example] [extitle]Example %counter% (Euclidean norm on $\RR^n$)[/extitle]

Recall that the Euclidean norm of a vector in $\RR^n$ is given by

\[\nrm{\mqty[x_1 \\ \vdots \\ x_n]}_2 = \sqrt{\sum_{j=1}^n x_j^2}.\]

This is the canonical example of a normed space. A similar setup allows us to define the Euclidean norm on $\CC^n$. [/example]

[example] [extitle]Example %counter% ($p$-norm on $\RR^n$)[/extitle]

Recall that the $p$-norm of a vector in $\RR^n$ is given by

\[\nrm{\mqty[x_1 \\ \vdots \\ x_n]}_p = \sqrt[p]{\sum_{j=1}^n |x_j|^p}.\]

In analysis, one carefully shows that this makes $\RR^n$ into a normed space. Note that if $p = 2$, then we recover the Euclidean norm. A similar setup allows us to define the $p$-norm on $\CC^n$. [/example]

[example] [extitle]Example %counter% ($\ell^p$ norm)[/extitle]

Recall that $\ell^p$ is the space of sequences $(z_n)$ of real (or complex) numbers for which

\[\sum_{n} |z_n|^p < \infty.\]

Taking the $p$-th root of the series above gives the $\ell^p$ norm of $(z_n)$. [/example]

[example] [extitle]Example %counter% ($L^p$ norm)[/extitle]

Let $(X, \mathcal{A}, \mu)$ be a measure space. Then

\[\norm{f}_p \coloneqq \sqrt[p]{\int_X |f|^p \dd{\mu}}\]

defines a norm on $L^p(X, \mathcal{A}, \mu)$ where $1 \le p < \infty$. [/example]

One particularly important norm is the following: Let $V$ be an inner product space. Then we will define

\[\norm{v} \coloneqq \sqrt{\innerprod{v}{v}}.\]

Positive definiteness is immediate from the corresponding property of the inner product, and homogeneity follows from

\[\norm{cv} = \sqrt{\innerprod{cv}{cv}} = \sqrt{|c|^2\innerprod{v}{v}} = |c|\norm{v}.\]

To verify the triangle inequality, we will need another result to help us out.

[theorem] [thmtitle]Theorem %counter% (Cauchy-Schwarz inequality)[/thmtitle]

Let $V$ be an inner product space and let $u, v \in V$. Then

\[|\innerprod{u}{v}| \le \norm{u}\norm{v}.\]

[/theorem]

[proof] Clearly, if $u$ or $v$ is zero, then the result is immediate. Assume that $u, v \neq 0$. Now consider

\[\begin{align*} 0 &\le \norm{u - rv}^2 \\ &= \innerprod{u-rv}{u-rv} \\ &= \innerprod{u}{u} - \overline{r}\innerprod{u}{v} - r\innerprod{v}{u} + |r|^2\innerprod{v}{v}. \end{align*}\]

Setting $\overline{r} = \innerprod{v}{u}/\innerprod{v}{v}$ gives

\[0 \le \innerprod{u}{u} - \frac{\innerprod{v}{u}\innerprod{u}{v}}{\innerprod{v}{v}} = \norm{u}^2 - \frac{|\innerprod{u}{v}|^2}{\norm{v}^2}.\]

Rearranging the terms gives the desired result. [/proof]
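As a quick sanity check, here is a minimal numpy sketch verifying Cauchy-Schwarz on random complex vectors (the `inner` and `norm` helpers are our own names; a small tolerance absorbs floating-point error):

```python
import numpy as np

# Verify |<u, v>| <= ||u|| ||v|| on random complex vectors, with the
# induced norm ||v|| = sqrt(<v, v>).
rng = np.random.default_rng(0)

def inner(z, w):
    return np.sum(z * np.conj(w))

def norm(z):
    return np.sqrt(inner(z, z).real)

for _ in range(1000):
    u = rng.standard_normal(5) + 1j * rng.standard_normal(5)
    v = rng.standard_normal(5) + 1j * rng.standard_normal(5)
    assert abs(inner(u, v)) <= norm(u) * norm(v) + 1e-12
print("Cauchy-Schwarz held on 1000 random pairs")
```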

[theorem] [thmtitle]Theorem %counter%[/thmtitle]

The norm induced by an inner product is actually a norm. [/theorem]

[proof] We have shown the first two axioms above. For the triangle inequality, note that

\[\begin{align*} \norm{u + v}^2 &= \innerprod{u + v}{u + v} \\ &= \innerprod{u}{u} + \innerprod{u}{v} + \innerprod{v}{u} + \innerprod{v}{v} \\ &= \norm{u}^2 + \innerprod{u}{v} + \overline{\innerprod{u}{v}}+ \norm{v}^2 \\ &= \norm{u}^2 + 2\Re\innerprod{u}{v} + \norm{v}^2 \\ &\le \norm{u}^2 + 2\norm{u}\norm{v} + \norm{v}^2 \\ &= (\norm{u} + \norm{v})^2 \end{align*}\]

where the inequality follows from $\Re z \le |z|$ for any $z \in \CC$ and Cauchy-Schwarz. Taking the square root of both sides gives the desired result and we are done. [/proof]

We conclude this section by mentioning the polarization identities. In short, the induced norm is so strongly connected to its inner product that the inner product can be fully recovered from the norm.

[theorem] [thmtitle]Theorem %counter% (Polarization identity for $\RR$-vector spaces)[/thmtitle]

Let $V$ be a real inner product space. Then

\[\innerprod{u}{v} = \frac{1}{4}(\norm{u+v}^2 - \norm{u-v}^2)\]

for every $u, v \in V$. [/theorem]

[proof] We will use the parallelogram law, which states that

\[2\norm{u}^2 + 2\norm{v}^2 = \norm{u + v}^2 + \norm{u - v}^2.\]

To prove it, simply expand the right side in terms of inner products; the claim follows immediately. Applying the parallelogram law gives

\[\begin{align*} \norm{u+v}^2 - \norm{u-v}^2 &= \norm{u+v}^2 + \norm{u-v}^2 -2\norm{u-v}^2 \\ &= 2\norm{u}^2 + 2\norm{v}^2 - 2\norm{u - v}^2 \\ &= 2\norm{u}^2 + 2\norm{v}^2 - 2\innerprod{u-v}{u-v} \\ &= 2\norm{u}^2 + 2\norm{v}^2 - 2[\norm{u}^2 - 2\innerprod{u}{v} + \norm{v}^2] \\ &= 4\innerprod{u}{v} \end{align*}\]

and the claim follows. [/proof]

[theorem] [thmtitle]Theorem %counter% (Polarization identity for $\CC$-vector spaces)[/thmtitle]

Let $V$ be a complex inner product space. Then

\[\innerprod{u}{v} = \frac{1}{4}(\norm{u+v}^2 - \norm{u-v}^2) + \frac{i}{4}(\norm{u + iv}^2 - \norm{u - iv}^2)\]

for every $u, v \in V$. [/theorem]

Proving this result is pretty similar to the real case, so we omit the proof.
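To make the identity concrete, here is a minimal numpy sketch that recovers a complex inner product purely from its induced norm (the helper names are our own):

```python
import numpy as np

# Recover <u, v> from the induced norm via the complex polarization identity.
def inner(z, w):
    return np.sum(z * np.conj(w))

def norm(z):
    return np.sqrt(inner(z, z).real)

def polarized(u, v):
    return (norm(u + v)**2 - norm(u - v)**2) / 4 \
         + 1j * (norm(u + 1j*v)**2 - norm(u - 1j*v)**2) / 4

u = np.array([1 + 2j, 3j])
v = np.array([2 - 1j, 1 + 1j])
print(inner(u, v))       # direct computation
print(polarized(u, v))   # same value, obtained from norms alone
```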

Isometries

[definition] [deftitle]Definition %counter% (Isometry)[/deftitle]

Let $V$ and $W$ be inner product spaces and $T \in \Hom(V, W)$. Then we say that $T$ is an isometry provided that

\[\innerprod{Tu}{Tv} = \innerprod{u}{v}\]

for every $u, v\in V$. If $T$ is, in addition, a bijection, then we call $T$ an isometric isomorphism and we say that $V$ and $W$ are isometrically isomorphic. [/definition]

[theorem] [thmtitle]Theorem %counter% (Isometries are injections)[/thmtitle]

Let $T:V \to W$ be an isometry. Then $T$ is injective. [/theorem]

[proof] Suppose that $v\in \ker T$. Then

\[0 = \innerprod{Tv}{Tv} = \innerprod{v}{v}.\]

Since $\innerprod{v}{v} = 0$ precisely when $v = 0$, the kernel is trivial and the claim follows. [/proof]

[example] [extitle]Example %counter% (Isometry that is not a surjection)[/extitle]

Let $T:\ell^2 \to \ell^2$ be defined by

\[T(x_1, x_2, x_3, \ldots) = (0, x_1, x_2, \ldots).\]

Clearly $T$ is an isometry but it is most definitely not a surjection. [/example]

An alternative way to characterize isometries is to use the induced norm.

[theorem] [thmtitle]Theorem %counter% (Isometry $\iff$ preserve norm)[/thmtitle]

Let $T:V \to W$ be a linear transformation of inner product spaces. Then $T$ is an isometry if and only if $\norm{Tv} = \norm{v}$ for every $v \in V$. [/theorem]

[proof] ($\implies$) Immediate: take $u = v$ in the definition.

($\impliedby$) Let us consider the complex case. By the polarization identity,

\[\begin{align*} \innerprod{Tu}{Tv} &= \frac{1}{4}(\norm{Tu+Tv}^2 - \norm{Tu-Tv}^2) + \frac{i}{4}(\norm{Tu + iTv}^2 - \norm{Tu - iTv}^2) \\ &= \frac{1}{4}(\norm{u+v}^2 - \norm{u-v}^2) + \frac{i}{4}(\norm{u + iv}^2 - \norm{u - iv}^2) \\ &= \innerprod{u}{v}. \end{align*}\]

Since the real case can be realized in the same way by removing the “imaginary part”, the claim follows. [/proof]

Orthogonality

We have finally made it to orthogonality! The inner product induces a geometry on the vector space, so notions of “perpendicular” make sense to talk about in an inner product space. Here we go!

[definition] [deftitle]Definition %counter% (Orthogonal)[/deftitle]

Let $V$ be an inner product space. We say that $u, v\in V$ are orthogonal provided that

\[\innerprod{u}{v} = 0.\]

We denote this relationship by $u \perp v$. Similarly, we say that $X, Y \subseteq V$ are orthogonal provided that

\[\innerprod{X}{Y} \coloneqq \{\innerprod{x}{y} \mid x \in X, y\in Y\} = \{0\}.\]

We denote this relationship by $X \perp Y$. [/definition]

Given any subspace of a vector space, it is very natural to ask if there is a complement to that subspace. Of even more interest are orthogonal complements.

[definition] [deftitle]Definition %counter% (Orthogonal complement)[/deftitle]

The orthogonal complement of a subset $X \subseteq V$ is the set

\[X^\perp \coloneqq \{v \in V \mid v \perp X\}.\]

[/definition]

[example] [extitle]Example %counter% ($0\oplus 0\oplus \RR \perp \RR^2 \oplus 0$)[/extitle]

A trivial example is that the subspace of $\RR^3$ spanned by $e_1$ and $e_2$ (standard basis vectors) has orthogonal complement $\span (e_3)$. [/example]

A couple of easy results:

[theorem] [thmtitle]Proposition %counter%[/thmtitle]

Let $X$ be a subset of an inner product space $V$. Then $X^\perp$ is a subspace of $V$. [/theorem]

[proof] Obviously $0 \in X^\perp$. Let $u, v \in X^\perp$, $x\in X$, and $a,b\in F$. Then

\[\innerprod{au + bv}{x} = a\innerprod{u}{x} + b\innerprod{v}{x} = 0.\]

Thus, $au + bv \in X^\perp$. [/proof]

[theorem] [thmtitle]Proposition %counter%[/thmtitle]

Let $S$ be a subspace of an inner product space $V$. Then $S \cap S^\perp = 0$. [/theorem]

[proof] It is immediate that $0 \in S \cap S^\perp$. Now we show that $0$ is the only element. Let $v \in S \cap S^{\perp}$. Then $v$ is orthogonal to itself, i.e.

\[\innerprod{v}{v} = 0.\]

This happens if and only if $v = 0$, so the claim follows. [/proof]

If we take the sum of two orthogonal subspaces, they have trivial intersection and so the sum is actually a direct sum. As we will see, this has very profound consequences.

[definition] [deftitle]Definition %counter% (Orthogonal direct sum)[/deftitle]

Let $V$ be an inner product space. We say that $V$ is an orthogonal direct sum of subspaces $S$ and $T$ if

\[V = S \oplus T, \quad S\perp T.\]

We denote this by $V = S \odot T$. More generally,

\[V = \bigodot_{i=1}^n S_i\]

provided that $V = \bigoplus_{i=1}^n S_i$ and the $S_i$ are pairwise orthogonal. [/definition]

[theorem] [thmtitle]Theorem %counter%[/thmtitle]

Let $V$ be an inner product space. Then $V = S\odot T$ if and only if $V = S \oplus T$ and $T = S^\perp$. [/theorem]

[proof] The $\impliedby$ direction is the definition of an orthogonal direct sum so we only need to show the $\implies$ direction. If $V = S\odot T$, then $T \subseteq S^\perp$. If $v \in S^\perp$, then it can be written

\[v = s + t\]

where $s \in S$ and $t\in T$. Now $s$ is orthogonal to both $v$ and $t$, so $s = v - t$ is orthogonal to itself, which means $s = 0$, forcing $v \in T$. Thus, $T = S^\perp$. [/proof]

Gram-Schmidt

Given a set of linearly independent vectors $a_1, \ldots, a_n$, a common goal is to find a list of pairwise orthogonal vectors $q_1, \ldots, q_n$ such that

\[\span(q_1, \ldots, q_k) = \span(a_1, \ldots, a_k)\]

for each $k\in[n]$. In the case where the vectors $a_1, \ldots, a_n$ are the columns of an $n \times n$ matrix, we are really computing

\[\mqty[&a_1 & \Bigg| & \cdots & \Bigg| & a_n&] = \mqty[&q_1 & \Bigg| & \cdots & \Bigg|& q_n&]\mqty[ r_{11} & r_{12} & \cdots & r_{1n} \\ & r_{22} & \cdots & r_{2n} \\ & & \ddots & \vdots \\ & & & r_{nn} ]\]

which is known as the QR factorization. Another reason for wanting to find the list $q_1, \ldots, q_n$ is that we can construct an orthogonal basis of the vector space, which we know to be extremely nice to work with.
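In practice one rarely forms this factorization by hand. Below is a minimal numpy sketch; note that `np.linalg.qr` returns a $Q$ with orthonormal (not merely orthogonal) columns, i.e. the normalized version of the $q_i$ above.

```python
import numpy as np

# QR factorization: A = QR with Q's columns orthonormal and R upper triangular.
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

Q, R = np.linalg.qr(A)
print(np.allclose(A, Q @ R))              # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: columns are orthonormal
print(np.allclose(R, np.triu(R)))         # True: R is upper triangular
```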

[definition] [deftitle]Definition %counter% (Orthogonal set)[/deftitle]

Let $V$ be an inner product space and let

\[X = \{q_i \mid i \in I\}\]

where $I$ is some indexing set. If the $q_i$’s are all pairwise orthogonal, then we say that $X$ is an orthogonal set. If, in addition, each $q_i$ is a unit vector, then $X$ is an orthonormal set. [/definition]

[theorem] [thmtitle]Theorem %counter% (Orthogonal sets are linearly independent)[/thmtitle]

Let $X$ be an orthogonal set of nonzero vectors in the inner product space $V$. Then $X$ is linearly independent. [/theorem]

[proof] Suppose that

\[\sum_{i=1}^r a_i q_i = 0\]

where each $q_i \in X$ and $a_i \in F$. Taking the inner product of both sides with $q_j$ gives

\[0 = \innerprod{\sum_{i=1}^r a_i q_i}{q_j} = a_j\innerprod{q_j}{q_j}.\]

Now, since $q_j\neq 0$, it follows that $a_j = 0$. Thus, $X$ is linearly independent. [/proof]

[theorem] [thmtitle]Theorem %counter% (Classical Gram-Schmidt Algorithm)[/thmtitle]

Let $V$ be an inner product space and let

\[\mathcal{B} \coloneqq (b_1, b_2, \ldots)\]

be a sequence of vectors in $V$. Define a sequence

\[\mathcal{Q} \coloneqq (q_1, q_2, \ldots)\]

by setting $q_1 = b_1$ and

\[q_k = b_k - \sum_{i=1}^{k-1} a_{k,i} q_i\]

for $k > 1$ where

\[a_{k,i} \coloneqq \begin{cases} 0 & \text{ if } q_i = 0, \\ \dfrac{\innerprod{b_k}{q_i}}{\innerprod{q_i}{q_i}} & \text{ if } q_i \neq 0. \end{cases}\]

Then $\mathcal{Q}$ is an orthogonal sequence of vectors in $V$ such that

\[\span(b_1, \ldots, b_k) = \span(q_1, \ldots, q_k)\]

for every $k > 0$. Furthermore, $q_k = 0$ if and only if $b_k \in \span(b_1, \ldots, b_{k-1})$. [/theorem]

The result is pretty easy to prove by induction and the latter claim holds by construction, so we omit a proof. A numerical sketch of the procedure appears below.
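Here is a minimal numpy sketch of the classical algorithm (the `inner` and `gram_schmidt` names are our own, and a floating-point tolerance stands in for the exact test $q_i = 0$):

```python
import numpy as np

# Classical Gram-Schmidt, following the theorem's recipe, for vectors in
# R^n or C^n with the standard inner product.
def inner(z, w):
    return np.sum(z * np.conj(w))

def gram_schmidt(bs):
    """Return pairwise orthogonal q's with span(q_1..q_k) = span(b_1..b_k)."""
    qs = []
    for b in bs:
        q = np.asarray(b, dtype=complex)
        for p in qs:
            if inner(p, p).real > 1e-12:      # a_{k,i} = 0 when q_i = 0
                q = q - (inner(b, p) / inner(p, p)) * p
        qs.append(q)
    return qs

bs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([2.0, 1.0, 1.0])]              # b_3 = b_1 + b_2
qs = gram_schmidt(bs)
print(abs(inner(qs[0], qs[1])))               # ~0: q_1 and q_2 are orthogonal
print(np.allclose(qs[2], 0))                  # True: q_3 = 0 flags the dependency
```

Our final result in this section is about Hilbert bases.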

[definition] [deftitle]Definition %counter% (Hilbert basis)[/deftitle]

Let $V$ be an inner product space. A maximal orthonormal set in $V$ is called a Hilbert basis for $V$. [/definition]

[theorem] [thmtitle]Theorem %counter% (Hilbert bases always exist)[/thmtitle]

Let $V$ be an inner product space. Then $V$ has a Hilbert basis. [/theorem]

[proof] Let

\[\mathcal{F} \coloneqq \{X \subseteq V \mid X \text{ is orthonormal}\}.\]

Clearly this family is nonempty: the empty set is vacuously orthonormal, as is any singleton $\{v\}$ with $\norm{v} = 1$. Now suppose that we have an increasing chain $(X_i)$ in $\mathcal{F}$ and consider

\[\mathcal{X} = \bigcup_i X_i.\]

If $u, v \in \mathcal{X}$, then, since the chain is totally ordered, they both lie in some single $X_i$. In particular, this means that $\innerprod{u}{v}$ is zero when $u \neq v$ and $1$ when $u = v$. So $\mathcal{X}$ itself is in $\mathcal{F}$. This implies that every chain has an upper bound in $\mathcal{F}$, so Zorn’s lemma yields a maximal element of $\mathcal{F}$, as desired. [/proof]

Fourier Expansion

Perhaps the most useful feature (in an applications sense) of an orthonormal basis is that one can do a Fourier expansion as follows. Say $\mathcal{Q} = \{q_1, \ldots, q_n\}$ is an orthonormal basis of $V$ and $v\in V$. Then

\[v = \sum_{i=1}^n c_i q_i\]

where each $c_i \in F$. We can determine each $c_i$. Take the inner product of both sides with respect to $q_j$ to obtain

\[\innerprod{v}{q_j} = \sum_{i=1}^n c_i \innerprod{q_i}{q_j} = c_j \innerprod{q_j}{q_j} = c_j.\]

Thus, it follows that

\[c_j = \innerprod{v}{q_j}.\]

The $c_j$’s are called the Fourier coefficients of $v$ with respect to $\mathcal{Q}$ and the expansion

\[v = \sum_{i=1}^n \innerprod{v}{q_i} q_i\]

is called the Fourier expansion of $v$ with respect to $\mathcal{Q}$.
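Here is a minimal numpy sketch of the expansion in $\RR^3$ (using `np.linalg.qr` only to manufacture an orthonormal basis):

```python
import numpy as np

# Fourier expansion: with an orthonormal basis q_1, q_2, q_3 (the columns
# of Q), the coefficients c_j = <v, q_j> reconstruct v exactly.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

v = np.array([1.0, -2.0, 0.5])
c = np.array([np.dot(v, Q[:, j]) for j in range(3)])          # c_j = <v, q_j>
print(np.allclose(v, sum(c[j] * Q[:, j] for j in range(3))))  # True
```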

Riesz Representation Theorem

A classical problem of analysis is to understand what kind of objects the elements of the continuous dual space are. This is of importance as linear functionals are ubiquitous throughout mathematics. A special case arises if we have a finite-dimensional vector space in which case the continuous and algebraic duals coincide.

[theorem] [thmtitle]Theorem %counter% (Riesz Representation Theorem)[/thmtitle]

Let $V$ be a finite-dimensional inner product space. Then the map

\[\begin{align*} T:V &\to V^* \\ v &\mapsto \innerprod{-}{v} \end{align*}\]

is a conjugate isomorphism (i.e. a conjugate linear bijection). [/theorem]

[proof] (Conjugate linearity) Immediate since the inner product is conjugate linear in the second argument.

(Injectivity) Suppose that $\innerprod{-}{v} = \innerprod{-}{u}$. Then, for all $x \in V$, we have

\[\innerprod{x}{v} = \innerprod{x}{u}\]

but this implies that $v = u$ by (the conjugate of) the earlier lemma.

(Surjectivity) Let $L$ be a linear functional on $V$. Picking out an orthonormal basis $b_1, \ldots, b_n$ for $V$ gives the dual basis $L_1, \ldots, L_n$ with the property that

\[L_i(b_j) = \delta_{ij}\]

where $\delta_{ij}$ is the Kronecker delta. By the Fourier expansion, an arbitrary $u \in V$ can be written

\[u = \innerprod{u}{b_1}b_1 + \cdots + \innerprod{u}{b_n}b_n,\]

so $L_i(u) = \innerprod{u}{b_i}$. Write

\[L = d_1L_1 + \cdots + d_nL_n.\]

Then

\[\begin{align*} L(u) &= d_1\innerprod{u}{b_1} + \cdots + d_n\innerprod{u}{b_n} \\ &= \innerprod{u}{\conj{d_1}b_1} + \cdots + \innerprod{u}{\conj{d_n}b_n} \\ &= \innerprod{u}{\conj{d_1}b_1 + \cdots + \conj{d_n}b_n}. \end{align*}\]

So setting $v = \conj{d_1}b_1 + \cdots + \conj{d_n}b_n$ yields

\[L(u) = \innerprod{u}{v}\]

for every $u \in V$, and we have surjectivity! [/proof]
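On $\CC^n$ with the standard inner product, the proof's recipe is easy to carry out explicitly; here is a minimal sketch (assuming numpy, with the standard basis playing the role of the orthonormal basis $b_1, \ldots, b_n$):

```python
import numpy as np

# Riesz representation on C^n: the functional L(x) = sum_i d_i x_i equals
# <-, v> for the vector v = conj(d), exactly as in the proof.
d = np.array([1 + 1j, 2.0, -1j])          # coefficients of L in the dual basis
L = lambda x: np.sum(d * x)

v = np.conj(d)                            # representing vector from the proof
x = np.array([3.0, 1j, 1 - 1j])
print(L(x))                               # apply the functional
print(np.sum(x * np.conj(v)))             # <x, v>: the same value
```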

Worked Exercises

[example] [extitle]Cooperstein Advanced Linear Algebra Exercise 5.1.8[/extitle]

Let $V$ be an inner product space. Prove that the vectors $v_1, \ldots, v_n$ are linearly independent if and only if

\[A \coloneqq \left[\innerprod{v_i}{v_j}\right]_{i, j\in [n]}\]

is invertible. [/example]

[proof] Recall that $A$ is singular if and only if its columns are linearly dependent. Keeping this in mind, consider a linear combination of the columns:

\[a_1\mqty[\innerprod{v_1}{v_1} \\ \vdots \\ \innerprod{v_1}{v_n}] + \cdots + a_n\mqty[\innerprod{v_n}{v_1} \\ \vdots \\ \innerprod{v_n}{v_n}] = 0.\]

The left side can be realized as

\[\mqty[\innerprod{a_1v_1}{v_1} \\ \vdots \\ \innerprod{a_1v_1}{v_n}] + \cdots + \mqty[\innerprod{a_nv_n}{v_1} \\ \vdots \\ \innerprod{a_nv_n}{v_n}] = \mqty[\innerprod{\sum_{i=1}^n a_iv_i}{v_1} \\ \vdots \\ \innerprod{\sum_{i=1}^n a_iv_i}{v_n}].\]

So it follows that

\[\innerprod{\sum_{i=1}^n a_iv_i}{v_j} = 0\]

for every $j\in [n]$. Multiplying each equation by $\conj a_j$ gives

\[\innerprod{\sum_{i=1}^n a_iv_i}{a_jv_j} = 0\]

and adding all the equations together gives

\[\innerprod{\sum_{i=1}^n a_iv_i}{\sum_{i=1}^n a_iv_i} = 0.\]

Thus, $\sum_{i=1}^n a_iv_i = 0$. So a nontrivial dependency among the columns of $A$ yields a nontrivial dependency among the $v_i$’s; contrapositively, if the $v_i$ are linearly independent, then $A$ is invertible. Running through the argument in reverse gives the converse direction. [/proof]
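A minimal numpy sketch of the exercise (the `gram` helper is our own; `np.vdot` conjugates its first argument, so the arguments are ordered to match $A_{ij} = \innerprod{v_i}{v_j}$):

```python
import numpy as np

# The Gram matrix A = [<v_i, v_j>] is invertible iff the v_i are independent.
def gram(vs):
    # A[i, j] = <v_i, v_j>; np.vdot conjugates its first argument
    return np.array([[np.vdot(vj, vi) for vj in vs] for vi in vs])

indep = [np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
dep = indep + [indep[0] + indep[1]]            # third vector is a dependency

print(np.linalg.matrix_rank(gram(indep)))      # 2: full rank, invertible
print(np.linalg.matrix_rank(gram(dep)))        # 2 < 3: singular
```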

[example] [extitle]Cooperstein Advanced Linear Algebra Exercise 5.2.14 (Modified)[/extitle]

Let $u, v$ be vectors in an inner product space $V$ and assume that $\norm{u + v} = \norm{u} + \norm{v}$. Prove that for all $c, d \in \RR$ that

\[\norm{cu + dv} = |c\norm{u} + d\norm{v}|.\]

Note: The original exercise asked the reader to prove that $\norm{cu + dv}^2 = c^2\norm{u}^2 + d^2\norm{v}^2$ which is a false statement. I took the liberty of slightly modifying the original exercise into one that is actually a true statement. [/example]

[proof] Since $\norm{u + v} = \norm{u} + \norm{v}$, it follows that

\[\norm{u + v}^2 = \norm{u}^2 + 2\norm{u}\norm{v} + \norm{v}^2.\]

But as $\innerprod{u+v}{u+v} = \norm{u+v}^2$ and

\[\innerprod{u+v}{u+v} = \norm{u}^2 + 2\Re\innerprod{u}{v} + \norm{v}^2,\]

it follows that $\Re\innerprod{u}{v} = \norm{u}\norm{v}$. So then

\[\begin{align*} \norm{cu + dv}^2 &= \innerprod{cu + dv}{cu + dv} \\ &= c^2 \norm{u}^2 + 2\Re(cd\innerprod{u}{v}) + d^2\norm{v}^2 \\ &= c^2 \norm{u}^2 + 2cd\norm{u}\norm{v} + d^2\norm{v}^2 \\ &= (c\norm{u} + d\norm{v})^2 \end{align*}\]

Taking square roots of both sides yields $\norm{cu + dv} = |c\norm{u} + d\norm{v}|$, as desired. [/proof]

[example] [extitle]Cooperstein Advanced Linear Algebra Exercise 5.1.16[/extitle]

Let $V$ be an inner product space, $x \in V$ a unit vector, and $y \in V$. Prove $\innerprod{y}{x}\innerprod{x}{y} \le \innerprod{y}{y}$. [/example]

[proof] Note that

\[\innerprod{y}{x}\innerprod{x}{y} = \conj{\innerprod{x}{y}}\innerprod{x}{y} = |\innerprod{x}{y}|^2.\]

Furthermore, Cauchy-Schwarz tells us that

\[|\innerprod{x}{y}| \le \sqrt{\innerprod{x}{x}}\sqrt{\innerprod{y}{y}}.\]

Thus, since $x$ is a unit vector (so $\innerprod{x}{x} = 1$), it follows that

\[|\innerprod{x}{y}|^2 \le \innerprod{y}{y}\]

as desired. [/proof]

[example] [extitle]Axler Linear Algebra Done Right Exercise 6.A.6[/extitle]

Let $V$ be an inner product space over $F$ (where $F = \RR$ or $F = \CC$). Prove that $\innerprod{u}{v} = 0$ if and only if

\[\norm{u} \le \norm{u + av}\]

for all $a \in F$. [/example]

[proof] ($\implies$) Note that

\[\begin{align*} \norm{u + av}^2 &= \innerprod{u + av}{u + av} \\ &= \norm{u}^2 + 2\Re(\conj{a}\innerprod{u}{v}) + |a|^2\norm{v}^2 \\ &= \norm{u}^2 + |a|^2\norm{v}^2 \\ &\ge \norm{u}^2 \end{align*}\]

as desired.

($\impliedby$) Squaring both sides of the inequality and translating to inner products gives

\[0 \le 2\Re(\conj{a}\innerprod{u}{v}) + |a|^2\innerprod{v}{v}.\]

Now we set $a = re^{i\theta}$ where $r > 0$ (notice that this is okay because if we’re in a real inner product space, we can restrict $\theta$ to the values $0$ and $\pi$). This gives

\[0 \le 2r\Re(e^{-i\theta}\innerprod{u}{v}) + r^2\innerprod{v}{v}.\]

Dividing through by $r$ and then taking the limit as $r\downarrow 0$ gives

\[0 \le \Re(e^{-i\theta}\innerprod{u}{v}).\]

Now write $\innerprod{u}{v} \eqqcolon x + iy$. Setting $\theta = \pi$ gives $0 \le -x$, while $\theta = 0$ gives $0 \le x$; thus $x = 0$. Likewise, setting $\theta = \pm\pi/2$ gives $0 \le \pm y$, so $y = 0$ as well. Thus, $\innerprod{u}{v} = 0$ and we are done. [/proof]

[example] [extitle]Axler Linear Algebra Done Right Exercise 6.B.16[/extitle]

Suppose $V$ is finite-dimensional. Suppose $\innerprod{-}{-}_1$ and $\innerprod{-}{-}_2$ are inner products on $V$ with corresponding norms $\norm{-}_1$ and $\norm{-}_2$. Prove that there exists a positive number $c$ such that $\norm{v}_1 \le c\norm{v}_2$ for every $v \in V$. [/example]

[proof] Let $q_1, \ldots, q_n$ be an orthonormal basis of $V$ with respect to $\innerprod{-}{-}_2$. Then if $v = \sum_{i=1}^n a_i q_i$, it follows that

\[\norm{v}_2 = \sqrt{\innerprod{v}{v}_2} = \sqrt{\sum_{i=1}^n |a_i|^2} \ge M\]

where $M = \max_{i\in [n]} |a_i|$. So then we have

\[\begin{align*} \norm{v}_1 &= \nrm{\sum_{i=1}^n a_i q_i}_1 \\ &\le \sum_{i=1}^n |a_i| \norm{q_i}_1 \\ &\le M\sum_{i=1}^n\norm{q_i}_1 \\ &\le \norm{v}_2\sum_{i=1}^n\norm{q_i}_1. \end{align*}\]

Thus, $c \coloneqq \sum_{i=1}^n \norm{q_i}_1$ works. [/proof]