
Orthogonal Complement

In Gram-Schmidt Orthonormalization, we looked at orthogonal vectors when we created a basis for a vector space. Generalizing this, we saw that two subspaces of a vector space are orthogonal if the inner product of any vector in the first subspace with any vector in the second subspace is 0. This led to a natural question: given a subspace of a vector space \(V\), can we find the largest possible subspace of \(V\) orthogonal to it?

Orthogonal Complement

As a reminder, we defined the orthogonal complement of a subspace \(S\) of a vector space \(V\) as
\begin{align*}
S^{\perp}=\left\{\mathbf{u} \in V: \langle \mathbf{u}, \mathbf{v} \rangle =0 \text{ for all } \mathbf{v} \in S\right\}.
\end{align*}
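For a quick, concrete illustration of this definition (a small example of my own, not from the earlier discussion): in \(\mathbb{R}^{2}\) with the dot product, let \(S=span(\left\{(1,0)\right\})\). A vector \(\mathbf{u}=(u_{1},u_{2})\) is in \(S^{\perp}\) exactly when \(\langle \mathbf{u},(1,0) \rangle =u_{1}=0\), so
\begin{align*}
S^{\perp}=span(\left\{(0,1)\right\}),
\end{align*}
the largest subspace of \(\mathbb{R}^{2}\) orthogonal to \(S\).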

Now, let \(S=span(\left\{(1,2,1,0),(1,2,2,2)\right\})\) in \(\mathbb{R}^{4}\), where the inner product is the dot product. We want to find \(S^{\perp}\).

Orthogonal Vectors

In order to find the orthogonal complement, we first need to recall that if a vector is orthogonal to each vector in a set \(S\), then the vector is orthogonal to all linear combinations of these vectors. That is, if we want to find vectors orthogonal to everything in \(S\), we need only find all the vectors orthogonal to both the vectors \((1,2,1,0)\) and \((1,2,2,2)\).
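If you would like to test this idea numerically, here is a minimal Python sketch (the function name in_orthogonal_complement and the tolerance are my own choices, not anything from the post itself). It checks a candidate vector against the spanning set, which, by the linearity just described, is enough to decide membership in \(S^{\perp}\):

```python
import numpy as np

def in_orthogonal_complement(u, spanning_vectors, tol=1e-10):
    """Return True if u is orthogonal to every spanning vector.

    By linearity of the dot product, this is equivalent to u being
    orthogonal to every linear combination of the spanning vectors,
    i.e., to every vector in S.
    """
    return all(abs(np.dot(u, v)) < tol for v in spanning_vectors)

spanning = [np.array([1, 2, 1, 0]), np.array([1, 2, 2, 2])]
print(in_orthogonal_complement(np.array([2, -1, 0, 0]), spanning))  # True
print(in_orthogonal_complement(np.array([1, 0, 0, 0]), spanning))   # False
```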

Now, if \(\mathbf{v} \in \mathbb{R}^{4}\), then \(\mathbf{v}\) is orthogonal to both of these vectors if and only if its dot product with each of them is \(0\). This gives us the matrix equation
\begin{align*}
\begin{bmatrix}
1 & 2 & 1 & 0 \\
1 & 2 & 2 & 2
\end{bmatrix}\begin{bmatrix}
v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end{bmatrix} =\begin{bmatrix} 0 \\ 0 \end{bmatrix}.
\end{align*}
That is, if we want to find all vectors orthogonal to these two vectors, we just need to find the null space of the matrix
\begin{align*}
\begin{bmatrix}
1 & 2 & 1 & 0 \\
1 & 2 & 2 & 2
\end{bmatrix}
\end{align*}
where the vectors that generate \(S\) become the rows of the matrix.
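We will carry out this computation by hand in the next section; if you want to check the answer by machine afterwards, here is a short SymPy sketch of mine (not part of the solution itself) that computes the same null space in one step:

```python
from sympy import Matrix

# The rows are the vectors that span S.
A = Matrix([[1, 2, 1, 0],
            [1, 2, 2, 2]])

# A basis for the null space of A, i.e., for S-perp.
# Compare these vectors (up to order and scaling) with the
# hand computation below.
for v in A.nullspace():
    print(v.T)
```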

Null Space

Recall that we found null spaces in Row, Column and Null Space, so please review that post if you would like more practice. Here, we row reduce our matrix and then find the vectors that solve the resulting system of equations.

We see that
\begin{align*}
\begin{bmatrix}
1 & 2 & 1 & 0 \\
1 & 2 & 2 & 2
\end{bmatrix}
\end{align*}
row reduces to
\begin{align*}
\begin{bmatrix}
1 & 2 & 0 & -2 \\
0 & 0 & 1 & 2
\end{bmatrix}.
\end{align*}
Hence, the free variables are \(v_{2}\) and \(v_{4}\), so we let these be \(s\) and \(t\), respectively. Back-substituting, we find that \(v_{1}=-2s+2t\) and \(v_{3}=-2t\), so an arbitrary vector in the null space has the form \((-2s+2t,s,-2t,t)\).
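As a sanity check on the row reduction and the back-substitution (again a verification sketch of mine, assuming SymPy, rather than part of the solution), we can reproduce the reduced matrix and confirm that the parametric vector really is sent to zero:

```python
from sympy import Matrix, symbols

A = Matrix([[1, 2, 1, 0],
            [1, 2, 2, 2]])

R, pivots = A.rref()
print(R)       # Matrix([[1, 2, 0, -2], [0, 0, 1, 2]])
print(pivots)  # (0, 2): v1 and v3 are pivot variables, v2 and v4 are free

# The parametric solution found by back-substitution:
s, t = symbols('s t')
v = Matrix([-2*s + 2*t, s, -2*t, t])
print(A * v)   # Matrix([[0], [0]]) for every choice of s and t
```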

In order to find a basis for the space, we first let \(s=0\) and \(t=1\) and get the vector \((2,0,-2,1)\). Then, letting \(s=1\) and \(t=0\), we get \((-2,1,0,0)\). Hence, we find that
\begin{align*}
S^{\perp}=span(\left\{(2,0,-2,1),(-2,1,0,0)\right\}).
\end{align*}
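It is worth confirming the final answer: each basis vector of \(S^{\perp}\) should have dot product \(0\) with each basis vector of \(S\). One last quick check (my own sketch, assuming NumPy):

```python
import numpy as np

S_basis      = [np.array([1, 2, 1, 0]), np.array([1, 2, 2, 2])]
S_perp_basis = [np.array([2, 0, -2, 1]), np.array([-2, 1, 0, 0])]

# Every pairwise dot product between the two bases should be 0.
for u in S_perp_basis:
    for v in S_basis:
        print(u, '.', v, '=', np.dot(u, v))  # prints 0 each time
```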

Conclusion

As always, I hope this helped and that you enjoyed the process. If you did, make sure to let other people know about these resources and subscribe to our YouTube channel.
