I'm trying to find a vector $\vec{c}$ which is orthogonal to the vectors $\vec{a}$ and $\vec{b}$.
As far as I understand, I have to show that:
$$\langle a,c\rangle=0 $$ $$\langle b,c\rangle=0 $$
So if I would like to determine a vector orthogonal to $\begin{bmatrix}-1\\1\end{bmatrix}$, I just intuitively use $$\langle v,w\rangle=1 \cdot(-1)+1\cdot 1=0 $$ to arrive at $\begin{bmatrix}1\\1\end{bmatrix}$. My problem is that I don't know a mechanical way to solve for an orthogonal vector; it was more of an educated guess.
For example, given $\vec{a} = \begin{bmatrix}-1\\1\\1\end{bmatrix}$ and $\vec{b} = \begin{bmatrix}\sqrt{2}\\1\\-1\end{bmatrix}$, how do I find an orthogonal vector?
Thank you in advance.
3 Answers
Given $m$ vectors $v_1, v_2, \ldots, v_m$ in $\mathbb R^n$, a vector orthogonal to all of them is any vector $x$ that solves the matrix equation
$$\begin{pmatrix}v_1^T \\ v_2^T \\ \vdots \\ v_m^T\end{pmatrix} x = 0.$$
To put this a bit more concretely, suppose
$$v_1 = \begin{pmatrix}v_{11} \\ v_{12} \\ \vdots \\ v_{1n}\end{pmatrix},\quad v_2 = \begin{pmatrix}v_{21} \\ v_{22} \\ \vdots \\ v_{2n}\end{pmatrix},\ \ldots,\quad v_m = \begin{pmatrix}v_{m1} \\ v_{m2} \\ \vdots \\ v_{mn}\end{pmatrix},\ \mbox{and}\quad x = \begin{pmatrix}x_{1} \\ x_{2} \\ \vdots \\ x_{n}\end{pmatrix}$$
where the numbers $v_{ij} \in \mathbb R$ are all known and the numbers $x_i \in \mathbb R$ are all unknown. Then the matrix equation above can also be written
$$ \begin{pmatrix} v_{11} & v_{12} & \cdots & v_{1n} \\ v_{21} & v_{22} & \cdots & v_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ v_{m1} & v_{m2} & \cdots & v_{mn} \end{pmatrix} \begin{pmatrix}x_1 \\ x_2 \\ \vdots \\ x_n\end{pmatrix} = \begin{pmatrix}0 \\ 0 \\ \vdots \\ 0\end{pmatrix}.$$
This is equivalent to the system of linear equations$$ \begin{array}{ccccccccl} v_{11}x_1 &+& v_{12}x_2 &+& \cdots &+& v_{1n}x_n &=& 0, \\ v_{21}x_1 &+& v_{22}x_2 &+& \cdots &+& v_{2n}x_n &=& 0, \\ \vdots&&\vdots&&\ddots&&\vdots&&\vdots \\ v_{m1}x_1 &+& v_{m2}x_2 &+& \cdots &+& v_{mn}x_n &=& 0. \end{array}$$
That is, you need to solve a linear system of $m$ equations with $n$ unknowns. This is something you can do using row reduction.
The solution will never be unique; if the vector $x$ is a solution, then $cx$ is also a solution for any scalar constant $c$. If the $m$ vectors include fewer than $n-1$ independent vectors, the solution is not even unique up to a scalar constant; there are then solutions pointing in different directions that are all orthogonal to the given vectors. If $m \geq n$ there may be no nonzero solution at all, since the $m$ vectors may span $\mathbb R^n$. There will, however, be nonzero solutions as long as the set of given vectors does not contain $n$ independent vectors.
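The row-reduction procedure described above can be sketched in code. This is only a sketch (the function name `orthogonal_vector` is my own, not from the answer); it row-reduces $V x = 0$ to reduced echelon form, sets one free variable to $1$, and back-substitutes. Exact rational arithmetic via `fractions` keeps the example free of rounding issues, so it assumes rational inputs:

```python
from fractions import Fraction

def orthogonal_vector(vectors):
    """Return a nonzero x with <v, x> = 0 for every given v, by
    row-reducing the m x n system V x = 0; None if none exists."""
    m, n = len(vectors), len(vectors[0])
    A = [[Fraction(v) for v in row] for row in vectors]
    pivots = []
    r = 0
    for c in range(n):
        # Find a pivot in column c, at or below row r.
        p = next((i for i in range(r, m) if A[i][c] != 0), None)
        if p is None:
            continue  # column c has no pivot: x_c is a free variable
        A[r], A[p] = A[p], A[r]
        A[r] = [a / A[r][c] for a in A[r]]          # scale pivot to 1
        for i in range(m):
            if i != r and A[i][c] != 0:             # clear column c elsewhere
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    free = [c for c in range(n) if c not in pivots]
    if not free:
        return None  # the given vectors span R^n: only x = 0 solves
    # Set the first free variable to 1, the rest to 0; back-substitute.
    x = [Fraction(0)] * n
    x[free[0]] = Fraction(1)
    for row, c in zip(A, pivots):
        x[c] = -sum(row[j] * x[j] for j in free)
    return x
```

For instance, `orthogonal_vector([[-1, 1, 1], [2, 1, -1]])` yields a vector with zero inner product against both inputs.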
In your particular case, if you are not aware that the cross product of two independent vectors in $\mathbb R^3$ is orthogonal to each of those vectors, you have
$$v_1 = \begin{pmatrix}v_{11}\\v_{12}\\v_{13}\end{pmatrix} = \begin{pmatrix}-1\\1\\1\end{pmatrix} \quad \mbox{and} \quad v_2 = \begin{pmatrix}v_{21}\\v_{22}\\v_{23}\end{pmatrix} = \begin{pmatrix}\sqrt{2}\\1\\-1\end{pmatrix},$$
so you could solve the system of equations
$$\begin{eqnarray} -1\cdot x_1 + 1\cdot x_2 + 1 \cdot x_3 &=& 0, \\ \sqrt{2}\cdot x_1 + 1\cdot x_2 - 1 \cdot x_3 &=& 0. \end{eqnarray}$$
Blindly applying the methods I was taught in high school, I find this is equivalent to
$$\begin{array}{ccccccl} x_1 &-& x_2 &-& x_3 &=& 0, \\ &&\left(1+\sqrt{2}\right)x_2 &+&\left(-1+\sqrt{2}\right) x_3 &=& 0. \end{array}$$
At this point we can make an arbitrary choice for $x_3$ and proceed to solve the remaining equations as a system of two equations in two unknowns.
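For instance, taking $x_3 = 1+\sqrt{2}$ (an arbitrary but convenient choice) and back-substituting:
$$x_2 = \frac{\left(1-\sqrt{2}\right)x_3}{1+\sqrt{2}} = 1-\sqrt{2}, \qquad x_1 = x_2 + x_3 = 2,$$
so $x = \begin{pmatrix}2\\1-\sqrt{2}\\1+\sqrt{2}\end{pmatrix}$. One can verify directly that $\langle v_1, x\rangle = -2 + \left(1-\sqrt{2}\right) + \left(1+\sqrt{2}\right) = 0$ and $\langle v_2, x\rangle = 2\sqrt{2} + \left(1-\sqrt{2}\right) - \left(1+\sqrt{2}\right) = 0$.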
Take an arbitrary vector $$c=\begin{bmatrix}x\\y\\z\end{bmatrix}$$ then write down the equations in terms of $x$, $y$, $z$:
$$\langle a,c\rangle = 0;\langle b,c\rangle = 0$$
Provided $a$ and $b$ are independent, the solutions to this system all lie on one line through the origin, and any nonzero vector on that line is the perpendicular vector you want.
Easy example: Let $$a= \begin{bmatrix}1\\0\\0\end{bmatrix}; b= \begin{bmatrix}0\\1\\0\end{bmatrix}$$
Then set $\langle a,c\rangle=x = 0$ and $\langle b,c\rangle=y = 0$. So we conclude that $$c=\begin{bmatrix}0\\0\\z\end{bmatrix}$$ is perpendicular to both $a$ and $b$ for any $z\in\mathbb{R}$.
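In $\mathbb R^3$ specifically, this system can be bypassed: the cross product $a \times b$ is automatically orthogonal to both $a$ and $b$ (as the first answer notes). A minimal sketch using the asker's vectors (variable names are mine):

```python
import math

def cross(a, b):
    # The cross product a x b is orthogonal to both a and b in R^3.
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

a = [-1.0, 1.0, 1.0]
b = [math.sqrt(2), 1.0, -1.0]
c = cross(a, b)  # <a, c> and <b, c> are both (numerically) zero
```

Here `c` works out to $\left(-2,\ \sqrt{2}-1,\ -1-\sqrt{2}\right)$, and both inner products vanish up to floating-point rounding.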
I discovered a quick ($O(d^2)$) algorithm to generate $d-1$ mutually orthogonal vectors that are all perpendicular to $\vec{x}$, where $d$ is the size of $\vec{x}$. I found it while working on a semi-related problem: I needed to generate $d-2$ mutually orthogonal vectors perpendicular to both $\vec{u}$ and $\vec{v}$ (where $\vec{u}$ and $\vec{v}$ are themselves perpendicular); unfortunately the cross product does not exist when $d$ is not $3$. The alternative would be to apply Gram-Schmidt, which would take $O(d^3)$.
I made use of something called a Householder transform, $H = I - 2nn^T$ (with $\|n\|=1$), which conceptually works like a reflection across a hyperplane through the origin, with $n$ the unit vector normal to the hyperplane. The general idea is to reflect a set of $d$ orthonormal basis vectors (an easy choice is the standard basis, i.e. the columns of the $d \times d$ identity matrix $I_d$) such that one of the reflected column vectors coincides with $x$.
Step 1: normalize $x$ (divide $x$ by $\|x\|$).
Step 2: let $n_1= \sqrt{\frac{1-x_1}{2}}$ and $n_j= \frac{-x_j}{\sqrt{2(1-x_1)}}$ for $j \in [2..d]$ (this assumes $x_1 \neq 1$; if $x = e_1$, the remaining standard basis vectors already do the job).
Step 3: calculate $H = I - 2nn^T$.
Step 4: columns $2$ to $d$ of $H$ are orthogonal to $x$ (and to each other).
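The four steps above can be sketched as follows (the function name is my own; the early return for $x$ already equal to $e_1$ is the guard mentioned in step 2, since the formulas divide by $\sqrt{1-x_1}$):

```python
import math

def householder_orthogonal_basis(x):
    """Return d-1 mutually orthogonal vectors, all perpendicular to x,
    taken as columns 2..d of the Householder reflection H = I - 2 n n^T."""
    d = len(x)
    # Step 1: normalize x.
    norm = math.sqrt(sum(v * v for v in x))
    x = [v / norm for v in x]
    # Guard: if x is (numerically) e_1, 1 - x_1 vanishes; the remaining
    # standard basis vectors e_2, ..., e_d already work.
    if 1 - x[0] < 1e-12:
        return [[1.0 if i == j else 0.0 for i in range(d)]
                for j in range(1, d)]
    # Step 2: build the unit normal n of the reflecting hyperplane.
    n = [math.sqrt((1 - x[0]) / 2)]
    n += [-xj / math.sqrt(2 * (1 - x[0])) for xj in x[1:]]
    # Steps 3-4: H = I - 2 n n^T; its first column is x, and columns 2..d
    # are mutually orthogonal vectors perpendicular to x.
    return [[(1.0 if i == j else 0.0) - 2 * n[i] * n[j] for i in range(d)]
            for j in range(1, d)]
```

For example, `householder_orthogonal_basis([1.0, 2.0, 2.0])` returns two vectors, each perpendicular to $(1, 2, 2)$ and to each other, with no Gram-Schmidt pass needed.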
Proof: Since, the column vectors of $I_d$ are mutually orthogonal, it follows that the column vectors of the reflection of $I_d$ would also be mutually orthogonal. Hence all we need to do is to design $H$ such that we have the following property $HI = H = [x \text{ | } V_{2..d} ] $, giving us $V_2, V_3 ... V_d$ mutually-orthogonal to $x$.
To construct $H$, we require the first column of $H$ to equal $x$, i.e. $x_i = I_{i,1} - 2n_in_1$. Simplifying,
$x_1 = 1 - 2n_1^2$,
Solving for $n_1$, we get $n_1= \sqrt{\frac{1-x_1}{2}}$.
Similarly,
$x_j = -2n_1n_j$ for $j \in [2.. d]$,
Solving for $n_j$ we get $n_j= \frac{-x_j}{\sqrt{2(1-x_1)}}$
Finally, a proof that $V_2, \ldots, V_d$ are each orthogonal to $x$:
$ V_{i,j} = I_{i,j} -2n_jn_i $
when $i=1$, $ V_{1,j} = -2n_jn_1 = -2 \frac{-x_j}{2n_1}n_1 = x_j$
when $i=j$, $ V_{j,j} = 1-2n_jn_j = 1-2 \frac{x_j^2}{2(1-x_1)} = \frac{1 -x_1 - x_j^2}{1-x_1}$
when $i \neq j, i \neq 1$, $ V_{i,j} = -2n_in_j = -2 \frac{x_jx_i}{2(1-x_1)} = \frac{-x_ix_j}{1-x_1}$
To show that $V_j$ is orthogonal to $x$:
$x^TV_j = \sum_{i=1}^d V_{i,j}x_i$
$= x_jx_1 + \dfrac{x_j -x_jx_1- x_j^3}{1-x_1}+\sum_{i\neq j, i\geq 2}^d \dfrac{-x_jx_i^2}{1-x_1}$
$=\dfrac{x_jx_1-x_jx_1^2}{1-x_1} + \dfrac{x_j -x_jx_1- x_j^3}{1-x_1}-\sum_{i\neq j, i\geq 2}^d \dfrac{x_jx_i^2}{1-x_1} $
$= \dfrac{x_j}{1-x_1}(1-\sum_{i=1}^dx_i^2)=0$
Since $\|x\| = 1$ from step 1, we have $\sum_{i=1}^d x_i^2 = 1$, so the last factor vanishes.
(Or we can simply exploit the fact that, by construction, $HH^T=H^TH=I$, which implies the column vectors are mutually orthogonal to each other...)