Can we decompose real matrices in a way that is analogous to polar decomposition in the complex case?
What I mean is: given an invertible real matrix $M$, can we always write: $$ M = OP, $$
maybe uniquely, where $O$ is orthogonal, and $P$ is symmetric positive-definite?
Thanks.
3 Answers
The answer is yes. In fact, we don't need the spectral theorem to prove it.
Suppose that $M$ is a real invertible matrix. Then $M^TM$ is positive definite and has a unique positive definite square root $P = \sqrt{M^TM}$. Now, note that $P$ has the property $\|Px\| = \|Mx\|$ for all vectors $x$, since $\|Px\|^2 = x^TP^2x = x^TM^TMx = \|Mx\|^2$.
Since $M$ is invertible, $P$ is invertible as well, and we have $M = (MP^{-1})P$. The factor $MP^{-1}$ is orthogonal: every vector can be written as $y = Px$, and$$ \|MP^{-1}y\| = \|Mx\| = \|Px\| = \|y\|. $$Thus we have a polar decomposition with $O = MP^{-1}$.
This decomposition is unique. Indeed, suppose $M = OP$ for orthogonal $O$ and positive definite $P$. Then$$ M^TM = PO^TOP = P^2, $$so by the uniqueness of positive definite square roots, $P$ is uniquely determined. Rearranging $M = OP$ then shows that $O = MP^{-1}$ is uniquely determined as well.
Things get a bit trickier when $M$ is not invertible, but we can still guarantee a (non-unique) polar decomposition.
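For concreteness, the construction above can be sketched in a few lines of NumPy (the function name `polar_decompose` and the example matrix are my choices, not from the answer):

```python
import numpy as np

def polar_decompose(M):
    """Right polar decomposition M = O @ P for an invertible real M.

    P = sqrt(M^T M) is built from the eigendecomposition of the
    symmetric positive definite matrix M^T M; then O = M P^{-1}.
    """
    S = M.T @ M                        # symmetric positive definite
    w, V = np.linalg.eigh(S)           # real eigenvalues w > 0
    P = V @ np.diag(np.sqrt(w)) @ V.T  # unique SPD square root
    O = M @ np.linalg.inv(P)
    return O, P

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
O, P = polar_decompose(M)
print(np.allclose(O @ P, M))            # True: M = OP
print(np.allclose(O.T @ O, np.eye(2)))  # True: O is orthogonal
print(np.allclose(P, P.T))              # True: P is symmetric
```

Note that $O^TO = P^{-1}M^TMP^{-1} = P^{-1}P^2P^{-1} = I$, which is exactly the orthogonality check above.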
A derivation of the polar decomposition for a 2×2 matrix can be found on Polar Decomposition.
I also elaborate on the relation with the SVD. In short, shuffling the SVD factors around:
$$A = U \Sigma V^T = (UV^T)(V \Sigma V^T) = R_{\theta}\, S_{A^T}$$
$S_{A^T} = V\Sigma V^T$ scales along the principal axes of the ellipse defined by $A^T(\text{unit circle})$
$R_{\theta} = UV^T$ rotates over the angle between the principal axes of $A$ and $A^T$
Grouping the factors the other way gives the left polar decomposition:
$$A = (U \Sigma U^T)(UV^T) = S_A\, R_{\theta}$$
$S_{A} = U\Sigma U^T$ scales along the principal axes of the ellipse defined by $A(\text{unit circle})$
$R_{\theta}$ rotates over the same angle between the principal axes of $A$ and $A^T$
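Both groupings of the SVD factors are easy to check numerically; here is a short NumPy sketch (the example matrix is my own choice):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
U, s, Vt = np.linalg.svd(A)       # A = U @ diag(s) @ Vt
R_theta = U @ Vt                  # orthogonal factor
S_At = Vt.T @ np.diag(s) @ Vt     # V Sigma V^T, right polar factor
S_A  = U @ np.diag(s) @ U.T       # U Sigma U^T, left polar factor

print(np.allclose(R_theta @ S_At, A))  # True: A = R_theta S_{A^T}
print(np.allclose(S_A @ R_theta, A))   # True: A = S_A R_theta
```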
The theory of polar decomposition is described in Wikipedia. For the sake of simplicity we shall restrict our attention to real-valued, non-singular (invertible) square matrices.
Theorem. Every such matrix $B$ can be decomposed as follows, and the decomposition is unique:$$
B = U Q
$$where $U$ is an orthogonal matrix and $Q$ is a symmetric positive definite matrix.
Instead of providing yet another proof of the above theorem, I have decided to be practical and describe how to numerically obtain the decomposition.
First define the product of the transpose of the matrix with the original:$$
P = B^T B \quad ; \quad P^T = \left(B^TB\right)^T = B^TB = P \quad \mbox{: symmetric}
$$Moreover, $x^TPx = \|Bx\|^2 > 0$ for $x \neq 0$ since $B$ is invertible, so $P$ is symmetric and positive definite.
Then what we need is the square root of the matrix $P$. In order to understand how to obtain it, let's take a look at Newton's method for obtaining the square root
of a real positive number $p$:$$
f(x) = x^2 - p\quad \Longrightarrow \\ x_{n+1} = x_n - \frac{x_n^2-p}{2x_n} = \left(x_n + p\,x_n^{-1}\right)/2
$$Let's do the same for our matrix $P$, iterations starting with the unit matrix:$$
X_0 = I \quad ; \quad X_{n+1} = \left(X_n + P X_n^{-1}\right)/2 \quad ; \quad \sqrt{P} = \lim_{n\to\infty} X_n
$$According to a MSE reference, $X_n^{-1}$ is positive definite and symmetric, provided that $X_n$ is positive definite and symmetric. Since $X_0 = I$, every iterate is a polynomial in $P$ and $P^{-1}$, hence symmetric and commuting with $P$. Therefore,
by induction on $n$, it can be proved that every $X_n$, and hence $\sqrt{P}$, is positive definite and symmetric as well. Now define $Q=\sqrt{P}=Q^T$.
At last define $U = B Q^{-1}$ and prove that $U$ is orthogonal:$$
U^{T}U = \left(B Q^{-1}\right)^T\left(B Q^{-1}\right) = \left(Q^{-1}\right)^T\left(B^TB\right)Q^{-1} = \\
\left(Q^T\right)^{-1} Q^2 Q^{-1} = \left(Q^{-1}Q\right)\left(QQ^{-1}\right) = I \cdot I = I
$$The conclusion now looks almost trivial:$$
B = \left[B\left(B^TB\right)^{-1/2}\right]\left[\left(B^TB\right)^{1/2}\right]
$$So much for the theory. Now I want to talk about practice, which in modern times means computer programming: