I am looking for a proof using the min-max principle, and Wikipedia seems to provide just that:
But this part seems to be wrong:
This can be proven using the min-max principle. Let $\beta_i$ have corresponding eigenvector $b_i$, and let $S_j$ be the $j$-dimensional subspace $S_j=\operatorname{span}\{b_1,\dots, b_j\}$; then $$ \beta_j = \max_{x\in S_j,\|x\|=1}(Bx,x) =\max_{x\in S_j,\|x\|=1}(PAPx,x) =\max_{x\in S_j,\|x\|=1}(Ax,x)$$
How is the step from $(PAPx,x)$ to $(Ax,x)$ legal? $PAP$ is an $m\times m$ matrix while $A$ is an $n\times n$ matrix, so the same $x$ cannot fit both. Can anyone correct the proof?
1 Answer
The proof on Wikipedia has many flaws indeed. The projection can be written as $P=VV'$, where the columns of $V$ are the eigenvectors $v_1, v_2, \dots, v_m$ of $A$ associated with $\alpha_1, \dots, \alpha_m$, so $V$ is an $n\times m$ matrix with orthonormal columns. Then you can write $B=V'AV$, and $\max (Bx,x) = \max x'V'AVx = \max (Vx)'A(Vx)$. Moreover, when $\|x\|=1$ we also have $\|Vx\|=1$, since $\|Vx\|^2 = x'V'Vx = x'x$ (here $V'V=I_m$ because the columns of $V$ are orthonormal). So maximizing $(Bx,x)$ over unit vectors $x\in\mathbb{R}^m$ is the same as maximizing $(Ay,y)$ over unit vectors $y=Vx$ in the subspace spanned by the columns of $V$. This is how I got to the end of the proof.
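As a sanity check, here is a minimal numerical sketch of the identity $(Bx,x)=(A(Vx),Vx)$ with $\|Vx\|=\|x\|$. The matrix $A$ and the subspace are illustrative choices (n = 2, m = 1), small enough to verify by hand:

```python
import math

# Toy example: A = [[2,1],[1,2]] has eigenvalues alpha_1 = 1, alpha_2 = 3,
# with unit eigenvectors (1,-1)/sqrt(2) and (1,1)/sqrt(2).
A = [[2.0, 1.0], [1.0, 2.0]]

# V is the n x m matrix (here a single column) of orthonormal eigenvectors;
# we take the eigenvector for alpha = 3.
v = [1.0 / math.sqrt(2), 1.0 / math.sqrt(2)]

# B = V'AV is here the 1x1 matrix (Av, v).
Av = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
B = sum(Av[i] * v[i] for i in range(2))

# For a unit x (a scalar when m = 1): Vx is a unit vector in R^n,
# and (Bx, x) = (A(Vx), Vx).
x = 1.0
Vx = [v[i] * x for i in range(2)]
norm_Vx = math.sqrt(sum(t * t for t in Vx))

print(B)        # the single eigenvalue beta_1 of B, equal to 3 here
print(norm_Vx)  # 1.0, since V'V = I_m
```

Note that $\beta_1 = 3$ indeed sits between $\alpha_1 = 1$ and $\alpha_2 = 3$, as the interlacing theorem requires.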