I'm unsure about when the Gram-Schmidt process is used to find orthonormal vectors. AFAIK, you use it when you have a repeated eigenvalue, e.g. a factor $(\lambda-2)^2$ in the characteristic polynomial. If someone could elaborate/correct me, that would be appreciated.
1 Answer
First of all, I think what you are referring to is the Gram-Schmidt process for finding/constructing an orthonormal basis. This, at first, has nothing to do with eigenvalues/eigenvectors. The situation is the following (restricting to finite dimensions throughout):
You have a so-called inner product space $(V,\langle\cdot,\cdot\rangle)$, that is, a vector space $V$ together with an inner product $\langle\cdot,\cdot\rangle$: an additional structure that brings geometric concepts such as orthogonality (or, more generally, angles) as well as distances into the realm of vector spaces.
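As a concrete illustration (the standard example, not taken from your question): take $V=\mathbb{R}^n$ with the usual dot product
$$\langle x, y\rangle = \sum_{i=1}^{n} x_i y_i,$$
under which two vectors are orthogonal exactly when $\langle x, y\rangle = 0$, and the length of $x$ is $\|x\| = \sqrt{\langle x, x\rangle}$.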
Working with vector spaces, an essential tool/viewpoint is the multiplicity of bases. Once an inner product is present, a desirable (new) property of a basis is to be orthonormal w.r.t. the given inner product, that is, the basis vectors are pairwise orthogonal and have normalized (that is, unit) length. Bear in mind that all these relations are relative to the associated inner product.
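In symbols, a basis $(e_1,\dots,e_n)$ of $V$ is orthonormal precisely when
$$\langle e_i, e_j\rangle = \delta_{ij} = \begin{cases} 1 & i = j,\\ 0 & i \neq j.\end{cases}$$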
A natural question is whether an inner product space always admits an orthonormal basis, in the same way that a classical, plain vector space always admits a plain basis. The so-called Gram-Schmidt procedure now does two things:
- It answers this question positively, i.e. it guarantees the existence of an orthonormal basis for a (finite-dimensional) inner product space.
- It gives a procedure/an algorithm to construct such a basis.
For this construction, Gram-Schmidt requires a basis of the underlying (plain) vector space $V$ and transforms it into an orthonormal basis w.r.t. the associated inner product.
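To make the algorithm concrete, here is the standard form of the recursion (a sketch; the notation $v_1,\dots,v_n$ for the given basis is mine, not from your question): given a basis $(v_1,\dots,v_n)$ of $V$, set
$$u_1 = v_1, \qquad u_k = v_k - \sum_{j=1}^{k-1} \frac{\langle v_k, u_j\rangle}{\langle u_j, u_j\rangle}\, u_j \quad (k = 2,\dots,n),$$
and then normalize, $e_k = u_k / \|u_k\|$ with $\|u_k\| = \sqrt{\langle u_k, u_k\rangle}$. The sum subtracts from $v_k$ its projections onto the previously constructed directions, which is exactly what enforces orthogonality at each step.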
This (from a constructive point of view) is obviously a much stronger result. I omit further details here, as you just asked for an elaboration of the concept.
For a slight outlook concerning the other notion you brought into play: there are deep correspondences between eigenvalues/eigenvectors, or the concept of diagonalization, and inner product spaces, as this context allows for richer notions and procedures such as orthogonal complement decompositions. Also, different classes of homomorphisms of inner product spaces emerge here, some of which have beautiful properties in terms of diagonalization: self-adjoint operators, for example, can always be diagonalized w.r.t. an orthonormal basis (the spectral theorem).
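This is also where your repeated-eigenvalue example fits in (a common use, though not spelled out in your question): if a symmetric/self-adjoint matrix has an eigenvalue of multiplicity greater than one, say with factor $(\lambda-2)^2$, the corresponding eigenspace is (at least) two-dimensional, and the eigenvectors you find for it need not be orthogonal to each other. Applying Gram-Schmidt within that eigenspace then produces the orthonormal eigenvectors needed for an orthogonal diagonalization.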