Why must a linearly independent set of vectors not contain the zero vector?

$\begingroup$

Why is it necessary for a linearly independent set of vectors to not contain the zero vector? I am asking from the definitional perspective, i.e., why do we define linear independence this way?

$\endgroup$

2 Answers

$\begingroup$

A big part of what makes the definition of "linearly independent" so useful is that it gives a robust notion of "basis" and "dimension": a basis is a linearly independent set which spans the entire vector space, and any two bases for a vector space have the same number of elements, which we call the dimension of the space. Any two vector spaces of the same dimension are isomorphic. These basic facts are fantastically powerful, and (when paired with the fact that any linearly independent set can be extended to a basis) are arguably the main reason that linear algebra is such a central part of mathematics.

If you allow the zero vector to be in a linearly independent set, all of this breaks down. You could take any basis and add the zero vector to it, so a vector space can have bases of different sizes. Furthermore, it would no longer be true that the size of a basis determines the vector space up to isomorphism (for instance, if $K$ is your scalar field, then both $K$ and $K^2$ have a basis of size $2$, namely $\{0,1\}$ and $\{(0,1),(1,0)\}$).
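The size comparison above can be checked numerically. The following sketch (using NumPy over $K = \mathbb{R}$, with illustrative values) stacks each "basis" as the rows of a matrix and computes its rank: both sets have two elements, yet they span spaces of different dimensions, so basis size would no longer pin down the space.

```python
import numpy as np

# The would-be "basis" {0, 1} of K = R^1: two vectors, each in R^1.
basis_K = np.array([[0.0],
                    [1.0]])

# The genuine basis {(0,1), (1,0)} of R^2: two vectors in R^2.
basis_K2 = np.array([[0.0, 1.0],
                     [1.0, 0.0]])

# The rank of the stacked matrix is the dimension of the span.
print(np.linalg.matrix_rank(basis_K))   # 1: two "basis" vectors, 1-dimensional span
print(np.linalg.matrix_rank(basis_K2))  # 2: two basis vectors, 2-dimensional span
```

Both matrices have two rows, but only the second has rank equal to its number of rows, which is exactly the failure of "size of basis = dimension" once the zero vector is admitted.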

In a certain sense, allowing the zero vector to be in a linearly independent set is much like considering the integer $1$ to be prime: the purpose of primes is to be able to factor other numbers into them, but if you allow $1$ to be prime these factorizations are no longer unique, because you can add in as many copies of $1$ as you like. Similarly, any vector space can be "factored" into a basis (or more abstractly, split as a direct sum of simple vector spaces), but the number of terms in this factorization would not be unique if zero could be a basis element.

$\endgroup$

$\begingroup$

Consider a set of vectors $A_1, A_2, \ldots, A_n$, and let $A_1$ be the null vector, i.e. $A_1 = 0$. Now consider the linear combination $$Y = x_1 A_1 + x_2 A_2 + \cdots + x_n A_n,$$ where $x_1, x_2, \ldots, x_n$ are scalars. By definition, for the vectors to be linearly independent, $Y = 0$ if and only if $x_1 = x_2 = \cdots = x_n = 0$. However, let $x_1 \neq 0$ and $x_2 = x_3 = \cdots = x_n = 0$. Then $Y = x_1 A_1 = 0$ even though not all the coefficients are zero. Hence the vectors can't be linearly independent!
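The argument above can be demonstrated concretely. In this sketch (illustrative vectors in $\mathbb{R}^3$, using NumPy), a set containing the zero vector has rank strictly less than its size, and the nonzero coefficient vector from the proof annihilates the set:

```python
import numpy as np

# Rows are the vectors A1, A2, A3, with A1 the zero vector (illustrative values).
A = np.array([
    [0.0, 0.0, 0.0],   # A1 = 0
    [1.0, 2.0, 3.0],   # A2
    [0.0, 1.0, 4.0],   # A3
])

# Rank < number of vectors  =>  the set is linearly dependent.
print(np.linalg.matrix_rank(A))  # 2, but the set has 3 vectors

# The witness from the argument: x1 nonzero, all other coefficients zero.
x = np.array([5.0, 0.0, 0.0])
print(x @ A)  # [0. 0. 0.] -- a nontrivial combination equal to zero
```

Any nonzero choice of $x_1$ works here, which is exactly why no set containing $0$ can be linearly independent.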


$\endgroup$
