Mathworld.wolfram.com defines:
"Vectors x_i are linearly dependent iff there exist scalars c_i, not all zero, such that the sum c_1*x_1 + c_2*x_2 + ... + c_n*x_n = 0.
If no such scalars exist, then the vectors are said to be linearly independent."
Now if one of the x_i, say x_k, is the zero vector, then the sum is zero for any value of c_k, provided all the other c_i are zero; so by the definition above, the set is linearly dependent. However, x_k can only be expressed as a linear combination of the other vectors by using 0 for every coefficient. And indeed, if the remaining n-1 vectors are independent of each other, then none of them can be expressed as a linear combination of the others, which seems to contradict the intuitive notion of dependence. Should the zero vector be excluded from the definition of linear dependence/independence to avoid this paradox?
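To make the situation concrete, here is a quick numerical sketch (using NumPy, with illustrative vectors in R^3 that are not from the original question): a set containing the zero vector always has matrix rank less than the number of vectors, so it is linearly dependent by the quoted definition, and the nontrivial relation uses a nonzero coefficient only on the zero vector.

```python
import numpy as np

# Three vectors in R^3, the last one being the zero vector x_k
# (an illustrative example, not from the question itself).
vectors = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0],  # x_k = 0
])

# A set of vectors is linearly independent iff the rank of the matrix
# whose rows are the vectors equals the number of vectors.
rank = np.linalg.matrix_rank(vectors)
print(rank, len(vectors))    # 2 3 -> rank deficient, hence dependent

# The nontrivial relation: c_k may be anything once the other c_i are 0.
c = np.array([0.0, 0.0, 5.0])        # scalars, not all zero
print(np.allclose(c @ vectors, 0))   # True: sum of c_i * x_i is the zero vector
```

This illustrates the asymmetry in the question: the dependence relation exists, yet it involves only the zero vector itself, not any genuine expression of one nonzero vector in terms of the others.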