# 10.1: Showing Linear Dependence

In the above example we were given the linear combination \(3v_{1}+2v_{2}-v_{3}+v_{4}\) seemingly by magic. The next example shows how to find such a linear combination, if it exists.

Example 107

Consider the following vectors in \(\Re^{3}\):
\[
v_{1}=\begin{pmatrix}0\\0\\1\end{pmatrix},
\qquad v_{2}=\begin{pmatrix}1\\2\\1\end{pmatrix},
\qquad v_{3}=\begin{pmatrix}1\\2\\3\end{pmatrix}.
\]
Are they linearly independent?

We need to see whether the system
\[
c^{1}v_{1} + c^{2}v_{2}+ c^{3}v_{3}=0
\]
has any nontrivial solutions for \(c^{1}, c^{2}, c^{3}\). We can rewrite this as a homogeneous system by building a matrix whose columns are the vectors \(v_{1}\), \(v_{2}\) and \(v_{3}\):
\[
\begin{pmatrix}v_{1}&v_{2}&v_{3}\end{pmatrix}\begin{pmatrix}c^{1}\\c^{2}\\c^{3}\end{pmatrix}=0.
\]
This system has nontrivial solutions if and only if the matrix \(M=\begin{pmatrix}v_{1}&v_{2}&v_{3}\end{pmatrix}\) is singular, so we should find the determinant of \(M\):
\[
\det M = \det \begin{pmatrix}
0 & 1 & 1 \\
0 & 2 & 2 \\
1 & 1 & 3 \\
\end{pmatrix}
= \det \begin{pmatrix}
1 & 1 \\
2 & 2 \\
\end{pmatrix}
=0.
\]
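
The determinant computation can be checked with a few lines of code. Here is a minimal sketch using cofactor expansion; the `det` helper is illustrative, not from the text:

```python
def det(m):
    """Determinant of a square matrix by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then recurse.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

M = [[0, 1, 1],
     [0, 2, 2],
     [1, 1, 3]]
print(det(M))  # 0, so M is singular
```
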

Therefore nontrivial solutions exist. At this point we know that the vectors are linearly dependent. If we need to, we can find coefficients that demonstrate linear dependence by solving the system of equations:
\[
\left(\begin{array}{rrrr}
0 & 1 & 1 & 0\\
0 & 2 & 2 & 0\\
1 & 1 & 3 & 0\\
\end{array}\right) \sim
\left(\begin{array}{rrrr}
1 & 1 & 3 & 0\\
0 & 1 & 1 & 0\\
0 & 0 & 0 & 0\\
\end{array}\right) \sim
\left(\begin{array}{rrrr}
1 & 0 & 2 & 0\\
0 & 1 & 1 & 0\\
0 & 0 & 0 & 0\\
\end{array}\right).
\]
Then \(c^{3}=:\mu\), \(c^{2}=-\mu\), and \(c^{1}=-2\mu\). Now any choice of \(\mu\) will produce coefficients \(c^{1},c^{2},c^{3}\) that satisfy the linear equation. So we can set \(\mu=1\) and obtain:
\[
c^{1}v_{1} + c^{2}v_{2}+ c^{3}v_{3}=0
\quad\Rightarrow\quad -2v_{1} - v_{2} + v_{3}=0.
\]
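
Both the row reduction and the resulting dependence relation can be verified programmatically. A minimal sketch, using exact rational arithmetic; the `rref` helper is illustrative, not from the text:

```python
from fractions import Fraction

def rref(m):
    """Reduced row echelon form by Gauss-Jordan elimination."""
    m = [[Fraction(x) for x in row] for row in m]
    pivot = 0
    for col in range(len(m[0])):
        # Find a row at or below `pivot` with a nonzero entry in this column.
        for r in range(pivot, len(m)):
            if m[r][col] != 0:
                m[pivot], m[r] = m[r], m[pivot]
                break
        else:
            continue
        # Scale the pivot row to make the pivot 1, then clear the column.
        p = m[pivot][col]
        m[pivot] = [x / p for x in m[pivot]]
        for r in range(len(m)):
            if r != pivot and m[r][col] != 0:
                f = m[r][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[pivot])]
        pivot += 1
    return m

aug = [[0, 1, 1, 0],
       [0, 2, 2, 0],
       [1, 1, 3, 0]]
R = rref(aug)  # rows (1 0 2 0), (0 1 1 0), (0 0 0 0), as in the text

# With mu = 1: (c1, c2, c3) = (-2, -1, 1), so -2*v1 - v2 + v3 should vanish.
v1, v2, v3 = (0, 0, 1), (1, 2, 1), (1, 2, 3)
combo = tuple(-2 * a - b + c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0, 0)
```
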

Theorem (Linear Dependence)

An ordered set of non-zero vectors \(( v_{1}, \ldots, v_{n} )\) is linearly dependent if and only if one of the vectors \(v_{k}\) is expressible as a linear combination of the preceding vectors.

Proof
The theorem is an if and only if statement, so there are two things to show.

\((i.)\) First, we show that if \(v_{k}=c^{1}v_{1}+\cdots+c^{k-1}v_{k-1}\) then the set is linearly dependent.

This is easy. We just rewrite the assumption:
\[
c^{1}v_{1}+\cdots+c^{k-1}v_{k-1}-v_{k} + 0v_{k+1}+\cdots + 0v_{n}=0.
\]
This is a vanishing linear combination of the vectors \(\{ v_{1}, \ldots, v_{n} \}\) with not all coefficients equal to zero, so \(\{ v_{1}, \ldots, v_{n} \}\) is a linearly dependent set.

\((ii.)\) Now, we show that linear dependence implies that there exists \(k\) for which \(v_{k}\) is a linear combination of the vectors \(\{ v_{1}, \ldots, v_{k-1} \}\).

The assumption says that
\[
c^{1}v_{1} + c^{2}v_{2}+ \cdots +c^{n}v_{n}=0,
\]
with not all of the coefficients equal to zero. Take \(k\) to be the largest number for which \(c^{k}\) is not equal to zero. So:
\[
c^{1}v_{1} + c^{2}v_{2}+ \cdots +c^{k-1}v_{k-1}+c^{k}v_{k}=0.
\]

(Note that \(k>1\), since otherwise we would have \(c^{1}v_{1}=0\Rightarrow v_{1}=0\), contradicting the assumption that none of the \(v_{i}\) are the zero vector.)

As such, we can rearrange the equation:
\begin{eqnarray*}
c^{1}v_{1} + c^{2}v_{2} + \cdots +c^{k-1}v_{k-1}&=&-c^{k}v_{k} \Rightarrow\\
-\frac{c^{1}}{c^{k}}v_{1} - \frac{c^{2}}{c^{k}}v_{2} - \cdots -\frac{c^{k-1}}{c^{k}}v_{k-1}&=&v_{k}.
\end{eqnarray*}

Therefore we have expressed (v_{k}) as a linear combination of the previous vectors, and we are done.
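
The proof of \((ii.)\) is constructive: from the coefficients of any vanishing combination, pick the largest \(k\) with \(c^{k}\neq 0\) and divide through. A small sketch of this step, with an illustrative helper name not from the text:

```python
from fractions import Fraction

def express_last(coeffs):
    """Given integer coefficients c^1, ..., c^n of a vanishing linear
    combination (not all zero), return (k, d) such that
    v_k = sum(d[i] * v_i for i < k), following the proof:
    k is the largest index with a nonzero coefficient."""
    k = max(i for i, c in enumerate(coeffs) if c != 0)
    # Divide the rearranged equation through by -c^k.
    d = [Fraction(-c, coeffs[k]) for c in coeffs[:k]]
    return k, d

# Example 107 found -2*v1 - v2 + v3 = 0, so (0-indexed) k = 2 and
# v3 = 2*v1 + 1*v2.
k, d = express_last([-2, -1, 1])
print(k, d)  # 2 [Fraction(2, 1), Fraction(1, 1)]
```

Note that `max(...)` succeeds precisely because dependence guarantees some nonzero coefficient, mirroring the proof's choice of \(k\).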

Example 108

Consider the vector space \(P_{2}(t)\) of polynomials of degree less than or equal to \(2\). Set:
\begin{eqnarray*}
v_{1} &=& 1+t\\
v_{2} &=& 1+t^{2}\\
v_{3} &=& t+t^{2}\\
v_{4} &=& 2+t+t^{2}\\
v_{5} &=& 1+t+t^{2}.
\end{eqnarray*}
The set \(\{ v_{1}, \ldots, v_{5} \}\) is linearly dependent, because \(v_{4} = v_{1}+v_{2}\).
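
Identifying each polynomial \(a+bt+ct^{2}\) with its coefficient triple \((a,b,c)\), the relation \(v_{4}=v_{1}+v_{2}\) is easy to check; a minimal sketch:

```python
# Represent a + b*t + c*t^2 in P_2(t) by its coefficient triple (a, b, c).
v1 = (1, 1, 0)   # 1 + t
v2 = (1, 0, 1)   # 1 + t^2
v4 = (2, 1, 1)   # 2 + t + t^2

# Polynomial addition is componentwise addition of coefficient triples.
v1_plus_v2 = tuple(a + b for a, b in zip(v1, v2))
print(v1_plus_v2 == v4)  # True
```
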