
Basis and Dimension

$\mathbf{Definition.}$ Linearly dependent, Linearly independent
Let $V$ be a vector space over $F$. A subset $S$ of $V$ is called linearly dependent if there exist finitely many distinct vectors $v_1, \cdots, v_m \in S$ such that $a_1 v_1 + \cdots + a_m v_m = 0$ for some $a_i \in F$ which are not all zero. Otherwise, $S$ is called linearly independent $._\blacksquare$
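As a quick sanity check, linear dependence over $\mathbb{Q}$ can be tested computationally: a finite list of vectors is linearly dependent exactly when the rank of the matrix formed by the vectors is smaller than the number of vectors. The sketch below is only an illustration (the helper names `rank` and `linearly_independent` are mine, not from any text) and uses exact rational arithmetic to avoid floating-point issues:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors over Q, via Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def linearly_independent(vectors):
    """A finite list of vectors is independent iff its rank equals its size."""
    return rank(vectors) == len(vectors)

# (2, 4, 6) = 2 * (1, 2, 3), so the pair is linearly dependent
print(linearly_independent([[1, 2, 3], [2, 4, 6]]))  # False
# the standard basis vectors of Q^2 are independent
print(linearly_independent([[1, 0], [0, 1]]))        # True
```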

Let a vector space

\[\begin{align*} V = \left< \mathbf{v}_1, \cdots, \mathbf{v}_s \right> = \text{Span} \{ \mathbf{v}_1, \cdots, \mathbf{v}_s \} \end{align*}\]

be a subspace of $\mathbb{R}^n$. If the vectors in the set

\[\begin{align*} S = \{ \mathbf{v}_1, \cdots, \mathbf{v}_s \} \end{align*}\]

are linearly dependent, then some vector is a linear combination of the others, so it can be removed and the remaining set still spans $V$. We can repeat this until the remaining vectors are linearly independent. That means a linearly independent set that spans $V$ can serve as a representative of the vector space $V$.

$\mathbf{Definition.}$ Basis, Dimension
A set of vectors in a vector space $V$ is said to be a basis of $V$ if it is linearly independent and spans $V$. The number of vectors in a basis is called the dimension of $V$, denoted $\text{dim } V._\blacksquare$

Here are some examples.

$\mathbf{Example.}$ Examples of linear independence and bases

1. Let $\mathbf{0} \neq \mathbf{v} \in V$. Then { $\mathbf{v}$ } is linearly independent: if $a \mathbf{v} = \mathbf{0}$ with $a \neq 0$, then $\mathbf{v} = \mathbf{0}$, a contradiction.

2. Every set of vectors containing $\mathbf{0}$ is linearly dependent, since $1 \cdot \mathbf{0} = \mathbf{0}$.

3.

| $V$ | basis | $\text{dim}$ |
| --- | --- | --- |
| $F^n$ | { $\mathbf{e}_1, \cdots, \mathbf{e}_n$ } | $n$ |
| $M_{m \times n} (F)$ | { $E_{ij} \mid 1 \leq i \leq m, 1 \leq j \leq n$ } | $mn$ |
| $F[x]$ | { $1, x, x^2, \cdots$ } | $\infty$ |


4. Let $V$ be the vector space of all $n \times n$ symmetric matrices. Note that $E_{ij}$ itself is not symmetric for $i \neq j$, so a basis of $V$ is

\[\begin{align*} \{ E_{ii} \mid 1 \leq i \leq n \} \cup \{ E_{ij} + E_{ji} \mid 1 \leq i < j \leq n \} \end{align*}\]

and $\text{dim } V = \frac{n(n+1)}{2}$.

5. $V =$ { $(x, y, z) \in \mathbb{Q}^3 | 2x + 3y + z = 0$ } $\subseteq \mathbb{Q}^3$. Then, $\text{dim}V = 2$
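Example 5 can be checked numerically. Solving the defining equation for $z = -2x - 3y$ and taking $(x, y) = (1, 0)$ and $(0, 1)$ gives the candidate basis $\{(1, 0, -2), (0, 1, -3)\}$ (my choice of basis, not the only one). A short sketch with exact rational arithmetic confirms that both vectors lie in $V$ and are linearly independent:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors over Q, via Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# candidate basis of V = {(x, y, z) in Q^3 : 2x + 3y + z = 0}
b1 = [1, 0, -2]
b2 = [0, 1, -3]

# each candidate satisfies the defining equation
for x, y, z in (b1, b2):
    assert 2 * x + 3 * y + z == 0

# and the two are linearly independent, so dim V = 2
print(rank([b1, b2]))  # 2
```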



Uniqueness of Dimension

We will see that the dimension of a vector space is unique. To prove it, we first introduce a useful lemma:

$\mathbf{Lemma\ 1.}$ Let $V = \text{Span}$ { $\mathbf{v}_1, \cdots, \mathbf{v}_n$ } be a vector space over $F$. If $S$ is a linearly independent subset of $V$, then $| S | \leq n._\blacksquare$

$\mathbf{Proof.}$

Suppose $| S | \geq n + 1$, so that $S$ contains $n + 1$ distinct vectors $\mathbf{u}_1, \cdots, \mathbf{u}_{n+1}$.

Let’s write $\mathbf{u}_1 = a_1 \mathbf{v}_1 + \cdots + a_n \mathbf{v}_n$.
As $S$ is linearly independent, $\mathbf{u}_1 \neq \mathbf{0}$. So, one of $a_1, \cdots, a_n$ is nonzero and we may assume that $a_1 \neq 0$.
Hence, $\mathbf{v}_1$ is a linear combination of $\mathbf{u}_1, \mathbf{v}_2, \cdots, \mathbf{v}_n$
$\Rightarrow V = \text{Span}$ { $\mathbf{u}_1, \mathbf{v}_2, \cdots, \mathbf{v}_n$ } as $\mathbf{v}_1 = \frac{1}{a_1} \mathbf{u}_1 - \frac{a_2}{a_1} \mathbf{v}_2 - \cdots - \frac{a_n}{a_1} \mathbf{v}_n$.

Write $\mathbf{u}_2 = b_1 \mathbf{u}_1 + b_2 \mathbf{v}_2 + \cdots + b_n \mathbf{v}_n$, $b_i \in F$.
As $S$ is linearly independent, { $\mathbf{u}_1, \mathbf{u}_2$ } is linearly independent. Hence, one of $b_2, \cdots, b_n$ is nonzero; otherwise $\mathbf{u}_2 = b_1 \mathbf{u}_1$, contradicting this. Without loss of generality, let $b_2 \neq 0$. Therefore, $V = \text{Span}$ { $\mathbf{u}_1, \mathbf{u}_2, \mathbf{v}_3, \cdots, \mathbf{v}_n$ }.

We repeat this procedure until $\mathbf{v}_1, \cdots, \mathbf{v}_n$ are all replaced by $\mathbf{u}_1, \cdots, \mathbf{u}_n$.
Then, $V = \text{Span}$ { $\mathbf{u}_1, \mathbf{u}_2, \cdots, \mathbf{u}_n$ }, thus $\mathbf{u}_{n+1} \in \text{Span}$ { $\mathbf{u}_1, \mathbf{u}_2, \cdots, \mathbf{u}_n$ }.
This contradicts the linear independence of $S._\blacksquare$


Now, we will prove the main theorem:

$\mathbf{Thm.\ 1.1.}$ Uniqueness of Dimension
If a basis of a finite dimensional vector space $V$ has $n$ vectors, then so does every other basis of $V._\blacksquare$

$\mathbf{Proof.}$

Let $S =$ { $\mathbf{v}_1, \cdots, \mathbf{v}_n$ } be a basis of $V$.
Let $T =$ { $\mathbf{u}_1, \cdots, \mathbf{u}_m$ } be another basis of $V$.

As $\text{Span}(S) = V$ and $T$ is linearly independent, by the previous lemma, $| T | = m \leq n$.
As $\text{Span}(T) = V$ and $S$ is linearly independent, by the previous lemma, $| S | = n \leq m$.

Thus, $m = n._\blacksquare$


So, this theorem tells us that our definition of the dimension of a vector space is well-defined.

Existence of Basis

So, how can we obtain a basis from a vector space? How can we determine the dimension of a finite dimensional vector space? We will see how to construct a basis of a finite dimensional vector space, which shows that a basis of any given finite dimensional vector space always exists.

$\mathbf{Lemma\ 2.}$ Let $S =$ { $\mathbf{v}_1, \cdots, \mathbf{v}_m$ } be a linearly independent subset of a vector space $V$. If $\mathbf{u} \in V - \text{Span}(S)$, then $S \; \cup$ { $\mathbf{u}$ } is also linearly independent $._\blacksquare$

$\mathbf{Proof.}$

Suppose $a \mathbf{u} + b_1 \mathbf{v}_1 + \cdots + b_m \mathbf{v}_m = 0$, $a, b_i \in F$.
Then $a = 0$: otherwise $\mathbf{u} = -\frac{1}{a}(b_1 \mathbf{v}_1 + \cdots + b_m \mathbf{v}_m) \in \text{Span}(S)$, a contradiction. So, $b_i = 0 \; \forall 1 \leq i \leq m$ as $S$ is linearly independent $._\blacksquare$


$\mathbf{Thm.\ 1.2.}$ Construction of Basis
Let $V$ be a finite dimensional vector space.
1. Any linearly independent subset of $V$ can be extended to a basis of $V$ by adding more vectors.
2. Any subset that spans $V$ can be reduced to a basis of $V$ by discarding some vectors.

$\mathbf{Proof.}$

1. Let $S =$ { $\mathbf{v}_1, \cdots, \mathbf{v}_m$ } be a linearly independent subset of a vector space $V$. Let $\text{dim } V = n \geq m$.

If $\text{Span}(S) = V$, then $S$ is clearly a basis of $V$. Otherwise, choose $\mathbf{v}_{m+1} \in V - \text{Span}(S)$. By the previous lemma, $S' = S \; \cup$ { $\mathbf{v}_{m+1}$ } is linearly independent.

If $\text{Span}(S') = V$, then $S'$ is clearly a basis of $V$. Otherwise, we repeat the same procedure until we get a basis of $V$. This terminates in finitely many steps, since by Lemma 1 no linearly independent subset can exceed $\text{dim } V < \infty$ vectors $._\blacksquare$

2. Let $V = \text{Span}(S)$ and assume $V \neq$ { $\mathbf{0}$ } (otherwise $\emptyset$ is a basis). Choose $\mathbf{0} \neq s_1 \in S$.
If $V = \text{Span}$ { $s_1$ }, then { $s_1$ } is a basis of $V$ and we are done. Otherwise, choose $v \in V - \text{Span}$ { $s_1$ }.

Write $v = a_1 s_1 + \cdots + a_n s_n$ for some $s_i \in S$ and some nonzero $a_i \in F$. If every $s_i$ were in $\text{Span}$ { $s_1$ }, then $v \in \text{Span}$ { $s_1$ }, a contradiction; so some $s_i \notin \text{Span}$ { $s_1$ }, say $s_2$. By Lemma 2, { $s_1, s_2$ } is linearly independent.

Again, if $V = \text{Span}$ { $s_1, s_2$ }, then { $s_1, s_2$ } is a basis of $V$ and we are done. Otherwise, we repeat the same procedure until we get a basis of $V$. It terminates in finitely many steps as $\text{dim } V < \infty._\blacksquare$
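Both constructions in the proof are greedy algorithms: keep (or add) a vector exactly when it enlarges the span. A minimal sketch for subspaces of $\mathbb{Q}^n$, assuming exact rational arithmetic and (for part 1) drawing extension vectors from the standard basis; the function names are my own, not standard terminology:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors over Q, via Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def reduce_to_basis(spanning):
    """Part 2: discard every vector that does not enlarge the span."""
    basis = []
    for v in spanning:
        if rank(basis + [v]) > len(basis):
            basis.append(v)
    return basis

def extend_to_basis(independent, n):
    """Part 1: grow an independent set in Q^n to a basis of Q^n,
    drawing candidate vectors from the standard basis e_1, ..., e_n."""
    basis = list(independent)
    for j in range(n):
        e = [1 if i == j else 0 for i in range(n)]
        if rank(basis + [e]) > len(basis):
            basis.append(e)
    return basis

print(reduce_to_basis([[1, 0], [2, 0], [0, 1]]))  # [[1, 0], [0, 1]]
print(extend_to_basis([[1, 1, 0]], 3))            # a basis of Q^3
```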



$\mathbf{Corollary.}$ Let $V$ be a finite dimensional vector space over $F$ with $\text{dim } V = n.$
1. Let $S$ be a subset of $V$ with $| S | = n$. Then,

$S$ is linearly independent $\Leftrightarrow$ $V = \text{Span}(S)$


In either case, $S$ is a basis of $V$.
2. Let $W$ be a subspace of $V$. Then, $\text{dim } W \leq n$. Moreover, if $\text{dim } W = n$, then $W = V$.
3. Let $W$ and $U$ be subspaces of $V$. Then, $\text{dim } W + \text{dim } U = \text{dim}(W + U) + \text{dim}(W \cap U)$

$\mathbf{Proof.}$

1. $(\Rightarrow)$ By the previous theorem, we can extend $S$ to a basis $S \cup T$ of $V$ by adding a set of vectors $T$.
But if $| S \cup T | \geq n + 1$, this is impossible by Lemma 1. So, $T$ must be $\emptyset$. Thus, $S$ is a basis of $V$.

$(\Leftarrow)$ By the previous theorem, we can reduce $S$ to a basis $T \subseteq S$ of $V$ by discarding some vectors of $S$.
But if $T \neq S$, then $| T | < n = \text{dim } V$, contradicting Theorem 1.1. Thus, $T = S$ is a basis of $V$.

2. Note that every linearly independent subset of $W$ is a linearly independent subset of $V$.
As $\text{dim } V = n$, a maximal linearly independent subset of $W$ has at most $n$ elements.

Let $S$ be a maximal linearly independent subset of $W$ with $| S | \leq n$.
If $\text{Span}(S) \neq W$, then choose $w \in W - \text{Span}(S)$.
And { $w$ } $\cup \; S$ is linearly independent, which contradicts the maximality.
Hence, $S$ is a basis of $W$, i.e., $| S | = \text{dim } W \leq n = \text{dim } V$.

For the second statement, assume that $W \neq V$.
Then choose $v \in V - W = V - \text{Span}(S)$ where $S$ is a basis of $W$.
Then, { $v$ } $\cup \; S$ is linearly independent, which contradicts the lemma as $|$ { $v$ } $\cup \; S$ $| > n.$

3. By part 2 of this corollary, $\text{dim}(U \cap W) \leq n$, so $U \cap W$ has a finite basis.
Let

\[\begin{align*} \{ \mathbf{x}_1, \cdots, \mathbf{x}_k \} \end{align*}\]

be a basis of $U \cap W$.

By Theorem 1.2, it can be extended to a basis of $U$ by adding more vectors:

\[\begin{align*} \{ \mathbf{x}_1, \cdots, \mathbf{x}_k, \mathbf{y}_1, \cdots, \mathbf{y}_l \} \end{align*}\]

and a basis of $W$ by adding more vectors:

\[\begin{align*} \{ \mathbf{x}_1, \cdots, \mathbf{x}_k, \mathbf{z}_1, \cdots, \mathbf{z}_t \} \end{align*}\]

Then, it is clear that

\[\begin{align*} \text{Span} \{ \mathbf{x}_1, \cdots, \mathbf{x}_k, \mathbf{y}_1, \cdots, \mathbf{y}_l, \mathbf{z}_1, \cdots, \mathbf{z}_t \} = U + W \end{align*}\]

It suffices to show that this subset is linearly independent.
Assume that

\[\begin{align*} (\star) \quad a_1 \mathbf{x}_1 + \cdots + a_k \mathbf{x}_k + b_1 \mathbf{y}_1 + \cdots + b_l \mathbf{y}_l + c_1 \mathbf{z}_1 + \cdots + c_t \mathbf{z}_t = \mathbf{0} \end{align*}\]

where $a_i, b_i, c_i \in F$.

Note that $c_1 \mathbf{z}_1 + \cdots + c_t \mathbf{z}_t = -(a_1 \mathbf{x}_1 + \cdots + a_k \mathbf{x}_k + b_1 \mathbf{y}_1 + \cdots + b_l \mathbf{y}_l) \in U$ by $(\star)$, and it clearly lies in $W$, so it belongs to $W \cap U$. Hence

\[\begin{align*} c_1 \mathbf{z}_1 + \cdots + c_t \mathbf{z}_t = d_1 \mathbf{x}_1 + \cdots + d_k \mathbf{x}_k \end{align*}\]

for some $d_i \in F$. Since { $\mathbf{x}_1, \cdots, \mathbf{x}_k, \mathbf{z}_1, \cdots, \mathbf{z}_t$ } is a basis of $W$, it is linearly independent, so $c_i = d_j = 0$ for all $i$ and $j$. Then $(\star)$ reduces to $a_1 \mathbf{x}_1 + \cdots + a_k \mathbf{x}_k + b_1 \mathbf{y}_1 + \cdots + b_l \mathbf{y}_l = \mathbf{0}$, and since { $\mathbf{x}_1, \cdots, \mathbf{x}_k, \mathbf{y}_1, \cdots, \mathbf{y}_l$ } is a basis of $U$, $a_i = b_j = 0$ for all $i$ and $j._\blacksquare$
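The dimension formula can be checked on a concrete example. Taking $W = \text{Span}$ { $\mathbf{e}_1, \mathbf{e}_2$ } and $U = \text{Span}$ { $\mathbf{e}_2, \mathbf{e}_3$ } in $\mathbb{Q}^3$, where $W \cap U = \text{Span}$ { $\mathbf{e}_2$ } (known for this example, not computed by the sketch), both sides of the formula equal $4$:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors over Q, via Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# W = Span{e1, e2}, U = Span{e2, e3} in Q^3; here W ∩ U = Span{e2}
W = [[1, 0, 0], [0, 1, 0]]
U = [[0, 1, 0], [0, 0, 1]]
intersection = [[0, 1, 0]]

dim_W, dim_U = rank(W), rank(U)
dim_sum = rank(W + U)        # W + U is spanned by the union of the spanning sets
dim_int = rank(intersection)

# dim W + dim U = dim(W + U) + dim(W ∩ U)
print(dim_W + dim_U, dim_sum + dim_int)  # 4 4
```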



