A Beginner-Friendly Approach to Understanding Vector Spaces and Subspaces


Understanding vector spaces and subspaces is important because they provide a framework for describing how mathematical structures behave under addition and scalar multiplication.

At first, the definitions of vector spaces and subspaces may seem abstract. However, a vector space is just a set where addition and scalar multiplication behave predictably.

In this guide, you’ll learn what vector spaces and subspaces are, how to verify whether a given set is a vector space, and common examples of vector spaces, including \( \mathbb{R}^n \), matrices, and polynomials.

This article relies on concepts covered in The Ultimate Step by Step Guide to Basic Matrix Operations for Beginners and Vector Operations Tutorial for Beginners: A Complete Step-by-Step Guide, so please read them first if you haven’t already.

The following infographic illustrates the concepts covered in this article.

[Infographic: Vector Spaces and Subspaces]

Vector Spaces

A vector space is a set of objects, called vectors, that can be added together and multiplied by real numbers, called scalars, while remaining in the same set. These operations must satisfy a specific list of properties known as the vector space axioms.

You can think of a vector space as a set where you can add vectors or scale vectors without leaving the set.

Definition

A vector space \( V \) over a field \( F \) is a nonempty set equipped with two operations:

Vector addition: An operation that combines two vectors \( u, v \in V \) to produce another vector \( u + v \in V \).

Scalar multiplication: An operation that combines a scalar \( c \in F \) with a vector \( v \in V \) to produce another vector \( cv \in V \).

These operations must satisfy the following ten axioms for all \( u, v, w \in V \) and \( a, b \in F \):

Closure under addition: \( u + v \in V \).

Commutativity of addition: \( u + v = v + u \).

Associativity of addition: \( (u + v) + w = u + (v + w) \).

Existence of additive identity: There exists a vector \( 0 \in V \) such that \( v + 0 = v \).

Existence of additive inverses: For each \( v \in V \), there exists a vector \( -v \in V \) such that \( v + (-v) = 0 \).

Closure under scalar multiplication: \( av \in V \).

Distributivity of scalar multiplication over addition: \( a(u + v) = au + av \).

Distributivity of vectors over scalar addition: \( (a + b)v = av + bv \).

Compatibility of scalar multiplication: \( a(bv) = (ab)v \).

Multiplicative identity: \( 1v = v \).

If a set satisfies all ten axioms, it is a vector space.
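As a sanity check, the axioms can be spot-checked numerically. The sketch below (my own illustration, not part of any proof) represents vectors in \( \mathbb{R}^3 \) as tuples, defines the two operations, and asserts several axioms for concrete sample vectors; the helper names `add` and `scale` are ad hoc.

```python
# A minimal numerical sketch (not a proof): spot-check several of the
# ten axioms for R^3 using concrete sample vectors and scalars.

def add(u, v):
    return tuple(x + y for x, y in zip(u, v))

def scale(c, v):
    return tuple(c * x for x in v)

u, v, w = (1.0, 2.0, 3.0), (-4.0, 0.0, 5.0), (2.0, 2.0, -1.0)
a, b = 3.0, -2.0
zero = (0.0, 0.0, 0.0)

assert add(u, v) == add(v, u)                                # commutativity
assert add(add(u, v), w) == add(u, add(v, w))                # associativity
assert add(v, zero) == v                                     # additive identity
assert add(v, scale(-1.0, v)) == zero                        # additive inverse
assert scale(a, add(u, v)) == add(scale(a, u), scale(a, v))  # distributivity
assert scale(a + b, v) == add(scale(a, v), scale(b, v))      # scalar addition
assert scale(a, scale(b, v)) == scale(a * b, v)              # compatibility
assert scale(1.0, v) == v                                    # identity scalar
```

Passing these assertions for a few samples is evidence, not proof; the algebraic verification in Example 1 is what actually establishes the axioms for all vectors.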

Example

Example 1: Show that the following sets are vector spaces:
(a) $$V = \mathbb{R}^n = \{ (x_1, \dots, x_n) \mid x_i \in \mathbb{R} \}$$
(b) The set \( M_{mn} \), consisting of all \( m \times n \) matrices.
(c) $$P_n = \{ p(x) = a_0 + \dots + a_n x^n \mid a_i \in \mathbb{R} \}$$

Solution: (a) $$V = \mathbb{R}^n = \{ (x_1, \dots, x_n) \mid x_i \in \mathbb{R} \}$$

We check that the set satisfies all the axioms.

Closure under addition: $$(u_1, \dots, u_n) + (v_1, \dots, v_n) = (u_1 + v_1, \dots, u_n + v_n) \in V.$$

Commutativity of addition: $$(u_1, \dots, u_n) + (v_1, \dots, v_n) = (u_1 + v_1, \dots, u_n + v_n) = (v_1 + u_1, \dots, v_n + u_n) = (v_1, \dots, v_n) + (u_1, \dots, u_n).$$

Associativity of addition: $$((u_1, \dots, u_n) + (v_1, \dots, v_n))+ (w_1, \dots, w_n) = ((u_1 + v_1), \dots, (u_n + v_n)) + (w_1, \dots, w_n) = ((u_1 + v_1) + w_1, \dots, (u_n + v_n) + w_n) = (u_1 + (v_1 + w_1), \dots, u_n + (v_n + w_n)) = (u_1, \dots, u_n) + ((v_1 + w_1), \dots, (v_n + w_n)) = (u_1, \dots, u_n) + ((v_1, \dots, v_n) + (w_1, \dots, w_n)).$$

Existence of additive identity: $$(v_1, \dots, v_n) + (0, \dots, 0) = (v_1 + 0, \dots, v_n + 0) = (v_1, \dots, v_n).$$

Existence of additive inverses: $$(v_1, \dots, v_n) + (-v_1, \dots, -v_n) = (v_1 - v_1, \dots, v_n - v_n) = (0, \dots, 0).$$

Closure under scalar multiplication: $$c( u_1, \dots, u_n) = (cu_1, \dots, cu_n) \in V.$$

Distributivity of scalar multiplication over addition: $$a((u_1, \dots, u_n) + (v_1, \dots, v_n)) = a(u_1 + v_1, \dots, u_n + v_n) = (a(u_1 + v_1), \dots, a(u_n + v_n)) = (au_1 + av_1, \dots, au_n + av_n) = (au_1, \dots, au_n) + (av_1, \dots, av_n) = a(u_1, \dots, u_n) + a(v_1, \dots, v_n).$$

Distributivity of vectors over scalar addition: $$(a + b)(v_1, \dots, v_n) = ((a + b)v_1, \dots, (a + b)v_n) = (av_1 + bv_1, \dots, av_n + bv_n) = (av_1, \dots, av_n) + (bv_1, \dots, bv_n) = a(v_1, \dots, v_n) + b(v_1, \dots, v_n).$$

Compatibility of scalar multiplication: $$a(b(v_1, \dots, v_n)) = a((bv_1), \dots, (bv_n)) = (a(bv_1), \dots, a(bv_n)) = ((ab)v_1, \dots, (ab)v_n) = (ab)(v_1, \dots, v_n).$$

Multiplicative identity: $$1(v_1, \dots, v_n) = (1v_1, \dots, 1v_n) = (v_1, \dots, v_n).$$

Since all ten axioms hold, \( \mathbb{R}^n \) is a vector space.

(b) The set \( M_{mn} \), consisting of all \( m \times n \) matrices.

We check that the set satisfies all the axioms.

Closure under addition: $$\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \dots & b_{mn} \end{bmatrix} = \begin{bmatrix} a_{11} + b_{11} & \dots & a_{1n} + b_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} + b_{m1} & \dots & a_{mn} + b_{mn} \end{bmatrix} \in V.$$

Commutativity of addition: $$\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \dots & b_{mn} \end{bmatrix} = \begin{bmatrix} a_{11} + b_{11} & \dots & a_{1n} + b_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} + b_{m1} & \dots & a_{mn} + b_{mn} \end{bmatrix} = \begin{bmatrix} b_{11} + a_{11} & \dots & b_{1n} + a_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} + a_{m1} & \dots & b_{mn} + a_{mn} \end{bmatrix} = \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \dots & b_{mn} \end{bmatrix} + \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix}.$$

Associativity of addition: $$(\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \dots & b_{mn} \end{bmatrix}) + \begin{bmatrix} c_{11} & \dots & c_{1n} \\ \vdots & \ddots & \vdots \\ c_{m1} & \dots & c_{mn} \end{bmatrix} = \begin{bmatrix} (a_{11} + b_{11}) & \dots & (a_{1n} + b_{1n}) \\ \vdots & \ddots & \vdots \\ (a_{m1} + b_{m1}) & \dots & (a_{mn} + b_{mn}) \end{bmatrix} + \begin{bmatrix} c_{11} & \dots & c_{1n} \\ \vdots & \ddots & \vdots \\ c_{m1} & \dots & c_{mn} \end{bmatrix} = \begin{bmatrix} (a_{11} + b_{11}) + c_{11} & \dots & (a_{1n} + b_{1n}) + c_{1n} \\ \vdots & \ddots & \vdots \\ (a_{m1} + b_{m1}) + c_{m1} & \dots & (a_{mn} + b_{mn}) + c_{mn} \end{bmatrix} = \begin{bmatrix} a_{11} + (b_{11} + c_{11}) & \dots & a_{1n} + (b_{1n} + c_{1n}) \\ \vdots & \ddots & \vdots \\ a_{m1} + (b_{m1} + c_{m1}) & \dots & a_{mn} + (b_{mn} + c_{mn}) \end{bmatrix} = \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + \begin{bmatrix} (b_{11} + c_{11}) & \dots & (b_{1n} + c_{1n}) \\ \vdots & \ddots & \vdots \\ (b_{m1} + c_{m1}) & \dots & (b_{mn} + c_{mn}) \end{bmatrix} = \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + (\begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \dots & b_{mn} \end{bmatrix} + \begin{bmatrix} c_{11} & \dots & c_{1n} \\ \vdots & \ddots & \vdots \\ c_{m1} & \dots & c_{mn} \end{bmatrix}).$$

Existence of additive identity: $$\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + \begin{bmatrix} 0 & \dots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \dots & 0 \end{bmatrix} = \begin{bmatrix} a_{11} + 0 & \dots & a_{1n} + 0 \\ \vdots & \ddots & \vdots \\ a_{m1} + 0 & \dots & a_{mn} + 0 \end{bmatrix} = \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix}.$$

Existence of additive inverses: $$\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + \begin{bmatrix} -a_{11} & \dots & -a_{1n} \\ \vdots & \ddots & \vdots \\ -a_{m1} & \dots & -a_{mn} \end{bmatrix} = \begin{bmatrix} a_{11} - a_{11} & \dots & a_{1n} - a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} - a_{m1} & \dots & a_{mn} - a_{mn} \end{bmatrix} = \begin{bmatrix} 0 & \dots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \dots & 0 \end{bmatrix}.$$

Closure under scalar multiplication: $$a\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} = \begin{bmatrix} aa_{11} & \dots & aa_{1n} \\ \vdots & \ddots & \vdots \\ aa_{m1} & \dots & aa_{mn} \end{bmatrix} \in V.$$

Distributivity of scalar multiplication over addition: $$a(\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \dots & b_{mn} \end{bmatrix}) = a\begin{bmatrix} (a_{11} + b_{11}) & \dots & (a_{1n} + b_{1n}) \\ \vdots & \ddots & \vdots \\ (a_{m1} + b_{m1}) & \dots & (a_{mn} + b_{mn}) \end{bmatrix} = \begin{bmatrix} a(a_{11} + b_{11}) & \dots & a(a_{1n} + b_{1n}) \\ \vdots & \ddots & \vdots \\ a(a_{m1} + b_{m1}) & \dots & a(a_{mn} + b_{mn}) \end{bmatrix} = \begin{bmatrix} aa_{11} + ab_{11} & \dots & aa_{1n} + ab_{1n} \\ \vdots & \ddots & \vdots \\ aa_{m1} + ab_{m1} & \dots & aa_{mn} + ab_{mn} \end{bmatrix} = \begin{bmatrix} aa_{11} & \dots & aa_{1n} \\ \vdots & \ddots & \vdots \\ aa_{m1} & \dots & aa_{mn} \end{bmatrix} + \begin{bmatrix} ab_{11} & \dots & ab_{1n} \\ \vdots & \ddots & \vdots \\ ab_{m1} & \dots & ab_{mn} \end{bmatrix} = a\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + a\begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \dots & b_{mn} \end{bmatrix}.$$

Distributivity of vectors over scalar addition: $$(a + b)\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} = \begin{bmatrix} (a + b)a_{11} & \dots & (a + b)a_{1n} \\ \vdots & \ddots & \vdots \\ (a + b)a_{m1} & \dots & (a + b)a_{mn} \end{bmatrix} = \begin{bmatrix} aa_{11} + ba_{11} & \dots & aa_{1n} + ba_{1n} \\ \vdots & \ddots & \vdots \\ aa_{m1} + ba_{m1} & \dots & aa_{mn} + ba_{mn} \end{bmatrix} = \begin{bmatrix} aa_{11} & \dots & aa_{1n} \\ \vdots & \ddots & \vdots \\ aa_{m1} & \dots & aa_{mn} \end{bmatrix} + \begin{bmatrix} ba_{11} & \dots & ba_{1n} \\ \vdots & \ddots & \vdots \\ ba_{m1} & \dots & ba_{mn} \end{bmatrix} = a\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} + b\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix}.$$

Compatibility of scalar multiplication: $$a(b\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix}) = a\begin{bmatrix} (ba_{11}) & \dots & (ba_{1n}) \\ \vdots & \ddots & \vdots \\ (ba_{m1}) & \dots & (ba_{mn}) \end{bmatrix} = \begin{bmatrix} a(ba_{11}) & \dots & a(ba_{1n}) \\ \vdots & \ddots & \vdots \\ a(ba_{m1}) & \dots & a(ba_{mn}) \end{bmatrix} = \begin{bmatrix} (ab)a_{11} & \dots & (ab)a_{1n} \\ \vdots & \ddots & \vdots \\ (ab)a_{m1} & \dots & (ab)a_{mn} \end{bmatrix} = (ab)\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix}.$$

Multiplicative identity: $$1\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} = \begin{bmatrix} 1a_{11} & \dots & 1a_{1n} \\ \vdots & \ddots & \vdots \\ 1a_{m1} & \dots & 1a_{mn} \end{bmatrix} = \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix}.$$

Since all ten axioms hold, \( M_{mn} \) is a vector space.
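The same kind of numerical spot-check works for matrices. The sketch below (an illustration under the assumption that NumPy arrays stand in for elements of \( M_{23} \)) asserts the axioms for random \( 2 \times 3 \) matrices; the same checks apply to any \( m \times n \).

```python
# Spot-check (not a proof) of the vector-space axioms for 2x3 matrices,
# using NumPy arrays as the matrices and entrywise + and * as the operations.
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((2, 3)) for _ in range(3))
a, b = 2.5, -1.5
Z = np.zeros((2, 3))

assert np.allclose(A + B, B + A)                  # commutativity
assert np.allclose((A + B) + C, A + (B + C))      # associativity
assert np.allclose(A + Z, A)                      # additive identity
assert np.allclose(A + (-A), Z)                   # additive inverse
assert np.allclose(a * (A + B), a * A + a * B)    # distributivity over vectors
assert np.allclose((a + b) * A, a * A + b * A)    # distributivity over scalars
assert np.allclose(a * (b * A), (a * b) * A)      # compatibility
assert np.allclose(1 * A, A)                      # multiplicative identity
```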

(c) $$P_n = \{ p(x) = a_0 + \dots + a_n x^n \mid a_i \in \mathbb{R} \}$$

We check that the set satisfies all the axioms.

Closure under addition: $$a_0 + \dots + a_nx^n + b_0 + \dots + b_nx^n = (a_0 + b_0) + \dots + (a_n + b_n)x^n \in V.$$

Commutativity of addition: $$a_0 + \dots + a_nx^n + b_0 + \dots + b_nx^n = (a_0 + b_0) + \dots + (a_n + b_n)x^n = (b_0 + a_0) + \dots + (b_n + a_n)x^n = b_0 + \dots + b_nx^n + a_0 + \dots + a_nx^n.$$

Associativity of addition: $$(a_0 + \dots + a_nx^n + b_0 + \dots + b_nx^n) + c_0 + \dots + c_nx^n = (a_0 + b_0) + \dots + (a_n + b_n)x^n + c_0 + \dots + c_nx^n = ((a_0 + b_0) + c_0) + \dots + ((a_n + b_n) + c_n)x^n = (a_0 + (b_0 + c_0)) + \dots + (a_n + (b_n + c_n))x^n = a_0 + \dots + a_nx^n + (b_0 + c_0) + \dots + (b_n + c_n)x^n = a_0 + \dots + a_nx^n + (b_0 + \dots + b_nx^n + c_0 + \dots + c_nx^n).$$

Existence of additive identity: $$a_0 + \dots + a_nx^n + 0 + \dots + 0x^n = (a_0 + 0) + \dots + (a_n + 0)x^n = a_0 + \dots + a_nx^n.$$

Existence of additive inverses: $$(a_0 + \dots + a_nx^n) + (-a_0 - \dots - a_nx^n) = (a_0 - a_0) + \dots + (a_n - a_n)x^n = 0 + \dots + 0x^n.$$

Closure under scalar multiplication: $$k(a_0 + \dots + a_nx^n) = ka_0 + \dots + ka_nx^n \in V.$$

Distributivity of scalar multiplication over addition: $$k(a_0 + \dots + a_nx^n + b_0 + \dots + b_nx^n) = k((a_0 + b_0) + \dots + (a_n + b_n)x^n) = k(a_0 + b_0) + \dots + k(a_n + b_n)x^n = ka_0 + kb_0 + \dots + ka_nx^n + kb_nx^n = ka_0 + \dots + ka_nx^n + kb_0 + \dots + kb_nx^n = k(a_0 + \dots + a_nx^n) + k(b_0 + \dots + b_nx^n).$$

Distributivity of vectors over scalar addition: $$(a + b)(a_0 + \dots + a_nx^n) = (a + b)a_0 + \dots + (a + b)a_nx^n = aa_0 + ba_0 + \dots + aa_nx^n + ba_nx^n = aa_0 + \dots + aa_nx^n + ba_0 + \dots + ba_nx^n = a(a_0 + \dots + a_nx^n) + b(a_0 + \dots + a_nx^n).$$

Compatibility of scalar multiplication: $$a(b(a_0 + \dots + a_nx^n)) = a((ba_0) + \dots + (ba_nx^n)) = a(ba_0) + \dots + a(ba_nx^n) = (ab)a_0 + \dots + (ab)a_nx^n = (ab)(a_0 + \dots + a_nx^n).$$

Multiplicative identity: $$1(a_0 + \dots + a_nx^n) = 1a_0 + \dots + 1a_nx^n = a_0 + \dots + a_nx^n.$$

Since all ten axioms hold, \( P_n \) is a vector space.
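Polynomials can be spot-checked the same way by working with coefficient lists, which mirrors the coefficient-wise derivations above. The sketch below (my own illustration; `p_add` and `p_scale` are ad hoc helper names) checks several axioms for \( P_2 \).

```python
# Spot-check (not a proof) for P_2: a polynomial a0 + a1 x + a2 x^2 is
# stored as its coefficient list [a0, a1, a2]; addition and scalar
# multiplication act coefficient-wise, exactly as in the derivations above.

def p_add(p, q):
    return [a + b for a, b in zip(p, q)]

def p_scale(c, p):
    return [c * a for a in p]

p, q = [1.0, -2.0, 3.0], [0.0, 4.0, -1.0]   # 1 - 2x + 3x^2  and  4x - x^2
zero = [0.0, 0.0, 0.0]
a, b = 2.0, -3.0

assert p_add(p, q) == p_add(q, p)                                  # commutativity
assert p_add(p, zero) == p                                         # additive identity
assert p_add(p, p_scale(-1.0, p)) == zero                          # additive inverse
assert p_scale(a, p_add(p, q)) == p_add(p_scale(a, p), p_scale(a, q))
assert p_scale(a + b, p) == p_add(p_scale(a, p), p_scale(b, p))
assert p_scale(1.0, p) == p                                        # identity scalar
```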

Subspaces

A subspace is a vector space contained inside another vector space: a subset that is itself a vector space under the same addition and scalar multiplication, so every operation in the larger space also works inside the subspace.

Definition

Let \( V \) be a vector space over a field \( F \). A subspace of \( V \) is a nonempty subset \( W \subseteq V \) that is itself a vector space under the same operations of addition and scalar multiplication defined on \( V \).

Subspace Test

We don’t need to verify all ten axioms to check whether a subset \( W \) of \( V \) is a subspace. Instead, we can use the following test.

A subset \( W \subseteq V \) is a subspace of \( V \) if and only if it satisfies all three of the following conditions:

\( W \) contains the zero vector of \( V \).

\( W \) is closed under vector addition: If \( u, v \in W \), then \( u + v \in W \).

\( W \) is closed under scalar multiplication: If \( c \in F \) and \( v \in W \), then \( cv \in W \).

If all three conditions hold, then \( W \) automatically satisfies all the vector space axioms and is therefore a subspace of \( V \).
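The three conditions translate directly into code. The sketch below (an illustration; `passes_subspace_test` and its arguments are names I made up) checks the zero vector and the two closure conditions against sample elements, and applies it to a non-example: the first quadrant of \( \mathbb{R}^2 \) contains the zero vector and is closed under addition, but scaling by a negative number leaves it.

```python
# A minimal sketch of the three-condition subspace test as code: given a
# membership predicate for W and some sample elements, spot-check the zero
# vector, closure under addition, and closure under scalar multiplication.
# (Passing on samples is evidence, not a proof; a proof needs algebra.)
import numpy as np

def passes_subspace_test(in_W, samples, zero, scalars=(-2.0, 0.5, 3.0)):
    if not in_W(zero):                      # condition 1: zero vector
        return False
    for u in samples:
        for v in samples:
            if not in_W(u + v):             # condition 2: closed under addition
                return False
        for c in scalars:
            if not in_W(c * u):             # condition 3: closed under scaling
                return False
    return True

# Non-example: the first quadrant of R^2 fails condition 3 (scaling by -2).
quadrant = lambda v: v[0] >= 0 and v[1] >= 0
pts = [np.array([1.0, 2.0]), np.array([3.0, 0.0])]
assert not passes_subspace_test(quadrant, pts, np.zeros(2))
```

Each set in Example 2 below would pass this check for any choice of samples, which is what the algebraic verifications establish in general.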

Example

Example 2: Show that the following are subspaces:
(a) $$W = \{ (x, 2x) \mid x \in \mathbb{R} \} \subseteq \mathbb{R}^2$$
(b) $$W = \{ \begin{bmatrix} a & b \\ b & c \end{bmatrix} \mid a, b, c \in \mathbb{R} \} \subseteq M_{22}$$
(c) $$W = \{ p(x) \in P_n \mid p(-x) = p(x) \} \subseteq P_n$$

Solution: (a) $$W = \{ (x, 2x) \mid x \in \mathbb{R} \} \subseteq \mathbb{R}^2$$

We check that the set satisfies all conditions of the test for subspaces:

Zero vector: When \( x = 0 \), we have \( (0, 0) \in W \).

Closure under addition: $$(x_1, 2x_1) + (x_2, 2x_2) = (x_1 + x_2, 2x_1 + 2x_2) = (x_1 + x_2, 2(x_1 + x_2)) \in W.$$

Closure under scalar multiplication: \( c(x, 2x) = (cx, 2cx) \in W \).

Since all three conditions are satisfied, \( W \) is a subspace of \( \mathbb{R}^2 \).
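The same three checks can be run numerically. The sketch below (my own illustration) encodes membership in \( W \) as the condition \( y = 2x \) and asserts the three subspace conditions for sample points.

```python
# Spot-check (not a proof) that W = {(x, 2x)} satisfies the three subspace
# conditions, using NumPy vectors and the membership condition y == 2x.
import numpy as np

def in_W(v):
    return np.isclose(v[1], 2 * v[0])

u, v = np.array([1.5, 3.0]), np.array([-2.0, -4.0])

assert in_W(np.zeros(2))        # condition 1: zero vector
assert in_W(u + v)              # condition 2: closed under addition
assert in_W(-3.0 * u)           # condition 3: closed under scalar multiplication
```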

(b) $$W = \{ \begin{bmatrix} a & b \\ b & c \end{bmatrix} \mid a, b, c \in \mathbb{R} \} \subseteq M_{22}$$

We check that the set satisfies all conditions of the test for subspaces:

Zero vector: When \( a = b= c = 0 \), we have \( \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix} \in W \).

Closure under addition: $$\begin{bmatrix} a_1 & b_1 \\ b_1 & c_1 \end{bmatrix} + \begin{bmatrix} a_2 & b_2 \\ b_2 & c_2 \end{bmatrix} = \begin{bmatrix} a_1 + a_2 & b_1 + b_2 \\ b_1 + b_2 & c_1 + c_2 \end{bmatrix} \in W.$$

Closure under scalar multiplication: $$k\begin{bmatrix} a & b \\ b & c \end{bmatrix} = \begin{bmatrix} ka & kb \\ kb & kc \end{bmatrix} \in W.$$

Since all three conditions are satisfied, \( W \) is a subspace of \( M_{22} \).
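Numerically, membership in this set is the symmetry condition \( A = A^T \), and the sketch below (my own illustration) checks that symmetry survives the zero matrix, addition, and scaling.

```python
# Spot-check (not a proof) that the symmetric 2x2 matrices satisfy the three
# subspace conditions: the condition A == A^T is preserved by + and scaling.
import numpy as np

def is_symmetric(A):
    return np.allclose(A, A.T)

A = np.array([[1.0, 2.0], [2.0, 5.0]])
B = np.array([[0.0, -3.0], [-3.0, 4.0]])

assert is_symmetric(np.zeros((2, 2)))   # condition 1: zero matrix
assert is_symmetric(A + B)              # condition 2: closed under addition
assert is_symmetric(-2.5 * A)           # condition 3: closed under scaling
```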

(c) $$W = \{ p(x) \in P_n \mid p(-x) = p(x) \} \subseteq P_n$$

We check that the set satisfies all conditions of the test for subspaces:

Zero vector: \( p(x) = 0 \) is even, so \( 0 \in W \).

Closure under addition: If \( p(x) \) and \( q(x) \) are even, then $$(p + q)(-x) = p(-x) + q(-x) = p(x) + q(x) = (p + q)(x),$$ so \( p + q \in W \).

Closure under scalar multiplication: If \( p(x) \) is even, then $$(cp)(-x) = cp(-x) = cp(x) = (cp)(x),$$ so \( cp \in W \).

Since all three conditions are satisfied, \( W \) is a subspace of \( P_n \).
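In coefficient form, a polynomial is even exactly when all of its odd-degree coefficients are zero, and that property is clearly preserved by coefficient-wise addition and scaling. The sketch below (my own illustration, with ad hoc helper names) spot-checks the three conditions for even polynomials in \( P_4 \).

```python
# Spot-check (not a proof) that the even polynomials in P_4 satisfy the three
# subspace conditions; p is stored as coefficients [a0, ..., a4], and
# p(-x) = p(x) exactly when every odd-degree coefficient is zero.

def is_even_poly(coeffs):
    return all(c == 0 for c in coeffs[1::2])   # odd-index coefficients vanish

def p_add(p, q):
    return [a + b for a, b in zip(p, q)]

def p_scale(c, p):
    return [c * a for a in p]

p = [1.0, 0.0, -3.0, 0.0, 2.0]   # 1 - 3x^2 + 2x^4
q = [0.0, 0.0, 5.0, 0.0, 0.0]    # 5x^2

assert is_even_poly([0.0] * 5)           # condition 1: zero polynomial
assert is_even_poly(p_add(p, q))         # condition 2: closed under addition
assert is_even_poly(p_scale(-4.0, p))    # condition 3: closed under scaling
```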

Conclusion

Understanding vector spaces and subspaces is an important step in understanding linear algebra. Through examples, from \( \mathbb{R}^n \) to matrices and polynomials, we’ve seen how vector spaces provide the rules that allow for addition and scalar multiplication, while subspaces reveal the subsets that exist within these spaces. Verifying vector space and subspace properties helps build intuition, showing why seemingly different objects can all fit within the same abstract structure.

Further Reading

A Comprehensive Beginner’s Guide to Partial Fraction Decomposition – Now that you understand vector spaces, I recommend revisiting partial fraction decomposition in light of this new perspective.

Frequently Asked Questions

Why does the subspace test not require checking all ten axioms?

Showing that the zero vector belongs to the set is equivalent to showing it is nonempty. Furthermore, since the subset inherits its operations from the vector space, those operations automatically satisfy the remaining axioms, provided the subset is closed under them.