
Vectors in Euclidean Space

A vector in $n$-dimensional Euclidean space $\mathbb{R}^n$ is an ordered list of $n$ real numbers, usually written as a column vector:

$$\vec{v} = (v_1, v_2, \ldots, v_n) = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$

Note: We often use boldface ($\mathbf{v}$) or arrows ($\vec{v}$) to denote vectors, distinguishing them from scalars (ordinary numbers).
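
As a minimal sketch of this representation (assuming NumPy is available; it is not part of the original text), a vector in $\mathbb{R}^3$ can be stored as an array, with the column-vector layout made explicit via reshape:

```python
import numpy as np

# A vector in R^3 as a 1-D array: an ordered list of 3 real numbers.
v = np.array([2.0, 1.0, 3.0])

print(v.shape)           # (3,)
print(v.reshape(-1, 1))  # the same data laid out as a 3x1 column vector
```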

Geometric Interpretation

Visualizing Vectors in 2D

[Interactive figure: a draggable vector $\vec{u} = (3.0, 2.0)$ in the plane.]

Vector Arithmetic

Given two vectors $\vec{u} = (u_1, u_2, \ldots, u_n)$ and $\vec{v} = (v_1, v_2, \ldots, v_n)$ in $\mathbb{R}^n$ and a scalar $c \in \mathbb{R}$, we define vector addition and scalar multiplication componentwise:

$$\vec{u} + \vec{v} = (u_1 + v_1, u_2 + v_2, \ldots, u_n + v_n), \qquad c\vec{v} = (cv_1, cv_2, \ldots, cv_n)$$

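Componentwise addition and scaling are exactly what array libraries do. A minimal sketch assuming NumPy, using the same values as the parallelogram figure below:

```python
import numpy as np

u = np.array([4.0, 1.0])
v = np.array([1.0, 2.0])
c = 3.0

# Addition and scalar multiplication act on each component.
print(u + v)  # [5. 3.]  = (u1 + v1, u2 + v2)
print(c * v)  # [3. 6.]  = (c*v1, c*v2)
```
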
Think of vectors as displacements.

  • First walk along $\vec{u}$ from the origin
  • Then walk along $\vec{v}$ from where you ended
  • You end up at the same place as if you had walked along $\vec{u} + \vec{v}$ directly from the origin

Parallelogram Rule for Vector Addition

[Interactive figure: $\vec{u} = (4.0, 1.0)$ and $\vec{v} = (1.0, 2.0)$ as adjacent sides of a parallelogram, with the diagonal $\vec{u} + \vec{v} = (5.0, 3.0)$.]

Vector subtraction can be understood as adding the opposite vector: $\vec{u} - \vec{v} = \vec{u} + (-\vec{v})$.

  1. The red dashed vector $-\vec{v}$ is $\vec{v}$ reversed (opposite direction, same magnitude)
  2. Form a parallelogram with $\vec{u}$ and $-\vec{v}$ as adjacent sides
  3. The purple diagonal from the origin is $\vec{u} - \vec{v}$ (by the parallelogram law)

Vector Subtraction

[Interactive figure: $\vec{u} = (1.0, 3.0)$, $\vec{v} = (5.0, 1.0)$, the reversed vector $-\vec{v} = (-5.0, -1.0)$, and the difference $\vec{u} - \vec{v} = (-4.0, 2.0)$.]

The lighter purple vector from the tip of $\vec{v}$ to the tip of $\vec{u}$ also represents $\vec{u} - \vec{v}$. Think of $\vec{u}$ as your position and $\vec{v}$ as a friend's position; then $\vec{u} - \vec{v}$ is the displacement pointing from your friend to you.
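
Both viewpoints can be spot-checked numerically. A sketch assuming NumPy, with the values from the subtraction figure above:

```python
import numpy as np

u = np.array([1.0, 3.0])
v = np.array([5.0, 1.0])

# Subtraction as adding the opposite vector: u - v == u + (-v).
print(u - v)     # [-4.  2.]
print(u + (-v))  # [-4.  2.]

# Displacement reading: the vector pointing from the friend's position (v) to yours (u).
friend, me = v, u
print(me - friend)  # [-4.  2.]
```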

Scalar multiplication stretches or shrinks a vector and may reverse its direction:

  • If $c > 1$: Stretches $\vec{v}$ by a factor of $c$ (same direction)
  • If $0 < c < 1$: Shrinks $\vec{v}$ by a factor of $c$ (same direction)
  • If $c = 0$: Results in the zero vector $\vec{0}$
  • If $c < 0$: Reverses direction and scales the length by $|c|$

For $\vec{v} = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$:

  • $2\vec{v} = \begin{bmatrix} 4 \\ 2 \end{bmatrix}$ — twice as long, same direction
  • $\frac{1}{2}\vec{v} = \begin{bmatrix} 1 \\ 0.5 \end{bmatrix}$ — half as long, same direction
  • $-\vec{v} = \begin{bmatrix} -2 \\ -1 \end{bmatrix}$ — same length, opposite direction
  • $-3\vec{v} = \begin{bmatrix} -6 \\ -3 \end{bmatrix}$ — three times as long, opposite direction
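
The length scaling by $|c|$ can be checked with np.linalg.norm; a small sketch assuming NumPy:

```python
import numpy as np

v = np.array([2.0, 1.0])

for c in (2.0, 0.5, -1.0, -3.0):
    w = c * v
    # The ratio of lengths is |c|; a negative c flips the direction.
    print(c, w, np.linalg.norm(w) / np.linalg.norm(v))
# Expected ratios (up to floating-point rounding): 2.0, 0.5, 1.0, 3.0
```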

For any vectors $\vec{u}, \vec{v}, \vec{w}$ in $\mathbb{R}^n$, vector addition satisfies the following properties:

  1. Commutativity: $\vec{u} + \vec{v} = \vec{v} + \vec{u}$
  2. Associativity: $(\vec{u} + \vec{v}) + \vec{w} = \vec{u} + (\vec{v} + \vec{w})$
  3. Identity: $\vec{u} + \vec{0} = \vec{u}$
  4. Inverse: $\vec{u} + (-\vec{u}) = \vec{0}$, where $-\vec{u} = \begin{bmatrix} -u_1 \\ -u_2 \\ \vdots \\ -u_n \end{bmatrix}$
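
These properties hold exactly in real arithmetic; they can be spot-checked numerically with np.allclose (used here only to absorb floating-point rounding). A sketch assuming NumPy, with arbitrary illustrative vectors:

```python
import numpy as np

u, v, w = np.array([4.0, 1.0]), np.array([1.0, 2.0]), np.array([-2.0, 5.0])
zero = np.zeros_like(u)

print(np.allclose(u + v, v + u))              # 1. commutativity
print(np.allclose((u + v) + w, u + (v + w)))  # 2. associativity
print(np.allclose(u + zero, u))               # 3. identity
print(np.allclose(u + (-u), zero))            # 4. inverse
```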

For any vectors $\vec{u}, \vec{v}$ in $\mathbb{R}^n$ and scalars $c, d$, scalar multiplication satisfies the following properties:

  1. Associativity: $c(d\vec{v}) = (cd)\vec{v}$
  2. Distributivity over vector addition: $c(\vec{u} + \vec{v}) = c\vec{u} + c\vec{v}$
  3. Distributivity over scalar addition: $(c + d)\vec{v} = c\vec{v} + d\vec{v}$
  4. Identity: $1\vec{v} = \vec{v}$
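
The scalar-multiplication rules can be checked the same way (again a sketch assuming NumPy, with arbitrary values):

```python
import numpy as np

u, v = np.array([4.0, 1.0]), np.array([1.0, 2.0])
c, d = 2.0, -3.0

print(np.allclose(c * (d * v), (c * d) * v))    # 1. associativity
print(np.allclose(c * (u + v), c * u + c * v))  # 2. distributivity over vector addition
print(np.allclose((c + d) * v, c * v + d * v))  # 3. distributivity over scalar addition
print(np.allclose(1 * v, v))                    # 4. identity
```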

Linear Combinations

A linear combination of vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_k$ is an expression of the form:

$$c_1\vec{v}_1 + c_2\vec{v}_2 + \cdots + c_k\vec{v}_k$$

where $c_1, c_2, \ldots, c_k$ are scalars (called coefficients or weights).

Let $\vec{v}_1 = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$, $\vec{v}_2 = \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix}$, and $\vec{v}_3 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$.

Compute $\vec{v}_1 + 2\vec{v}_2 - \vec{v}_3$:

$$
\begin{aligned}
\vec{v}_1 + 2\vec{v}_2 - \vec{v}_3
&= \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + 2\begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} - \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} \\
&= \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + \begin{bmatrix} 0 \\ 2 \\ 4 \end{bmatrix} + \begin{bmatrix} -1 \\ -1 \\ 0 \end{bmatrix} \\
&= \begin{bmatrix} 1 + 0 - 1 \\ 0 + 2 - 1 \\ 1 + 4 + 0 \end{bmatrix} \\
&= \begin{bmatrix} 0 \\ 1 \\ 5 \end{bmatrix}
\end{aligned}
$$
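
The same computation in code, as a sketch assuming NumPy; writing the vectors as the columns of a matrix also shows the linear combination as a matrix-vector product:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 2.0])
v3 = np.array([1.0, 1.0, 0.0])

# Linear combination with coefficients (1, 2, -1): v1 + 2*v2 - v3.
result = 1 * v1 + 2 * v2 - v3
print(result)  # [0. 1. 5.]

# Equivalent view: the matrix with columns v1, v2, v3 times the coefficient vector.
A = np.column_stack([v1, v2, v3])
print(A @ np.array([1.0, 2.0, -1.0]))  # [0. 1. 5.]
```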