Linear algebra is the study of vector spaces and transformations between these vector spaces. A naive view of a vector would be as an $n$-tuple of scalars (which are some objects like real numbers). If $F$ is our set of scalars, then an $n$-dimensional vector is an element of $F^n$. It is popular to write vectors as $\vec{v}$, where the arrow is supposed to denote an arrow pointing from the origin to some point in a Euclidean-like space. Though this view of a vector space as some algebra in Euclidean space akin to coordinate geometry is prevalent, it is but one particular case of vector spaces. We will see over time that it is indeed possible to have a one-to-one correspondence between any finite-dimensional vector space and $F^n$, and yet we should keep in mind that vectors are abstract objects that could be anything satisfying the axioms.

Vector Spaces

Fields

First, we will have to define the scalars that scale a vector. Scalars are elements of an algebraic structure called a field, which satisfies the following axioms.

Axioms of Addition

  1. Commutativity of addition: $x+y=y+x$
  2. Associativity of addition: $x+(y+z)=(x+y)+z$
  3. Existence of an additive identity: $\exists\ 0\ni x+0=x$
  4. Existence of additive inverse: $\exists\ -x\ni x+(-x)=0$

Axioms of Multiplication

  1. Commutativity of multiplication: $xy=yx$
  2. Associativity of multiplication: $x(yz)=(xy)z$
  3. Existence of a multiplicative identity: $\exists\ 1\ni x1 = x$
  4. Existence of multiplicative inverse: $\forall x\ne0\ \exists x^{-1}\ni xx^{-1}=1$
  5. Distributivity of multiplication over addition: $x(y + z)=xy+xz$
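As a concrete check of these axioms, we can verify them exhaustively for a small finite field. The sketch below uses the integers modulo 5 (often written GF(5)), which is our choice of example rather than anything from the text above:

```python
# A minimal sketch: verifying the field axioms exhaustively for the
# integers modulo 5, a finite field.
p = 5
F = range(p)
add = lambda x, y: (x + y) % p
mul = lambda x, y: (x * y) % p

# Commutativity and associativity of addition and multiplication.
assert all(add(x, y) == add(y, x) for x in F for y in F)
assert all(add(x, add(y, z)) == add(add(x, y), z)
           for x in F for y in F for z in F)
assert all(mul(x, y) == mul(y, x) for x in F for y in F)
assert all(mul(x, mul(y, z)) == mul(mul(x, y), z)
           for x in F for y in F for z in F)

# Identities and inverses: 0 and 1 behave as required, every element
# has an additive inverse, and every nonzero element has a
# multiplicative inverse.
assert all(add(x, 0) == x and mul(x, 1) == x for x in F)
assert all(any(add(x, y) == 0 for y in F) for x in F)
assert all(any(mul(x, y) == 1 for y in F) for x in F if x != 0)

# Distributivity of multiplication over addition.
assert all(mul(x, add(y, z)) == add(mul(x, y), mul(x, z))
           for x in F for y in F for z in F)
```

The same checks would fail for the integers modulo 6: there, 2 has no multiplicative inverse, which is why only prime moduli give fields.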

Axioms of Vector Spaces

A vector space consists of two sets of objects: a field $F$ of scalars and a set $V$ of vectors, equipped with vector addition and scalar multiplication. These operations must satisfy the following axioms.

Axioms of Vector Addition

  1. $\alpha+\beta=\beta+\alpha$
  2. $\alpha+(\beta+\gamma)=(\alpha+\beta)+\gamma$
  3. $\exists\ 0\in V\ni\alpha+0=\alpha$
  4. $\forall\ \alpha:\exists\ -\alpha\ni\alpha+(-\alpha)=0$

Axioms of Scalar Multiplication

  1. $\forall\ \alpha\in V:1\alpha=\alpha$
  2. $(c_1c_2)\alpha=c_1(c_2\alpha)$
  3. $c(\alpha+\beta)=c\alpha+c\beta$
  4. $(c_1+c_2)\alpha=c_1\alpha+c_2\alpha$
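To stress that vectors need not be geometric arrows, consider polynomials of degree less than 3 with rational coefficients: they form a vector space over $\mathbb{Q}$. The sketch below spot-checks a few of the axioms above; representing each polynomial as a coefficient triple is our own choice of encoding:

```python
from fractions import Fraction

# Polynomials of degree < 3 over Q, stored as coefficient triples
# (a0, a1, a2) representing a0 + a1*x + a2*x^2.
def vadd(p, q):
    return tuple(a + b for a, b in zip(p, q))

def smul(c, p):
    return tuple(c * a for a in p)

zero = (Fraction(0),) * 3
p = (Fraction(1), Fraction(2), Fraction(0))   # 1 + 2x
q = (Fraction(0), Fraction(1), Fraction(3))   # x + 3x^2

assert vadd(p, q) == vadd(q, p)                   # commutativity
assert vadd(p, zero) == p                         # additive identity
assert vadd(p, smul(Fraction(-1), p)) == zero     # additive inverse
c1, c2 = Fraction(2), Fraction(3, 2)
assert smul(c1 * c2, p) == smul(c1, smul(c2, p))  # (c1 c2)α = c1(c2 α)
assert smul(c1, vadd(p, q)) == vadd(smul(c1, p), smul(c1, q))
```

Here the "vectors" are polynomials and the scalars are rationals; nothing about the axioms requires an arrow in Euclidean space.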

Vector Subspaces

Linear Combination and Span

We will define a few terms here before further discussion of how they help us characterize a vector space.

A vector $\beta$ is a linear combination of the vectors $\alpha_1,\ldots,\alpha_n$ if there exist scalars $c_1,\ldots,c_n$ such that $\beta=\sum_{i=1}^n c_i\alpha_i$.

The span of a set, $S$ of vectors in $V$ is defined as the set of all vectors that are linear combinations of vectors in $S$.
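Deciding whether a given vector lies in the span of a finite set amounts to solving the linear system $\sum_i c_i\alpha_i=\beta$. A sketch of such a membership test, using Gaussian elimination over the rationals (the function name and the elimination approach are our own choices):

```python
from fractions import Fraction

def in_span(beta, vectors):
    """Is beta a linear combination of `vectors`?  Row-reduce the
    augmented system whose columns are the given vectors and whose
    right-hand side is beta, then check consistency."""
    n, m = len(beta), len(vectors)
    A = [[Fraction(v[i]) for v in vectors] + [Fraction(beta[i])]
         for i in range(n)]
    row = 0
    for col in range(m):
        piv = next((r for r in range(row, n) if A[r][col] != 0), None)
        if piv is None:
            continue
        A[row], A[piv] = A[piv], A[row]
        for r in range(n):
            if r != row and A[r][col] != 0:
                f = A[r][col] / A[row][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[row])]
        row += 1
    # Inconsistent iff some row is zero except for its last entry.
    return all(any(x != 0 for x in r[:m]) or r[m] == 0 for r in A)

# 1*(1,0,1) + 2*(0,1,1) = (1,2,3), so (1,2,3) is in the span:
print(in_span((1, 2, 3), [(1, 0, 1), (0, 1, 1)]))   # True
print(in_span((1, 2, 4), [(1, 0, 1), (0, 1, 1)]))   # False
```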

A set $W\subseteq V$ is a subspace of the vector space $V$ if $W$ is itself a vector space over $F$ under the vector addition and scalar multiplication inherited from $V$; that is, $W$ is closed under these operations.

The intersection of any collection of subspaces of $V$ is a subspace of $V$. In particular, the intersection of all subspaces of $V$ containing $S$ is the span of $S$.

Basis and Coordinates

A set of vectors $\set{\alpha_1,\ldots,\alpha_n}$ is called linearly independent if $\sum_{i=1}^nc_i\alpha_i=0$ only if each $c_i=0$.
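This definition can also be checked mechanically: a finite set of vectors is independent exactly when none of them is redundant, i.e. when the rank of the system they form equals their count. A sketch over the rationals (again using Gaussian elimination, our own choice of method):

```python
from fractions import Fraction

def rank(vectors):
    """Rank of a list of vectors, via Gaussian elimination over Q."""
    A = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(A[0]) if A else 0):
        piv = next((i for i in range(r, len(A)) if A[i][col] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for i in range(len(A)):
            if i != r and A[i][col] != 0:
                f = A[i][col] / A[r][col]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

def independent(vectors):
    # Independent iff the only solution to sum c_i*α_i = 0 is trivial,
    # which happens exactly when the rank equals the number of vectors.
    return rank(vectors) == len(vectors)

print(independent([(1, 0, 1), (0, 1, 1)]))             # True
print(independent([(1, 0, 1), (0, 1, 1), (1, 1, 2)]))  # False: third = first + second
```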

A basis for $V$ is a linearly independent set of vectors that spans $V$. The dimension of $V$ is the size of a basis for $V$. Once we have discussed matrices, we will prove that any two bases for $V$ have the same size. So, without proving it for now, let us start attributing dimension as a property of $V$ itself.

An ordered basis is a basis with a sequence defined on it. Since a basis $\mathcal{B}=\{\alpha_1,\ldots,\alpha_n\}$ spans the entire vector space, every vector in $V$ is some linear combination of vectors in this basis, i.e., every vector $\alpha$ can be written as $\sum_{i=1}^nc_i\alpha_i$. Since the basis is ordered by a sequence $\langle\alpha_1,\ldots,\alpha_n\rangle$, we can order the scalars in the linear combination too and get a sequence $\langle c_1,\ldots,c_n\rangle$. We can write this sequence as a tuple $(c_1,\ldots,c_n)$, which is the coordinate tuple of $\alpha$ with respect to the ordered basis $\mathcal{B}$.
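Computing coordinates means solving $\sum_i c_i\alpha_i=\alpha$ for the $c_i$. For a two-dimensional example this can be done in closed form; the sketch below uses Cramer's rule on the ordered basis $\langle(1,1),(1,-1)\rangle$ of $\mathbb{Q}^2$, both choices our own:

```python
from fractions import Fraction

def coords_2d(alpha, b1, b2):
    """Solve c1*b1 + c2*b2 = alpha by Cramer's rule (2x2 case)."""
    det = Fraction(b1[0] * b2[1] - b2[0] * b1[1])
    c1 = Fraction(alpha[0] * b2[1] - b2[0] * alpha[1]) / det
    c2 = Fraction(b1[0] * alpha[1] - alpha[0] * b1[1]) / det
    return (c1, c2)

# α = (3, 1) with respect to B = ⟨(1, 1), (1, -1)⟩:
#   c1 + c2 = 3 and c1 - c2 = 1, so c1 = 2, c2 = 1.
print(coords_2d((3, 1), (1, 1), (1, -1)))   # (Fraction(2, 1), Fraction(1, 1))
```

Note that the same vector gets different coordinates under a different ordered basis; the tuple only means something relative to $\mathcal{B}$.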

Uniqueness of Coordinates

Let’s say a vector $\alpha$ can be expressed as a linear combination of basis vectors in two different ways: $\sum_{i=1}^nc_i\alpha_i$ and $\sum^n_{i=1}c_i^\prime\alpha_i$. Then we have,

$$ \begin{align*} &\sum_{i=1}^nc_i\alpha_i=\sum_{i=1}^nc_i^\prime\alpha_i\\ \implies&\sum_{i=1}^nc_i\alpha_i-\sum_{i=1}^nc_i^\prime\alpha_i=0\\ \implies&\sum_{i=1}^n(c_i-c_i^\prime)\alpha_i=0 \end{align*} $$ Since $\set{\alpha_1,\ldots,\alpha_n}$ is an independent set, the equation holds only if $c_i-c_i^\prime=0$ for all $i$, i.e., $c_i=c_i^\prime$. Hence each vector $\alpha$ has a unique linear combination, and also unique coordinates with respect to the ordered basis.

Since these coordinates are elements of $F^n$ where $n=\mathrm{dim}\ V$, we have constructed a bijection from any finite-dimensional vector space to $F^n$, once an ordered basis is fixed. We can view any such vector space in terms of $F^n$, keeping in mind that these coordinates are the scalars in the linear combination of the basis vectors in our actual vector space.
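As a concrete instance of this correspondence, take the polynomials of degree less than 3 over $\mathbb{Q}$ with the ordered basis $\langle 1, x, x^2\rangle$: the coordinate map just reads off coefficients, and its inverse rebuilds the polynomial. A sketch (the dictionary encoding of polynomials is our own choice):

```python
from fractions import Fraction

# Ordered basis ⟨1, x, x²⟩ for polynomials of degree < 3 over Q.
# The coordinate map reads off the coefficients; its inverse rebuilds
# the polynomial, so together they form a bijection onto Q³.
def to_coords(poly):            # poly: dict {power: coefficient}
    return tuple(Fraction(poly.get(k, 0)) for k in range(3))

def from_coords(c):
    return {k: c[k] for k in range(3) if c[k] != 0}

p = {0: 3, 1: 2}                # the polynomial 3 + 2x
assert to_coords(p) == (Fraction(3), Fraction(2), Fraction(0))
assert from_coords(to_coords(p)) == {0: Fraction(3), 1: Fraction(2)}
```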