Chapter 2

The document provides an overview of vector spaces in linear algebra, defining a vector space as a set of vectors with operations of addition and scalar multiplication that satisfy specific axioms. It includes examples such as Euclidean space, n-tuple space, and spaces of matrices and functions, illustrating how these structures meet the criteria of vector spaces. Additionally, it discusses concepts like linear combinations and subspaces, along with problems to demonstrate understanding of these concepts.


Vector Spaces

Dr. Jagannath Bhanja


IIITDM Kancheepuram, Chennai
Mathematical structures in linear algebra

(1) Field (See Chapter 1)


(2) Vector Space
(3) ...........

1
A vector space V over a field F

A vector space ⟨ V , F , +, . ⟩ consists of the following:

(1) a field F of scalars;


(2) a set V of objects, called vectors;
(3) an operation + : V × V −→ V , called vector addition, which
satisfies the following axioms:
(a) addition is commutative

α + β = β + α, for all α, β ∈ V ;

(b) addition is associative

α + (β + γ) = (α + β) + γ, for all α, β, γ ∈ V ;

2
contd.

(c) there is a unique vector 0 ∈ V called the zero vector such that

α + 0 = α, for all α ∈ V ;

(d) for each vector α ∈ V , there is a unique vector −α ∈ V


such that α + (−α) = 0;

(4) an operation . : F × V −→ V , called scalar multiplication,


which satisfies the following axioms.
(e) 1.α = α, for all α ∈ V ;
(f) (c1 c2 )α = c1 (c2 α), for all c1 , c2 ∈ F , α ∈ V ;
(g) c(α + β) = cα + cβ, for all α, β ∈ V , c ∈ F ;
(h) (c1 + c2 )α = c1 α + c2 α, for all c1 , c2 ∈ F , α ∈ V .

3
Example 1 : The Euclidean space ⟨Rn , R, +, .⟩

V = Rn = {(x1 , x2 , . . . , xn ) : xi ∈ R} and F = R.

Let α = (x1 , x2 , . . . , xn ), β = (y1 , y2 , . . . , yn ) ∈ V


Let us define vector addition and scalar multiplication as follows:
Define + : V × V −→ V as (vector addition)

α + β = (x1 + y1 , x2 + y2 , . . . , xn + yn )

and . : F × V −→ V as

cα = (cx1 , cx2 , . . . , cxn )

Then ⟨Rn , R, +, .⟩ is a vector space.

4
Verification

(a)

α + β = (x1 + y1 , x2 + y2 , . . . , xn + yn )
= (y1 + x1 , y2 + x2 , . . . , yn + xn )
= β + α.

(b) Let γ = (z1 , z2 , . . . , zn ). Then

α + (β + γ) = (x1 , x2 , . . . , xn ) + (y1 + z1 , y2 + z2 , . . . , yn + zn )
= (x1 + y1 + z1 , x2 + y2 + z2 , . . . , xn + yn + zn )
= (x1 + y1 , x2 + y2 , . . . , xn + yn ) + (z1 , z2 , . . . , zn )
= (α + β) + γ.

5
contd.

(c) Let 0 = (0, 0, . . . , 0) ∈ V = Rn . Then


α + 0 = (x1 , . . . , xn ) + (0, . . . , 0) = (x1 , . . . , xn ) = α.
(d) For every α = (x1 , x2 , . . . , xn ), there exists
−α = (−x1 , −x2 , . . . , −xn ) ∈ V such that α + (−α) = 0.
(e) 1 · α = (1 · x1 , 1 · x2 , . . . , 1 · xn ) = (x1 , x2 , . . . , xn ) = α.
(f) For scalars c1 , c2 and a vector α = (x1 , x2 , . . . , xn ), we have

(c1 c2 ) · α = (c1 c2 x1 , c1 c2 x2 , . . . , c1 c2 xn )
= c1 · (c2 x1 , c2 x2 , . . . , c2 xn )
= c1 · (c2 · α).

6
contd.

(g) For α = (x1 , x2 , . . . , xn ) and β = (y1 , y2 , . . . , yn ), we have

c(α + β) = c(x1 + y1 , x2 + y2 , . . . , xn + yn )
= (c · (x1 + y1 ), c · (x2 + y2 ), . . . , c · (xn + yn ))
= (cx1 + cy1 , cx2 + cy2 , . . . , cxn + cyn )
= cα + cβ.

(h) For scalars c1 , c2 and a vector α = (x1 , x2 , . . . , xn ), we have

(c1 + c2 ) · α = ((c1 + c2 )x1 , (c1 + c2 )x2 , . . . , (c1 + c2 )xn )


= (c1 x1 + c2 x1 , c1 x2 + c2 x2 , . . . , c1 xn + c2 xn )
= (c1 x1 , c1 x2 , . . . , c1 xn ) + (c2 x1 , c2 x2 , . . . , c2 xn )
= c1 · α + c2 · α.
7
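The eight axioms just verified for ⟨Rn , R, +, .⟩ can also be spot-checked numerically. The following is a minimal sketch, assuming NumPy is available; the dimension n and the random vectors are illustrative choices, and a numerical check is of course not a proof:

```python
# Numerical spot-check of axioms (a)-(h) for R^n; a sanity check, not a proof.
import numpy as np

rng = np.random.default_rng(0)
n = 4
alpha, beta, gamma = rng.standard_normal((3, n))   # three sample vectors
c1, c2 = rng.standard_normal(2)                    # two sample scalars

assert np.allclose(alpha + beta, beta + alpha)                       # (a) commutativity
assert np.allclose(alpha + (beta + gamma), (alpha + beta) + gamma)   # (b) associativity
assert np.allclose(alpha + np.zeros(n), alpha)                       # (c) zero vector
assert np.allclose(alpha + (-alpha), np.zeros(n))                    # (d) additive inverse
assert np.allclose(1.0 * alpha, alpha)                               # (e) unit scalar
assert np.allclose((c1 * c2) * alpha, c1 * (c2 * alpha))             # (f)
assert np.allclose(c1 * (alpha + beta), c1 * alpha + c1 * beta)      # (g)
assert np.allclose((c1 + c2) * alpha, c1 * alpha + c2 * alpha)       # (h)
```

All eight assertions pass for the sampled vectors and scalars, mirroring the componentwise verification above.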
Example 2: The n-tuple space ⟨F n , F , +, .⟩

Let F be a field and let

V = F n = {(x1 , x2 , . . . , xn ) : xi ∈ F } .

Let α = (x1 , x2 , . . . , xn ), β = (y1 , y2 , . . . , yn ) ∈ V = F n .


Define + : V × V −→ V as (vector addition)

α + β = (x1 + y1 , x2 + y2 , . . . , xn + yn )

and . : F × V −→ V as

cα = (cx1 , cx2 , . . . , cxn )

Show that ⟨F n , F , +, .⟩ is a vector space.


8
Example 3 : The space of m × n matrices ⟨F m×n , F , +, .⟩

Let F be a field and

V = F m×n = {A = [aij ]m×n : aij ∈ F } .

We define vector addition and scalar multiplication as follows,


where A = [aij ], B = [bij ] ∈ V and c ∈ F

[A + B]ij = [aij + bij ]

and
[cA]ij = [caij ].

Show that F m×n is a vector space over F .


Note that F n×n (for n ≥ 2) is not a field, since a nonzero matrix need not be invertible.

9
Example 4: The set of all real valued continuous functions
defined on [0, 1]

Let V = {f | f : [0, 1] −→ R and f is continuous on [0, 1]}


We define
+ : V × V −→ V

as (f + g )(s) = f (s) + g (s) for s ∈ [0, 1].

· : R × V −→ V

as (cf )(s) = cf (s) for s ∈ [0, 1]

Show that ⟨V , R, +, .⟩ is a vector space.

10
Example 5: The space of polynomial functions over a field

Assignment

11
Problem 1

Let V = {(x, y ) : x, y ∈ R}. We define

(x1 , y1 ) + (x2 , y2 ) = (x1 + x2 , y1 + y2 )

and
c(x, y ) = (cx, y )

Prove or disprove that ⟨V , R, +, .⟩ is a vector space.

Solution: Verify that axiom (h) fails: in general, (c1 + c2 )α ̸= c1 α + c2 α.

12
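A concrete instance of the failure in Problem 1, assuming NumPy; `smul` is a hypothetical helper name for the nonstandard scalar multiplication c(x, y) = (cx, y):

```python
# Sketch: the scalar multiplication c.(x, y) = (cx, y) violates
# axiom (h): (c1 + c2).alpha != c1.alpha + c2.alpha.
import numpy as np

def smul(c, v):
    x, y = v
    return np.array([c * x, y])   # scales only the first coordinate

alpha = np.array([1.0, 1.0])
c1, c2 = 2.0, 3.0

lhs = smul(c1 + c2, alpha)                 # (5, 1)
rhs = smul(c1, alpha) + smul(c2, alpha)    # (5, 2)
assert not np.allclose(lhs, rhs)           # axiom (h) fails
```

With α = (1, 1) the left side is (5, 1) while the right side is (5, 2): the second coordinate is duplicated instead of scaled, so axiom (h) fails.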
Note 1

Let V be a vector space over a field F . We have

0 = 0 + 0, (additive identity)

c0 = c(0 + 0), c ∈ F

c0 = c0 + c0, (c(α + β) = cα + cβ)

Add −(c0) ∈ V on both sides

c0 + (−(c0)) = (c0 + c0) + (−(c0)),

0 = c0 + (c0 + (−(c0))), (Associativity)

13
Note 1 contd.

0 = c0 + 0, (Existence of inverse)

0 = c0 (additive identity )

c0 = 0 for all c ∈ F .

Qn. Show that 0α = 0 for all α ∈ V , where 0 is the additive


identity in the field F and 0 is the zero vector in the vector
space V .

14
Note 2

0 = 0α, (see last question)


= (1 + (−1)) α
= 1.α + (−1)α ( Reason: (c1 + c2 )α = c1 α + c2 α)
= α + (−1)α (Reason : 1.α = α)
=⇒ additive inverse of α, −α = (−1)α.

15
Note 3

Prove that if cα = 0, then c = 0 or α = 0.


Proof : Suppose that c ̸= 0 (else 0α = 0).
Since 0 ̸= c ∈ F and F is a field, c −1 ∈ F .

cα = 0

=⇒ c −1 (cα) = c −1 0 = 0

=⇒ (c −1 c)α = 0 Reason: (c1 c2 )α = c1 (c2 α)

=⇒ 1.α = 0 Reason : c −1 c = 1

=⇒ α=0 Reason : 1.α = α

16
Linear Combination

Definition
Let V be a vector space over a field F . A vector β ∈ V is said to
be a linear combination of vectors α1 , α2 , . . . , αn in V if there exist
scalars c1 , c2 , . . . , cn in F such that
β = c1 α1 + c2 α2 + . . . + cn αn = ∑_{i=1}^{n} ci αi .

17
Problem 2

Show that (x, y , z) ∈ R3 is a linear combination of vectors


α = (1, 1, 1), β = (0, 1, 1) and γ = (0, 0, 1).
Solution : Find scalars a, b, c ∈ R (if they exist) such that

(x, y , z) = aα + bβ + cγ

(x, y , z) = a(1, 1, 1) + b(0, 1, 1) + c(0, 0, 1)

(x, y , z) = (a, a + b, a + b + c)

(x, y , z) = x(1, 1, 1) + (y − x)(0, 1, 1) + (z − y )(0, 0, 1).

18
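The closed form a = x, b = y − x, c = z − y from Problem 2 can be cross-checked by solving the triangular system numerically. A sketch assuming NumPy; the target vector (2, 5, 9) is an arbitrary illustrative choice:

```python
# Sketch: solve a*alpha + b*beta + c*gamma = (x, y, z)
# by treating alpha, beta, gamma as the columns of a matrix.
import numpy as np

A = np.column_stack([(1, 1, 1), (0, 1, 1), (0, 0, 1)])  # columns alpha, beta, gamma
target = np.array([2.0, 5.0, 9.0])                      # an arbitrary (x, y, z)

coeffs = np.linalg.solve(A, target)
# matches the closed form a = x, b = y - x, c = z - y
assert np.allclose(coeffs, [2.0, 3.0, 4.0])
```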
Problem 3

Prove or disprove that (1, 2, 3) is a linear combination of


α = (1, 1, 1) and β = (0, 1, 1).
Ans. No.

(1, 2, 3) = a(1, 1, 1) + b(0, 1, 1)

=⇒ a + b = 2 and a + b = 3, which is a contradiction.

19
Problem 4

Let R be the real field. Find all vectors in R3 that are linear
combinations of (1, 0, −1), (0, 1, 1) and (1, 1, 1).
Solution: Objective is to find all linear combinations of vectors
(1, 0, −1), (0, 1, 1) and (1, 1, 1).
That is, find all (x, y , z) such that there exist a, b, c ∈ R such that

a(1, 0, −1) + b(0, 1, 1) + c(1, 1, 1) = (x, y , z).

That is, find a, b, c (if they exist) such that

    [  1  0  1 ] [ a ]   [ x ]
    [  0  1  1 ] [ b ] = [ y ]
    [ −1  1  1 ] [ c ]   [ z ]

=⇒ AX = Y
20
Problem 4 contd.

Find a row-reduced echelon matrix which is row-equivalent to A.


       
        [  1  0  1 ]   [ 1  0  1 ]   [ 1  0  1 ]   [ 1  0  0 ]
    A = [  0  1  1 ] ∼ [ 0  1  1 ] ∼ [ 0  1  1 ] ∼ [ 0  1  0 ]
        [ −1  1  1 ]   [ 0  1  2 ]   [ 0  0  1 ]   [ 0  0  1 ]

By Theorem 12, A is invertible (A ∼ I ). By Theorem 13, the system


AX = Y has a solution X for all Y . Hence for every
Y t = (x, y , z) ∈ R3 , there exists X t = (a, b, c) such that

a(1, 0, −1) + b(0, 1, 1) + c(1, 1, 1) = (x, y , z).

21
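The conclusion of Problem 4 can be checked numerically: since the coefficient matrix has full rank, every target (x, y, z) is reachable. A sketch assuming NumPy; the target Y is an arbitrary illustrative choice:

```python
# Sketch: the coefficient matrix is invertible, so every (x, y, z)
# is a linear combination of (1,0,-1), (0,1,1), (1,1,1).
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [-1., 1., 1.]])
assert np.linalg.matrix_rank(A) == 3     # A ~ I, hence invertible

Y = np.array([4., -1., 7.])              # any target vector
X = np.linalg.solve(A, Y)                # coefficients a, b, c
assert np.allclose(A @ X, Y)
```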
Subspace

Definition
Let V be a vector space over a field F . A subspace of V is a subset
W of V which is itself a vector space over F with the operations of
vector addition and scalar multiplication defined on V .

22
Examples of subspaces

Example-1. Let V be a vector space over the field F . Then the


subset {0} of V is a subspace of V and it is called the zero
subspace.

23
Theorem 1

Let V be a vector space over the field F . A non-empty subset W of


V is a subspace of V if and only if

∀ α, β ∈ W , c ∈ F =⇒ cα + β ∈ W .

Proof:
Case 1: Suppose that W is a subspace of V . So, W is a vector
space over the field F . Thus, if c ∈ F and α ∈ W , then cα ∈ W
(closed under scalar multiplication). Again, as W is closed under
vector addition, for any vector β ∈ W one has cα + β ∈ W . Hence

∀ α, β ∈ W , c ∈ F =⇒ cα + β ∈ W .

24
Theorem 1 contd.

Case 2: Suppose that W is a non-empty subset of V with the


property that

∀ α, β ∈ W , c ∈ F =⇒ cα + β ∈ W . − − − −(1)

Since W ̸= ϕ, there exists γ ∈ W and hence 0 = (−1)γ + γ ∈ W ,


by equation (1).
For all α, β ∈ W , α + β = 1 · α + β ∈ W by equation (1).
For all α ∈ W , cα = cα + 0 ∈ W by equation (1).
Furthermore, −α = (−1)α + 0 ∈ W for all α ∈ W by (1).
Since W ⊆ V , ⟨W , F , +, .⟩ satisfies the rest of the axioms (verify!)
of a vector space and thus W is a subspace of V .

25
Examples of subspaces

Example-2. Note that W = {(0, x2 . . . , xn ) : xi ∈ F } is a subspace


of F n .
Proof: Clearly 0 = (0, 0, . . . , 0) ∈ W . So ϕ ̸= W ⊆ F n . Let
α = (0, x2 , . . . , xn ), β = (0, y2 , . . . , yn ) ∈ W and c ∈ F .

cα + β = (0, cx2 + y2 , . . . , cxn + yn ) ∈ W

By Theorem 1, W is a subspace of F n .

Example-3. Prove that W = {(1 + x2 , x2 , x3 , . . . , xn ) : xi ∈ F } is


not a subspace of F n .
Reason: 0 = (0, 0, . . . , 0) ∉ W .

26
Examples contd.

Example-4. Prove that the solution set of the homogeneous system


AX = 0 is a subspace of F n×1 where A ∈ F m×n .

Let S = {X ∈ F n×1 : AX = 0}. Clearly 0 ∈ S ̸= ϕ.


Let X1 , X2 ∈ S. Then AX1 = AX2 = 0.

=⇒ A(cX1 + X2 ) = cAX1 + AX2 = 0,

This implies

∀ X1 , X2 ∈ S, c ∈ F =⇒ cX1 + X2 ∈ S.

Hence, by Theorem 1, S is a subspace of F n×1 .

27
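The closure property of Example-4 (the criterion of Theorem 1) can be illustrated numerically. A sketch assuming NumPy; the matrix A and the particular solutions X1, X2 are illustrative choices:

```python
# Sketch: the solution set of AX = 0 is closed under the operation
# (X1, X2, c) -> c*X1 + X2 from Theorem 1.
import numpy as np

A = np.array([[1., 2., -1.],
              [2., 4., -2.]])           # rank-1 matrix, nontrivial null space
X1 = np.array([1., 0., 1.])             # A @ X1 = 0
X2 = np.array([-2., 1., 0.])            # A @ X2 = 0
assert np.allclose(A @ X1, 0) and np.allclose(A @ X2, 0)

c = 3.5
assert np.allclose(A @ (c * X1 + X2), 0)   # c*X1 + X2 is again a solution
```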
Theorem 2

Let V be a vector space over the field F . Let W1 , W2 be two


subspaces of V . Then W1 ∩ W2 is a subspace of V .
Proof: Since W1 and W2 are subspaces of V , they are themselves
vector spaces, so 0 ∈ W1 and 0 ∈ W2 . Thus,
0 ∈ W1 ∩ W2 , which implies W1 ∩ W2 ̸= ∅.
Now, let α, β ∈ W1 ∩ W2 and c ∈ F . This implies α, β ∈ W1 and
α, β ∈ W2 . Since W1 and W2 are subspaces of V , by Theorem 1,
we have cα + β ∈ W1 and cα + β ∈ W2 . This implies
cα + β ∈ W1 ∩ W2 . Hence, by Theorem 1, W1 ∩ W2 is a subspace
of V .
Corollary: Intersection of any collection of subspaces of a vector
space V is a subspace of V .

28
The subspace spanned by S

Definition: Let S be a subset of a vector space V . The subspace


spanned by S is defined as the intersection of all subspaces of V which
contain S:

Subspace spanned by S = ⋂ {W : S ⊆ W , W is a subspace of V }.

Note-1. Subspace spanned by S is the smallest subspace which


contains S.
Note-2. If S = {α1 , α2 , . . . , αn }, we call the subspace spanned by
S as the subspace spanned by the vectors α1 , α2 , . . . , αn .

29
L(S)= the set of all linear combinations of vectors in S

Let S be a non-empty subset of a vector space V . The set of all


linear combinations of vectors in S is denoted by L(S). In other
words
L(S) = { ∑_{i=1}^{n} ci αi : ci ∈ F , αi ∈ S, n ∈ N } .

Example. Let S = {(1, 0, 0), (0, 0, 1)}. Then

L(S) = {a(1, 0, 0) + b(0, 0, 1) : a, b ∈ R}


= {(a, 0, b) : a, b ∈ R} .

30
Lemma

S ⊆ L(S) and L(S) is a subspace of V .

Proof: Since S ̸= ϕ, write S = {α1 , α2 , . . . , αn } (for an infinite S
the same argument applies to any finite subset). Observe that
α1 = 1 · α1 + 0 · α2 + · · · + 0 · αn . So, α1 ∈ L(S). Similarly, each
αi ∈ L(S). Therefore, S ⊆ L(S) and hence, L(S) is
nonempty.
To show L(S) is a subspace of V , by Theorem 1, it is equivalent to
check that for all α, β ∈ L(S) and c ∈ F , cα + β ∈ L(S).
Since α, β ∈ L(S), we have α = ∑_{i=1}^{n} ci αi and β = ∑_{i=1}^{n} di αi .

Thus, cα + β = ∑_{i=1}^{n} (cci )αi + ∑_{i=1}^{n} di αi = ∑_{i=1}^{n} (cci + di )αi is a

linear combination of vectors in S. Thus cα + β ∈ L(S). Hence,


L(S) is a subspace of V .

31
Theorem 3

Let S be a non-empty subset of a vector space V over the field F .


Then the subspace spanned by the set S is the set of all linear
combinations of vectors in S.
Proof.
Let
W ∗ = ⋂ {W : S ⊆ W , W is a subspace of V }

We want to show that

W ∗ = L(S) − − − (a)

By the previous lemma, S ⊆ L(S) and L(S) is a subspace of V , and


thus
W ∗ ⊆ L(S) − − − −(i).

32
Theorem 3 contd.

It remains to show that L(S) ⊂ W ∗ . It is enough to show that, if


W is a subspace containing S, then L(S) ⊆ W .
Let x ∈ L(S). Then x is a linear combination of vectors in S. Since
W is a subspace of V and S ⊆ W , every linear combination of
vectors in S is also a member of W and thus x ∈ W .

x ∈ L(S) =⇒ x ∈ W . Thus L(S) ⊆ W .

Hence we proved that


L(S) ⊆ ⋂ {W : S ⊆ W , W is a subspace of V } = W ∗ − −(ii).

By (i) and (ii),


W ∗ = ⋂ {W : S ⊆ W , W is a subspace of V } = L(S) − − − (a).
33
Note 1: ( Visit previous lecture notes)

Find the solution space of the system RX = 0


 
        [ 0  1  −3  0  1/2 ]       x2 − 3x3 + (1/2)x5 = 0
    R = [ 0  0   0  1   2  ]  =⇒
        [ 0  0   0  0   0  ]       x4 + 2x5 = 0

No. of non-zero rows of R, r = 2, No. of variables, n = 5.


k1 = 2, k2 = 4 =⇒ Pivot variables = {xk1 , xk2 } = {x2 , x4 }.
No. of free variables = n − r = 5 − 2 = 3.
Free variables = {x1 , x3 , x5 }.
Set the free variables as: x1 = a, x3 = b, x5 = c
=⇒ x2 = 3b − (1/2)c, x4 = −2c.

34
Note 1 contd. (back to chapter one !)

Solution set S = { (a, 3b − (1/2)c, b, −2c, c) : a, b, c ∈ R }

S = { a(1, 0, 0, 0, 0) + b(0, 3, 1, 0, 0) + c(0, −1/2, 0, −2, 1) : a, b, c ∈ R }

  = Span of { (1, 0, 0, 0, 0), (0, 3, 1, 0, 0), (0, −1/2, 0, −2, 1) } .

35
Problem

Let W be set of all (x1 , x2 , x3 , x4 , x5 ) ∈ R5 which satisfies

2x1 − x2 + (4/3)x3 − x4 = 0
x1 + (2/3)x3 − x5 = 0
9x1 − 3x2 + 6x3 − 3x4 − 3x5 = 0

Find a finite set of vectors which spans W .

36
Row space and Column space of a matrix

Let A ∈ F m×n with rows {R1 , R2 , . . . , Rm } and columns


{C1 , C2 , . . . , Cn }. Then

Row space of A = The subspace spanned by R1 , R2 , . . . , Rm .

Column space of A = The subspace spanned by C1 , C2 , . . . , Cn .

Note: Row space of A ⊆ F 1×n and Column space of


A ⊆ F m×1 .

37
Example

Let
    A = [ 1  0  0 ]
        [ 0  1  0 ]

where the rows are R1 = (1, 0, 0), R2 = (0, 1, 0) and the columns are
C1 = (1, 0)t , C2 = (0, 1)t , C3 = (0, 0)t .

Row space of A

= {a(1, 0, 0) + b(0, 1, 0) : a, b ∈ F } = {(a, b, 0) : a, b ∈ F } .

Column space of A

= {aC1 + bC2 + cC3 : a, b, c ∈ F } = {(a, b)t : a, b ∈ F } = F 2×1 .

38
Observations

(i) Column space of AB is contained in the column space of A.

(ii) Row space of AB is contained in the row space of B.

39
Observations cont...


        [ a11  a12  a13 ] [ x ]
   AX = [ a21  a22  a23 ] [ y ]
                          [ z ]

      = [ a11 x + a12 y + a13 z ]
        [ a21 x + a22 y + a23 z ]

      = x (a11 , a21 )t + y (a12 , a22 )t + z (a13 , a23 )t

      = xC1 + yC2 + zC3   (Ci is the ith column of A).

(1) AX is a linear combination of columns of the matrix A.


(2) Every column of AB is a linear combination of columns of A.
(3) Every row of AB is a linear combination of rows of B.
40
Linearly Dependent (L.D.) and Linearly Independent (L.I.)

Definition:
Let V be a vector space over the field F . A subset
S = {α1 , α2 , . . . , αn } of V is said to be linearly dependent if there
exist scalars c1 , c2 , . . . , cn ∈ F , ci ̸= 0 for at least one i, such that

c1 α1 + c2 α2 + . . . + cn αn = 0.

Definition:
A set which is not linearly dependent is called a linearly independent
set.
In other words, if c1 α1 + c2 α2 + . . . + cn αn = 0, then
c1 = c2 = · · · = cn = 0.

41
Note

1. Any set which contains a linearly dependent set is linearly


dependent.
2. Any subset of a linearly independent set is linearly
independent.
3. Any set which contains the 0 vector is linearly dependent.
Reason: the relation 1 · 0 = 0 has a nonzero coefficient.

42
Problem 1

Show that α1 = (3, 0, −3), α2 = (−1, 1, 2), α3 = (4, 2, −2) and


α4 = (2, 1, 1) are linearly dependent (L.D.) in R3 .

Solution: Find scalars c1 , c2 , c3 , c4 (at least one ci ̸= 0) such that


c1 α1 + c2 α2 + c3 α3 + c4 α4 = 0.

2α1 + 2α2 − α3 + 0α4 = 0.

43
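Both the general fact (four vectors in R3 must be dependent) and the specific relation 2α1 + 2α2 − α3 + 0α4 = 0 can be checked numerically, assuming NumPy:

```python
# Sketch: four vectors in R^3 are always dependent; the rank of the
# matrix with these vectors as rows confirms it, and the specific
# relation 2*a1 + 2*a2 - a3 = 0 can be checked directly.
import numpy as np

a1 = np.array([3., 0., -3.])
a2 = np.array([-1., 1., 2.])
a3 = np.array([4., 2., -2.])
a4 = np.array([2., 1., 1.])

M = np.vstack([a1, a2, a3, a4])
assert np.linalg.matrix_rank(M) < 4          # fewer than 4 independent rows => L.D.
assert np.allclose(2*a1 + 2*a2 - a3, 0)      # the dependence relation above
```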
Problem 2

Show that e1 = (1, 0, 0), e2 = (0, 1, 0) and e3 = (0, 0, 1) form a


linearly independent subset of F 3 .
Solution: Consider c1 e1 + c2 e2 + c3 e3 = 0

=⇒ c1 (1, 0, 0) + c2 (0, 1, 0) + c3 (0, 0, 1) = (0, 0, 0)

=⇒ (c1 , c2 , c3 ) = (0, 0, 0)

=⇒ c1 = c2 = c3 = 0.

Hence {e1 , e2 , e3 } is a L.I . subset of F 3 .

44
Note

Note:
{e1 = (1, 0, 0, . . . , 0, 0), e2 = (0, 1, 0, . . . , 0, 0) . . . , en = (0, 0, 0, . . . , 0, 1)}
is a linearly independent subset of F n .

45
Applications

(i) The columns of A form a linearly independent set if and


only if AX = 0 has only the trivial solution.

(ii) A is invertible if and only if the columns of A


form a linearly independent set (by note (i) and Theorem 13,
Chapter 1).

46
Basis

Definition:
Let V be a vector space over the field F . A non-empty set B ⊆ V
is a basis for V if

1. B is a linearly independent subset of V and


2. V = span B (= L(B) ).

Definition:
A vector space V is called finite dimensional if it has a finite basis.

47
Example-1

The set B =
{e1 = (1, 0, 0, . . . , 0, 0), e2 = (0, 1, 0, . . . , 0, 0) . . . , en = (0, 0, 0, . . . , 0, 1)}
is a basis of Rn .
Verification:
Claim 1: B is a linearly independent set in Rn .
Consider

c1 (1, 0, . . . , 0)+c2 (0, 1, 0, . . . , 0)+· · ·+cn (0, 0, . . . , 0, 1) = (0, 0, . . . , 0)

=⇒ (c1 , c2 , . . . , cn ) = (0, 0, . . . , 0)

=⇒ c1 = c2 = · · · = cn = 0

=⇒ B is a L.I . set.

48
Example-1 contd.

Claim 2 : Rn = span B. That is every vector of Rn can be written


as a linear combination of vectors of B.
Let x = (x1 , x2 , . . . , xn ) be any arbitrary vector in Rn . As

(x1 , x2 , . . . , xn ) = x1 (1, 0, . . . , 0)+x2 (0, 1, . . . , 0)+· · ·+xn (0, . . . , 0, 1)

=⇒ (x1 , x2 , . . . , xn ) = x1 e1 + x2 e2 + . . . + xn en

we get that
span B = Rn .

Hence, by claims 1 and 2, B is a basis of Rn .

Note: B = {e1 , e2 , . . . , en } is called the standard basis of Rn .

49
Example-2

The set B =
{e1 = (1, 0, 0, . . . , 0, 0), e2 = (0, 1, 0, . . . , 0, 0) . . . , en = (0, 0, 0, . . . , 0, 1)}
is a basis of the vector space F n , where F is a field.

50
Example-3

The set B = {(0, 1, 1), (1, 0, 1), (1, 1, 0)} is a basis for R3 .
Verification:
Claim 1: B is a linearly independent set in R3 .
Consider

c1 (0, 1, 1) + c2 (1, 0, 1) + c3 (1, 1, 0) = (0, 0, 0)

=⇒ (0, c1 , c1 ) + (c2 , 0, c2 ) + (c3 , c3 , 0) = (0, 0, 0)

=⇒ (c2 + c3 , c1 + c3 , c1 + c2 ) = (0, 0, 0)

=⇒ c2 + c3 = 0, c1 + c3 = 0, c1 + c2 = 0

On solving these three equations we obtain


c1 = 0, c2 = 0 and c3 = 0. Thus, B is a linearly independent set
of R3 .
51
Example-3 contd.

Claim 2: R3 = span B. That is, we have to show that every vector


of R3 is a linear combination of vectors of B.
Let (x, y , z) be an arbitrary vector of R3 . We want to find scalars
c1 , c2 , c3 such that

c1 (0, 1, 1) + c2 (1, 0, 1) + c3 (1, 1, 0) = (x, y , z)

=⇒ (0, c1 , c1 ) + (c2 , 0, c2 ) + (c3 , c3 , 0) = (x, y , z)

=⇒ (c2 + c3 , c1 + c3 , c1 + c2 ) = (x, y , z)

=⇒ c2 + c3 = x, c1 + c3 = y , c1 + c2 = z

52
Example-3 contd.

Upon solving these three equations we obtain
c1 = (−x + y + z)/2 , c2 = (x − y + z)/2 and c3 = (x + y − z)/2 . Thus,

(x, y , z) = ((−x + y + z)/2)(0, 1, 1) + ((x − y + z)/2)(1, 0, 1) + ((x + y − z)/2)(1, 1, 0).
This implies
span B = R3 .

Hence, B is a basis of R3 .

53
Problem-3:

Find a basis of the solution space of the system RX = 0, where


 
        [ 0  1  −3  0  1/2 ]
    R = [ 0  0   0  1   2  ] .
        [ 0  0   0  0   0  ]

Solution.
No. of non-zero rows of R, r = 2, No. of variables, n = 5.
k1 = 2, k2 = 4 =⇒ Pivot variables = {xk1 , xk2 } = {x2 , x4 }.
No. of free variables = n − r = 5 − 2 = 3.
Free variables = {x1 , x3 , x5 }.
Set the free variables as: x1 = a, x3 = b, x5 = c
=⇒ x2 = 3b − (1/2)c, x4 = −2c.

54
Problem-3 contd...

Solution set S = { (a, 3b − (1/2)c, b, −2c, c) : a, b, c ∈ R }

S = { a(1, 0, 0, 0, 0) + b(0, 3, 1, 0, 0) + c(0, −1/2, 0, −2, 1) : a, b, c ∈ R }

  = Span of { (1, 0, 0, 0, 0), (0, 3, 1, 0, 0), (0, −1/2, 0, −2, 1) } .

Let
E1 = (1, 0, 0, 0, 0), E3 = (0, 3, 1, 0, 0) and E5 = (0, −1/2, 0, −2, 1).

55
Problem-3 contd...

We prove that {E1 , E3 , E5 } is a linearly independent set.
Let

c1 (1, 0, 0, 0, 0) + c2 (0, 3, 1, 0, 0) + c3 (0, −1/2, 0, −2, 1) = (0, 0, 0, 0, 0)

⇒ (c1 , 3c2 − (1/2)c3 , c2 , −2c3 , c3 ) = (0, 0, 0, 0, 0)

This implies c1 = 0, c2 = 0 and c3 = 0. Thus, {E1 , E3 , E5 } is a
linearly independent set.
Hence {E1 , E3 , E5 } is a basis of the solution space S.

56
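The basis {E1 , E3 , E5 } found in Problem-3 can be verified numerically: each vector solves RX = 0, the three are independent, and rank–nullity gives the dimension. A sketch assuming NumPy:

```python
# Sketch: verify that E1, E3, E5 solve RX = 0 and are linearly
# independent, hence a basis of the solution space (dim = 5 - 2 = 3).
import numpy as np

R = np.array([[0., 1., -3., 0., 0.5],
              [0., 0., 0., 1., 2.],
              [0., 0., 0., 0., 0.]])

E1 = np.array([1., 0., 0., 0., 0.])
E3 = np.array([0., 3., 1., 0., 0.])
E5 = np.array([0., -0.5, 0., -2., 1.])

for v in (E1, E3, E5):
    assert np.allclose(R @ v, 0)            # each lies in the solution space

M = np.vstack([E1, E3, E5])
assert np.linalg.matrix_rank(M) == 3        # linearly independent
assert 5 - np.linalg.matrix_rank(R) == 3    # dim = n - r, consistent with above
```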
Problem 4 (assignment)

Find a basis for the solution set S of the system of equations

2x1 − x2 + (4/3)x3 − x4 = 0
x1 + (2/3)x3 − x5 = 0
9x1 − 3x2 + 6x3 − 3x4 − 3x5 = 0 .

57
Problem-5

Find a basis for the vector space F m×n of all m × n matrices over
the field F .
Solution: Define the matrices Ei,j in the following way:
(i) the (i, j)th entry of the matrix Ei,j is one;
(ii) all other entries are zero.
Verify that the set B = {Eij : i = 1, 2, . . . , m; j = 1, 2, . . . , n} is a
basis for F m×n .

This is an extension of the idea of the standard basis of F n .

58
The column vectors of an invertible matrix form a basis

Let P ∈ F n×n be an invertible matrix. Let P1 , P2 , . . . , Pn be the


columns of P. Show that B = {P1 , P2 , . . . , Pn } is a basis of F n×1 .
Claim 1: B is a L.I. set.
Consider x1 P1 + x2 P2 + . . . + xn Pn = 0.

=⇒ [ P1 P2 . . . Pn ] (x1 , x2 , . . . , xn )t = 0

=⇒ PX = 0

=⇒ X = 0, ( P is an invertible matrix)

59
Contd.

=⇒ x1 = x2 = . . . = xn = 0 =⇒ B is an L.I. set

Claim 2: F n×1 = Span B. That is, every column vector can be


expressed as a linear combination of column vectors of matrix P.
Let Y ∈ F n×1 be any column vector. Consider the matrix
multiplication P −1 Y , denote this by X . So, Y = PX . As the
matrix PX can be written as a linear combination of column vectors
of P, we have

Y = PX = x1 P1 + x2 P2 + . . . + xn Pn ∈ L({P1 , P2 , . . . , Pn })

Thus, F n×1 = Span B.


Hence, by claims 1 and 2, B (the set of all columns of P) is a basis
of F n×1 . 60
A simple way of checking whether a set of vectors in F n is a basis

Idea: Given any set B = {α1 , α2 , . . . , αn } of n vectors in F n ,


consider the matrix A whose column vectors are precisely
α1 , α2 , . . . , αn . If A is invertible, then by the above
observation the set B = {α1 , α2 , . . . , αn } is a basis for F n .

61
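This idea is easy to mechanize: put the candidate vectors as columns of a matrix and test for full rank (equivalently, invertibility). A sketch assuming NumPy; `is_basis` is a hypothetical helper name:

```python
# Sketch: n vectors form a basis of F^n (here R^n) iff the matrix
# with those vectors as columns is invertible, i.e. has full rank.
import numpy as np

def is_basis(vectors):
    A = np.column_stack(vectors)
    n = A.shape[0]
    return A.shape[1] == n and np.linalg.matrix_rank(A) == n

assert is_basis([(0, 1, 1), (1, 0, 1), (1, 1, 0)])      # the Example-3 basis
assert not is_basis([(1, 1, 1), (2, 2, 2), (0, 0, 1)])  # a dependent set
```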
Theorem 4

Let V be a vector space which is spanned by a finite set of vectors


β1 , β2 , . . . , βm . Then any linearly independent set of vectors in V
contains no more than m elements.
Proof: It suffices to show that every subset of V which contains
more than m elements is linearly dependent (L.D.). Let S be such a
set. Let S = {α1 , α2 , . . . , αn }, where n > m.
Since span {β1 , β2 , . . . , βm } = V , we have
α1 = A11 β1 + A21 β2 + · · · + Am1 βm
α2 = A12 β1 + A22 β2 + · · · + Am2 βm
..
.
αn = A1n β1 + A2n β2 + · · · + Amn βm .

62
Theorem 4 contd.

For each j = 1, 2, . . . , n, we have


αj = A1j β1 + A2j β2 + · · · + Amj βm = ∑_{i=1}^{m} Aij βi .
We want to show that S is L.D. That means we are searching for
some x1 , x2 , . . . , xn , at least one of them is not zero, such that

x1 α1 + x2 α2 + . . . + xn αn = 0 − − − − − −(i)

63
Theorem 4 contd.

Consider
x1 α1 + x2 α2 + . . . + xn αn

= ∑_{j=1}^{n} xj αj

= ∑_{j=1}^{n} xj ( ∑_{i=1}^{m} Aij βi )

= ∑_{i=1}^{m} ( ∑_{j=1}^{n} Aij xj ) βi   (this is just a rearrangement of terms)

= ( ∑_{j=1}^{n} A1j xj ) β1 + ( ∑_{j=1}^{n} A2j xj ) β2 + · · · + ( ∑_{j=1}^{n} Amj xj ) βm .

64
Theorem 4 contd.

Consider the system of equations

∑_{j=1}^{n} A1j xj = 0
∑_{j=1}^{n} A2j xj = 0
. . .
∑_{j=1}^{n} Amj xj = 0

This is a homogeneous linear system with m equations and n


variables.
65
Theorem 4 contd.

Since m < n, this system has a non-trivial solution, say


x1∗ , x2∗ , . . . , xn∗ (at least one xj∗ ̸= 0), that is,

∑_{j=1}^{n} Aij xj∗ = 0, for each i = 1, 2, . . . , m − − − (ii).

Therefore we have found x1∗ , x2∗ , . . . , xn∗ (at least one xj∗ ̸= 0) such that

x1∗ α1 + x2∗ α2 + . . . + xn∗ αn = ∑_{i=1}^{m} ( ∑_{j=1}^{n} Aij xj∗ ) βi = 0 (by (ii))

Hence the set S = {α1 , α2 , . . . , αn } is a L.D. set.


This completes the proof.

66
Corollaries to Theorem 4

Corollary 1. If V is a finite dimensional vector space, then any two


bases of V have the same number of elements.
Proof: Let B1 = {β1 , β2 , . . . , βm } and B2 = {α1 , α2 , . . . , αn } be
two bases of V . Then by definition
(i) B1 is L.I. and span B1 = V .
(ii) B2 is L.I. and span B2 = V .
Since span B1 = V and B2 is an L.I. set, n ≤ m − − − (a) (by
Theorem 4).
Similarly, since span B2 = V and B1 is an L.I. set, m ≤ n − − − (b)
(by Theorem 4).
Hence, by (a) and (b), m = n.
This completes the proof of the corollary.
67
Dimension

Definition:
The dimension of a finite-dimensional vector space V is the number
of elements in a basis for V .
The dimension of the zero vector space is zero.
If V is not a finite dimensional vector space we say it is an infinite
dimensional vector space.

68
Examples

Example-1.
As B = {e1 = (1, 0, 0, . . . , 0, 0), e2 = (0, 1, 0, . . . , 0, 0), . . . ,
en = (0, 0, 0, . . . , 0, 1)} is a basis of F n over F ,

dimension of F n = dim(F n ) = n.

Example-2. The dimension of F m×n = mn,


as the set {Eij : i = 1, 2, . . . , m; j = 1, 2, . . . , n} (see previous
lecture) is a basis of F m×n .

69
Examples

Example-3. Let AX = 0 be a system of homogeneous equations,


where A is an m × n matrix, X is an n × 1 column vector, and 0 is
the m × 1 zero vector. Let R be the row-reduced echelon form of
matrix A and r be the number of non-zero rows of R. Then the
solution space of the homogeneous system of linear
equations AX = 0 has dimension n − r (the number of free
variables).

70
Examples

For example consider the system of equations RX = 0, where


 
        [ 0  1  −3  0  1/2 ]
    R = [ 0  0   0  1   2  ] .
        [ 0  0   0  0   0  ]

Solution space

S = Span of { (1, 0, 0, 0, 0), (0, 3, 1, 0, 0), (0, −1/2, 0, −2, 1) } .

We have also verified that the set


{(1, 0, 0, 0, 0), (0, 3, 1, 0, 0), (0, −1/2, 0, −2, 1)} is a basis of S.


Hence, the dimension of the solution space S is 3 = 5 − 2.


71
Examples of infinite dimensional vector spaces

Example-4. Let P be the set of all polynomials with real coefficients.


Verify that the set {1, x, x 2 , x 3 , . . . , x n , . . .} is a basis for P. Since
{1, x, x 2 , x 3 , . . . , x n , . . .} is an infinite set, the vector space P is an
infinite dimensional vector space.
Example-5. Let C [a, b] be the set of all continuous functions from
[a, b] to R. Then C [a, b] is an infinite dimensional vector space.

72
Corollaries to Theorem 4

Corollary 2. Let V be a finite dimensional vector space and let n =


dim V . Then
1. any subset of V which contains more than n vectors is L.D.;
2. no subset of V which contains fewer than n vectors can span
V.
Proof. Let B = {β1 , β2 , . . . , βn } be a basis of V . Then
(i) B is L.I. and
(ii) span B = V .
Proof of (1). Obvious from Theorem 4.
Proof of (2). Suppose that V = span {γ1 , γ2 , . . . , γp }. Since B is a
linearly independent set and span B = V , by Theorem 4 we get
n ≤ p. That means, any set of vectors which spans V contains at
least n vectors. This proves (2).
73
Lemma

Let S be a linearly independent subset of a vector space V .


Suppose that there is a vector β ∈ V − L(S). Then S ∪ {β} is
an L.I. subset of V .
Proof. Let S = {α1 , α2 , . . . , αn }.
Claim. The set {α1 , α2 , . . . , αn , β} is an L.I. set.
Consider

c1 α1 + c2 α2 + · · · + cn αn + bβ = 0 − − − − − (i)

Then b = 0; otherwise β = −(c1 /b) α1 − (c2 /b) α2 − · · · − (cn /b) αn ∈ L(S), a
contradiction.

74
Proof continued

From equation (i) we have

c1 α1 + c2 α2 + · · · + cn αn = 0.

Since S is an L.I. set, we must have c1 = c2 = · · · = cn = 0. As we


already have b = 0, the set {α1 , α2 , . . . , αn , β} is an L.I. subset
of V .

75
Theorem 5

In a finite-dimensional vector space V every non-empty linearly


independent set of vectors is part of a basis.
Proof: Let S0 be a linearly independent subset of V .
Claim: S0 is a part of a basis for V .
We extend S0 to a basis for V in the following way.
If span S0 = V , then we are done, as S0 is L.I. and span S0 = V ,
hence a basis for V .
If not, that is span S0 ̸= V , then there is an element (say)
β1 ∈ V − span S0 . Let S1 = S0 ∪ {β1 }. By the previous lemma S1
is an L.I. subset of V .

76
Theorem 5 contd

If span S1 = V , then we are done, as S1 is L.I. and span S1 = V ,


hence a basis for V .
If not, apply the previous lemma to obtain a β2 ∈ V − span (S1 )
such that S2 = S1 ∪ {β2 } is an L.I. set.
If we continue in this way, then (in not more than dim V steps) we
reach an L.I. set

Sm = S0 ∪ {β1 , β2 , . . . , βm }

which is a basis for V .

77
Example

Let S0 = {(1, 1, 1)}. Find a basis for R3 which contains S0 .


Solution :

L(S0 ) = {a(1, 1, 1) : a ∈ R} = {(a, a, a) : a ∈ R}

Clearly, β1 = (1, 1, 0) ∉ L(S0 ). By the lemma above,
S1 = S0 ∪ {β1 } = {(1, 1, 1), (1, 1, 0)} is a L.I. subset of R3 .

L(S1 ) = {a(1, 1, 1) + b(1, 1, 0) = (a + b, a + b, a) : a, b ∈ R}

Clearly β2 = (1, 0, 0) ∉ L(S1 ). By the lemma above,
S2 = S1 ∪ {β2 } = {(1, 1, 1), (1, 1, 0), (1, 0, 0)} is a L.I. set. Verify
that L(S2 ) = R3 . Hence, S2 is a basis for R3 .
78
78
Ordered Basis

Definition:
Let V be a finite-dimensional vector space. An ordered basis for V
is a finite sequence of vectors which is linearly independent and
spans V .

Note: Let B = {α1 , α2 , . . . , αn } be an ordered basis for V . Let


α ∈ V = span {α1 , α2 , . . . , αn }. Then there exist some scalars
x1 , x2 , . . . , xn such that

α = x1 α1 + x2 α2 + · · · + xn αn − − − −(1)

This representation is unique.

79
If not, then there exist scalars y1 , y2 , . . . , yn ∈ F such that

α = y1 α1 + y2 α2 + · · · + yn αn − − − −(2)

From (1) and (2), we get

x1 α1 + x2 α2 + · · · + xn αn = α = y1 α1 + y2 α2 + · · · + yn αn .

This implies

(x1 − y1 )α1 + (x2 − y2 )α2 + · · · + (xn − yn )αn = 0.

Since B is linearly independent, we have

x1 − y1 = x2 − y2 = · · · = xn − yn = 0.

Thus, x1 = y1 , x2 = y2 , . . . , xn = yn . Hence the representation is


unique. 80
Coordinate Matrix

Definition:
The coordinate matrix of the vector α relative to the ordered basis
B is

         [ x1 ]
[α]B =   [ x2 ]
         [  ⋮ ]
         [ xn ]

81
Example

Example: Find the coordinate matrices of the vector α = (1, 2, 3)


with respect to the bases
B1 = {ϵ1 = (1, 0, 0), ϵ2 = (0, 1, 0), ϵ3 = (0, 0, 1)} and
B2 = {α1 = (1, 1, 1), α2 = (0, 1, 1), α3 = (0, 0, 1)}.
Solution: Note that

α = (1, 2, 3) = 1 · ϵ1 + 2 · ϵ2 + 3 · ϵ3

and
α = (1, 2, 3) = 1 · α1 + 1 · α2 + 1 · α3 .

Therefore

[α]B1 = (1, 2, 3)t and [α]B2 = (1, 1, 1)t .
82
Relation between [α]B1 and [α]B2 ?

Note that
α1 = ϵ1 + ϵ2 + ϵ3 ,

α2 = 0ϵ1 + ϵ2 + ϵ3

and
α3 = 0ϵ1 + 0ϵ2 + ϵ3 .

So, we have

P1 = [α1 ]B1 = (1, 1, 1)t , P2 = [α2 ]B1 = (0, 1, 1)t and P3 = [α3 ]B1 = (0, 0, 1)t .

83
Relation between [α]B1 and [α]B2 ?

Let
                          [ 1  0  0 ]
    P = [P1 , P2 , P3 ] = [ 1  1  0 ]
                          [ 1  1  1 ]

Then
                [ 1  0  0 ] [ 1 ]   [ 1 ]
    P · [α]B2 = [ 1  1  0 ] [ 1 ] = [ 2 ] = [α]B1 .
                [ 1  1  1 ] [ 1 ]   [ 3 ]

Verify that the matrix P is invertible and [α]B2 = P −1 [α]B1 .

84
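The change-of-basis relation of this example can be verified numerically, assuming NumPy; the coordinate vectors are those computed above for α = (1, 2, 3):

```python
# Sketch: verify [alpha]_B1 = P [alpha]_B2 and [alpha]_B2 = P^{-1} [alpha]_B1.
import numpy as np

P = np.array([[1., 0., 0.],
              [1., 1., 0.],
              [1., 1., 1.]])          # columns are [alpha_j]_B1

x_B1 = np.array([1., 2., 3.])         # coordinates of (1, 2, 3) in the standard basis
x_B2 = np.array([1., 1., 1.])         # coordinates in B2

assert np.allclose(P @ x_B2, x_B1)
assert np.allclose(np.linalg.solve(P, x_B1), x_B2)   # inverse direction
```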
Relation between [α]B1 and [α]B2 ?

This phenomenon also holds in an arbitrary finite dimensional


vector space V .

Proof: Let B1 = {α1 , α2 , . . . , αn } and B2 = {β1 , β2 , . . . , βn } be


two ordered bases of a finite-dimensional vector space V . Let
α ∈ V . Let

α = x1 α1 + x2 α2 + · · · + xn αn and α = y1 β1 + y2 β2 + · · · + yn βn

be the unique representations of α with respect to the bases B1 and


B2 , respectively. Thus,

[α]B1 = [x1 , x2 , . . . , xn ]^T and [α]B2 = [y1 , y2 , . . . , yn ]^T .
contd.

Since βj ∈ V = span {α1 , α2 , . . . , αn } for each j = 1, 2, . . . , n,


there exist unique scalars P1j , P2j , . . . , Pnj such that

βj = P1j α1 + P2j α2 + · · · + Pnj αn .

That is
β1 = P11 α1 + P21 α2 + · · · + Pn1 αn

β2 = P12 α1 + P22 α2 + · · · + Pn2 αn


..
.

βn = P1n α1 + P2n α2 + · · · + Pnn αn .

contd.

Now,
α = y1 β1 + y2 β2 + · · · + yn βn
= y1 (P11 α1 + P21 α2 + · · · + Pn1 αn )
+ y2 (P12 α1 + P22 α2 + · · · + Pn2 αn )
..
.
+ yn (P1n α1 + P2n α2 + · · · + Pnn αn )
= (P11 y1 + P12 y2 + · · · + P1n yn )α1
+ (P21 y1 + P22 y2 + · · · + P2n yn )α2
..
.
+ (Pn1 y1 + Pn2 y2 + · · · + Pnn yn )αn .

contd.

But, we already have α = x1 α1 + x2 α2 + · · · + xn αn , and since the


representation of α relative to the basis B1 is unique, we have

x1 = P11 y1 + P12 y2 + · · · + P1n yn

x2 = P21 y1 + P22 y2 + · · · + P2n yn


..
.

xn = Pn1 y1 + Pn2 y2 + · · · + Pnn yn .

contd.

Writing these equations in matrix form we get

    [ x1 ]   [ P11 P12 · · · P1n ] [ y1 ]
    [ x2 ]   [ P21 P22 · · · P2n ] [ y2 ]
    [ .. ] = [  ..  ..        ..  ] [ .. ]
    [ xn ]   [ Pn1 Pn2 · · · Pnn ] [ yn ]

=⇒ X = PY , − − − − −(3)

where X = [α]B1 and Y = [α]B2 .

contd.

Next, we claim that the matrix P is invertible, which we show in the
following two steps:
Claim (1): X = 0 ⇐⇒ Y = 0.
Proof: Let X = 0. This implies x1 = x2 = · · · = xn = 0, which
further implies α = 0.
As α = y1 β1 + y2 β2 + · · · + yn βn = 0 and B2 = {β1 , β2 , . . . , βn } is
linearly independent, we get y1 = y2 = · · · = yn = 0. Thus, Y = 0.

By the same argument we can prove that if Y = 0 then X = 0.


Claim (2): P is an invertible matrix.
Proof: Consider the system of equations PY = 0. As X = PY , we
get X = 0. Then from Claim 1 it follows that Y = 0. This means
the system PY = 0 has only the trivial solution. Hence, by Theorem
7 (of Chapter 1) we get that P is invertible.
Theorem 7

We just proved the following theorem:


Theorem 7.
Let V be an n-dimensional vector space over the field F and let
B1 = {α1 , α2 , . . . , αn } and B2 = {β1 , β2 , . . . , βn } be two ordered
bases of V . Then there is a unique, necessarily invertible, n × n
matrix P with entries in F such that (i) [α]B1 = P[α]B2 and (ii)
[α]B2 = P −1 [α]B1 for every α ∈ V . The columns of P are given by
Pj = [βj ]B1 for j = 1, 2, . . . , n.
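Over R^n the matrix P of Theorem 7 can be computed concretely: column j of P is obtained by solving a linear system whose coefficient matrix has the vectors of B1 as columns. A pure-Python sketch, assuming the bases are given as tuples of reals (the helper names `solve` and `change_of_basis` are mine, not from the text):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * e for a, e in zip(M[r], M[c])]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def change_of_basis(B1, B2):
    """Return P whose column j is [beta_j]_{B1}."""
    n = len(B1)
    A1 = [[B1[j][i] for j in range(n)] for i in range(n)]  # B1 vectors as columns
    cols = [solve(A1, list(beta)) for beta in B2]
    return [[cols[j][i] for j in range(n)] for i in range(n)]  # columns -> matrix

B1 = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
B2 = [(1, 1, 1), (0, 1, 1), (0, 0, 1)]
print(change_of_basis(B1, B2))
# → [[1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [1.0, 1.0, 1.0]]
```

This reproduces the matrix P of the earlier R^3 example.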

An application

Note: For a given ordered basis B1 of a finite-dimensional vector


space V and a given invertible matrix P, it is possible to construct
another ordered basis B2 of V .

Example: Let B1 =
{α1 = (0, 1, 1, 1), α2 = (1, 0, 1, 1), α3 = (1, 1, 0, 1), α4 = (1, 1, 1, 0)}
be an ordered basis for R 4 and let

    [ 1 0 1 0 ]
    [ 1 0 0 0 ]
P = [ 0 1 0 0 ]
    [ 0 1 4 2 ]

be an invertible matrix. Find an ordered basis for R 4 different from


the basis B1 .
Solution

We want to find β1 , β2 , β3 , and β4 such that

[β1 ]B1 = P1 = [1, 1, 0, 0]^T ,  [β2 ]B1 = P2 = [0, 0, 1, 1]^T ,
[β3 ]B1 = P3 = [1, 0, 0, 4]^T ,  [β4 ]B1 = P4 = [0, 0, 0, 2]^T .

solution contd.

From the definition of coordinate matrices, it follows that

β1 = 1α1 + 1α2 + 0α3 + 0α4 = (1, 1, 2, 2)

β2 = 0α1 + 0α2 + 1α3 + 1α4 = (2, 2, 1, 1)

β3 = 1α1 + 0α2 + 0α3 + 4α4 = (4, 5, 5, 1)

β4 = 0α1 + 0α2 + 0α3 + 2α4 = (2, 2, 2, 0).

Hence, we obtain another ordered basis

B2 = {(1, 1, 2, 2), (2, 2, 1, 1), (4, 5, 5, 1), (2, 2, 2, 0)}

different from the given ordered basis B1 .
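The computation above is mechanical: βj is the linear combination of the αi whose weights form column j of P. A pure-Python sketch (the helper name `new_basis` is mine, not from the text):

```python
def new_basis(P, old_basis):
    """Return [beta_1, ..., beta_n], where
    beta_j = P[0][j]*alpha_1 + ... + P[n-1][j]*alpha_n,
    i.e. column j of P lists the coordinates of beta_j relative to old_basis."""
    n = len(old_basis)
    dim = len(old_basis[0])
    return [tuple(sum(P[i][j] * old_basis[i][k] for i in range(n))
                  for k in range(dim))
            for j in range(n)]

B1 = [(0, 1, 1, 1), (1, 0, 1, 1), (1, 1, 0, 1), (1, 1, 1, 0)]
P = [[1, 0, 1, 0],
     [1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 1, 4, 2]]
print(new_basis(P, B1))
# → [(1, 1, 2, 2), (2, 2, 1, 1), (4, 5, 5, 1), (2, 2, 2, 0)]
```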

Row rank / Column rank

Let A be an m × n matrix over a field F . Let {R1 , R2 , . . . , Rm } be


the rows of A and {C1 , C2 , . . . , Cn } be the columns of A.
Row space of A = span {R1 , R2 , . . . , Rm }.
Column space of A = span {C1 , C2 , . . . , Cn }.
Then the row space of A is a subspace of F 1×n and the column
space of A is a subspace of F m×1 .

Definitions:
Row rank of A = dimension of row space of A.
Column rank of A = dimension of column space of A.

Basis of a row-reduced echelon matrix

Let A be an m × n matrix over the field F . Let R be the


row-reduced echelon form of A. The non-zero rows of R form a
basis for the row space of R. Since two row-equivalent matrices
have the same row space, a basis for the row space of R is also a
basis for the row space of A.

Hence

Row rank of A = Row rank of R = No. of non-zero rows of R.
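This gives a direct algorithm for the row rank: row-reduce and count the non-zero rows. A pure-Python sketch using exact rational arithmetic to avoid floating-point issues (the function name `row_rank` is mine, not from the text):

```python
from fractions import Fraction

def row_rank(A):
    """Gaussian elimination over the rationals; the rank is the number
    of non-zero rows in the resulting echelon form."""
    R = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(R), len(R[0])
    rank, col = 0, 0
    while rank < rows and col < cols:
        # Find a pivot in the current column, at or below the current row.
        piv = next((r for r in range(rank, rows) if R[r][col] != 0), None)
        if piv is None:
            col += 1
            continue
        R[rank], R[piv] = R[piv], R[rank]
        # Clear the pivot column in all other rows.
        for r in range(rows):
            if r != rank and R[r][col] != 0:
                f = R[r][col] / R[rank][col]
                R[r] = [a - f * b for a, b in zip(R[r], R[rank])]
        rank += 1
        col += 1
    return rank

print(row_rank([[1, 2, 3], [2, 4, 6], [1, 0, 1]]))  # → 2 (second row = 2 × first)
```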

