Introduction to Financial Econometrics
Appendix Matrix Algebra Review
Eric Zivot
Department of Economics
University of Washington
January 3, 2000
This version: February 6, 2001

1 Matrix Algebra Review
A matrix is just an array of numbers. The dimension of a matrix is determined by
the number of its rows and columns. For example, a matrix A with n rows and m
columns is illustrated below
$$
\underset{(n \times m)}{A} = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1m} \\ a_{21} & a_{22} & \cdots & a_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nm} \end{bmatrix}
$$

where aij denotes the element in the ith row and jth column of A.
A vector is simply a matrix with 1 column. For example,
$$
\underset{(n \times 1)}{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}
$$
is an n × 1 vector with elements x1, x2, ..., xn. Vectors and matrices are often written
in bold type (or underlined) to distinguish them from scalars (single elements of
vectors or matrices).
The transpose of an n × m matrix A is a new m × n matrix with the rows and columns
of A interchanged, and is denoted A′ or A⊤. For example,
$$
\underset{(2 \times 3)}{A} = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}, \qquad \underset{(3 \times 2)}{A'} = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix},
$$

$$
\underset{(3 \times 1)}{x} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \qquad \underset{(1 \times 3)}{x'} = \begin{bmatrix} 1 & 2 & 3 \end{bmatrix}.
$$

A symmetric matrix A is such that A = A′. Obviously this can only occur if A
is a square matrix; i.e., the number of rows of A is equal to the number of columns.
For example, consider the 2 × 2 matrix
$$
A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}.
$$

Clearly,
$$
A' = A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}.
$$

1.1 Basic Matrix Operations
1.1.1 Addition and subtraction
Matrix addition and subtraction are element by element operations and only apply
to matrices of the same dimension. For example, let
$$
A = \begin{bmatrix} 4 & 9 \\ 2 & 1 \end{bmatrix}, \qquad B = \begin{bmatrix} 2 & 0 \\ 0 & 7 \end{bmatrix}.
$$

Then
$$
A + B = \begin{bmatrix} 4 & 9 \\ 2 & 1 \end{bmatrix} + \begin{bmatrix} 2 & 0 \\ 0 & 7 \end{bmatrix} = \begin{bmatrix} 4+2 & 9+0 \\ 2+0 & 1+7 \end{bmatrix} = \begin{bmatrix} 6 & 9 \\ 2 & 8 \end{bmatrix},
$$
$$
A - B = \begin{bmatrix} 4 & 9 \\ 2 & 1 \end{bmatrix} - \begin{bmatrix} 2 & 0 \\ 0 & 7 \end{bmatrix} = \begin{bmatrix} 4-2 & 9-0 \\ 2-0 & 1-7 \end{bmatrix} = \begin{bmatrix} 2 & 9 \\ 2 & -6 \end{bmatrix}.
$$
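These element-by-element operations are easy to check numerically; here is a minimal sketch, assuming NumPy is available (the original notes use only spreadsheet software, so the library choice is mine):

```python
import numpy as np

A = np.array([[4, 9],
              [2, 1]])
B = np.array([[2, 0],
              [0, 7]])

# Addition and subtraction act element by element.
print(A + B)   # [[6 9]
               #  [2 8]]
print(A - B)   # [[ 2  9]
               #  [ 2 -6]]
```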

1.1.2 Scalar Multiplication
Here we refer to the multiplication of a matrix by a scalar number. This is also an
element-by-element operation. For example, let c = 2 and
$$
A = \begin{bmatrix} 3 & -1 \\ 0 & 5 \end{bmatrix}.
$$

Then
$$
c \cdot A = \begin{bmatrix} 2 \cdot 3 & 2 \cdot (-1) \\ 2 \cdot 0 & 2 \cdot 5 \end{bmatrix} = \begin{bmatrix} 6 & -2 \\ 0 & 10 \end{bmatrix}.
$$
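Scalar multiplication is likewise element by element in numerical code; a one-line sketch (again assuming NumPy):

```python
import numpy as np

c = 2
A = np.array([[3, -1],
              [0, 5]])

# Multiplying an array by a scalar scales every element.
print(c * A)   # [[ 6 -2]
               #  [ 0 10]]
```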

1.1.3 Matrix Multiplication
Matrix multiplication only applies to conformable matrices. A and B are conformable
matrices if the number of columns in A is equal to the number of rows in B. For
example, if A is n × m and B is m × p then A and B are conformable. The mechanics
of matrix multiplication are best explained by example. Let
$$
\underset{(2 \times 2)}{A} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \quad \text{and} \quad \underset{(2 \times 3)}{B} = \begin{bmatrix} 1 & 2 & 1 \\ 3 & 4 & 2 \end{bmatrix}.
$$

Then
$$
\begin{aligned}
\underset{(2 \times 2)}{A} \cdot \underset{(2 \times 3)}{B} &= \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \cdot \begin{bmatrix} 1 & 2 & 1 \\ 3 & 4 & 2 \end{bmatrix} \\
&= \begin{bmatrix} 1 \cdot 1 + 2 \cdot 3 & 1 \cdot 2 + 2 \cdot 4 & 1 \cdot 1 + 2 \cdot 2 \\ 3 \cdot 1 + 4 \cdot 3 & 3 \cdot 2 + 4 \cdot 4 & 3 \cdot 1 + 4 \cdot 2 \end{bmatrix} \\
&= \begin{bmatrix} 7 & 10 & 5 \\ 15 & 22 & 11 \end{bmatrix} = \underset{(2 \times 3)}{C}.
\end{aligned}
$$

The resulting matrix C has 2 rows and 3 columns. In general, if A is n × m and B
is m × p then C = A · B is n × p.
As another example, let
$$
\underset{(2 \times 2)}{A} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \quad \text{and} \quad \underset{(2 \times 1)}{B} = \begin{bmatrix} 5 \\ 6 \end{bmatrix}.
$$

Then
$$
\underset{(2 \times 2)}{A} \cdot \underset{(2 \times 1)}{B} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \cdot \begin{bmatrix} 5 \\ 6 \end{bmatrix} = \begin{bmatrix} 1 \cdot 5 + 2 \cdot 6 \\ 3 \cdot 5 + 4 \cdot 6 \end{bmatrix} = \begin{bmatrix} 17 \\ 39 \end{bmatrix}.
$$

As a final example, let
$$
x = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \qquad y = \begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix}.
$$

Then
$$
x'y = \begin{bmatrix} 1 & 2 & 3 \end{bmatrix} \cdot \begin{bmatrix} 4 \\ 5 \\ 6 \end{bmatrix} = 1 \cdot 4 + 2 \cdot 5 + 3 \cdot 6 = 32.
$$
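All three products can be reproduced with NumPy's `@` operator, which implements matrix multiplication for conformable arrays (a sketch; the library choice is an assumption, not part of the original notes):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[1, 2, 1],
              [3, 4, 2]])
print(A @ B)    # [[ 7 10  5]
                #  [15 22 11]]

b = np.array([5, 6])
print(A @ b)    # [17 39]

x = np.array([1, 2, 3])
y = np.array([4, 5, 6])
print(x @ y)    # 32  (the inner product x'y)
```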

1.2 The Identity Matrix
The identity matrix plays a similar role as the number 1. Multiplying any number by
1 gives back that number. In matrix algebra, pre- or post-multiplying a matrix A by
a conformable identity matrix gives back the matrix A. To illustrate, let
$$
I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}
$$

denote the 2 dimensional identity matrix and let
$$
A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}
$$

denote an arbitrary 2 × 2 matrix. Then
$$
I \cdot A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \cdot \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} = A
$$

and
$$
A \cdot I = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \cdot \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} = A.
$$
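A quick numerical check of this property (a sketch, assuming NumPy; `np.eye(n)` builds the n × n identity matrix):

```python
import numpy as np

A = np.array([[4, 9],
              [2, 1]])
I = np.eye(2, dtype=int)   # the 2 x 2 identity matrix

# Pre- or post-multiplying by I gives back A.
print(I @ A)   # same as A
print(A @ I)   # same as A
```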

1.3 Inverse Matrix
To be completed.

1.4 Representing Summation Using Vector Notation
Consider the sum
$$
\sum_{k=1}^{n} x_k = x_1 + \cdots + x_n.
$$

Let x = (x1, ..., xn)′ be an n × 1 vector and 1 = (1, ..., 1)′ be an n × 1 vector of
ones. Then
$$
x'\mathbf{1} = \begin{bmatrix} x_1 & \cdots & x_n \end{bmatrix} \cdot \begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix} = x_1 + \cdots + x_n = \sum_{k=1}^{n} x_k
$$

and
$$
\mathbf{1}'x = \begin{bmatrix} 1 & \cdots & 1 \end{bmatrix} \cdot \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = x_1 + \cdots + x_n = \sum_{k=1}^{n} x_k.
$$

Next, consider the sum of squared x values
$$
\sum_{k=1}^{n} x_k^2 = x_1^2 + \cdots + x_n^2.
$$

This sum can be conveniently represented as
$$
x'x = \begin{bmatrix} x_1 & \cdots & x_n \end{bmatrix} \cdot \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = x_1^2 + \cdots + x_n^2 = \sum_{k=1}^{n} x_k^2.
$$

Last, consider the sum of cross products
$$
\sum_{k=1}^{n} x_k y_k = x_1 y_1 + \cdots + x_n y_n.
$$

This sum can be compactly represented by
$$
x'y = \begin{bmatrix} x_1 & \cdots & x_n \end{bmatrix} \cdot \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix} = x_1 y_1 + \cdots + x_n y_n = \sum_{k=1}^{n} x_k y_k.
$$

Note that x′y = y′x.
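These vector representations of sums can be verified directly with small numerical examples (a sketch, assuming NumPy):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
ones = np.ones(3)

print(x @ ones)   # 6.0  = x1 + x2 + x3
print(x @ x)      # 14.0 = sum of squared x values
print(x @ y)      # 32.0 = sum of cross products

# The inner product is symmetric: x'y = y'x.
assert x @ y == y @ x
```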

1.5 Representing Systems of Linear Equations Using Matrix
Algebra
Consider the system of two linear equations
$$
x + y = 1 \qquad (1)
$$
$$
2x - y = 1 \qquad (2)
$$
which is illustrated in Figure xxx. Equations (1) and (2) represent two straight lines
which intersect at the point x = 2/3 and y = 1/3. This point of intersection is determined
by solving for the values of x and y such that x + y = 2x − y.¹

¹Solving for x gives x = 2y. Substituting this value into the equation x + y = 1 gives 2y + y = 1,
and solving for y gives y = 1/3. Solving for x then gives x = 2/3.

The two linear equations can be written in matrix form as
$$
\begin{bmatrix} 1 & 1 \\ 2 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}
$$
or
$$
A \cdot z = b
$$
where
$$
A = \begin{bmatrix} 1 & 1 \\ 2 & -1 \end{bmatrix}, \quad z = \begin{bmatrix} x \\ y \end{bmatrix} \quad \text{and} \quad b = \begin{bmatrix} 1 \\ 1 \end{bmatrix}.
$$
If there were a (2 × 2) matrix B, with elements bij, such that B · A = I, where I
is the (2 × 2) identity matrix, then we could solve for the elements in z as follows. In
the equation A · z = b, pre-multiply both sides by B to give

$$
\begin{aligned}
B \cdot A \cdot z &= B \cdot b \\
\Longrightarrow \; I \cdot z &= B \cdot b \\
\Longrightarrow \; z &= B \cdot b
\end{aligned}
$$

or
$$
\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} b_{11} \cdot 1 + b_{12} \cdot 1 \\ b_{21} \cdot 1 + b_{22} \cdot 1 \end{bmatrix}.
$$
If such a matrix B exists it is called the inverse of A and is denoted A⁻¹. Intuitively,
the inverse matrix A⁻¹ plays a similar role as the inverse of a number. Suppose a is a
number; e.g., a = 2. Then we know that a · (1/a) = a⁻¹a = 1. Similarly, in matrix
algebra A⁻¹A = I where I is the identity matrix. Next, consider solving the equation
ax = 1. By simple division we have that x = 1/a = a⁻¹. Similarly, in matrix algebra
if we want to solve the system of equations Ax = b we multiply both sides by A⁻¹ and
get x = A⁻¹b.
Using B = A⁻¹, we may express the solution for z as
$$
z = A^{-1}b.
$$
As long as we can determine the elements in A⁻¹ then we can solve for the values of
x and y in the vector z. Since the system of linear equations has a solution as long as
the two lines intersect, we can determine the elements in A⁻¹ provided the two lines
are not parallel.
There are general numerical algorithms for finding the elements of A⁻¹, and typical
spreadsheet programs like Excel have these algorithms available. However, if A is a
(2 × 2) matrix then there is a simple formula for A⁻¹. Let A be a (2 × 2) matrix such
that
$$
A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}.
$$

Then
$$
A^{-1} = \frac{1}{a_{11}a_{22} - a_{21}a_{12}} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}.
$$
By brute force matrix multiplication we can verify this formula:
$$
\begin{aligned}
A^{-1}A &= \frac{1}{a_{11}a_{22} - a_{21}a_{12}} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix} \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \\
&= \frac{1}{a_{11}a_{22} - a_{21}a_{12}} \begin{bmatrix} a_{22}a_{11} - a_{12}a_{21} & a_{22}a_{12} - a_{12}a_{22} \\ -a_{21}a_{11} + a_{11}a_{21} & -a_{21}a_{12} + a_{11}a_{22} \end{bmatrix} \\
&= \frac{1}{a_{11}a_{22} - a_{21}a_{12}} \begin{bmatrix} a_{22}a_{11} - a_{12}a_{21} & 0 \\ 0 & -a_{21}a_{12} + a_{11}a_{22} \end{bmatrix} \\
&= \begin{bmatrix} \frac{a_{22}a_{11} - a_{12}a_{21}}{a_{11}a_{22} - a_{21}a_{12}} & 0 \\ 0 & \frac{-a_{21}a_{12} + a_{11}a_{22}}{a_{11}a_{22} - a_{21}a_{12}} \end{bmatrix} \\
&= \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}.
\end{aligned}
$$
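The 2 × 2 formula is simple enough to code directly. The sketch below (the function name `inv2x2` is mine, and NumPy is an assumption) implements it and checks that A⁻¹A = I:

```python
import numpy as np

def inv2x2(A):
    """Invert a 2 x 2 matrix via the closed-form formula."""
    a11, a12 = A[0, 0], A[0, 1]
    a21, a22 = A[1, 0], A[1, 1]
    det = a11 * a22 - a21 * a12
    if det == 0:
        raise ValueError("matrix is singular")
    return (1.0 / det) * np.array([[a22, -a12],
                                   [-a21, a11]])

A = np.array([[1.0, 1.0],
              [2.0, -1.0]])
Ainv = inv2x2(A)
print(Ainv)                              # entries 1/3, 1/3, 2/3, -1/3
print(np.allclose(Ainv @ A, np.eye(2)))  # True
```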

Let's apply the above rule to find the inverse of A in our example:
$$
A^{-1} = \frac{1}{-1-2} \begin{bmatrix} -1 & -1 \\ -2 & 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{3} & \frac{1}{3} \\ \frac{2}{3} & -\frac{1}{3} \end{bmatrix}.
$$

Notice that
$$
A^{-1}A = \begin{bmatrix} \frac{1}{3} & \frac{1}{3} \\ \frac{2}{3} & -\frac{1}{3} \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 2 & -1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}.
$$
Our solution for z is then
$$
z = A^{-1}b = \begin{bmatrix} \frac{1}{3} & \frac{1}{3} \\ \frac{2}{3} & -\frac{1}{3} \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{2}{3} \\ \frac{1}{3} \end{bmatrix} = \begin{bmatrix} x \\ y \end{bmatrix}
$$
so that x = 2/3 and y = 1/3.
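The same solution can be obtained with a general-purpose solver (a sketch, assuming NumPy; in practice `np.linalg.solve` is preferred over forming the inverse explicitly, since it is faster and more numerically stable):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, -1.0]])
b = np.array([1.0, 1.0])

z = np.linalg.solve(A, b)   # solves A z = b without forming A^{-1}
print(z)                    # x = 2/3, y = 1/3
```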
In general, if we have n linear equations in n unknown variables we may write the
system of equations as

$$
\begin{aligned}
a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n &= b_1 \\
a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n &= b_2 \\
&\;\;\vdots \\
a_{n1}x_1 + a_{n2}x_2 + \cdots + a_{nn}x_n &= b_n
\end{aligned}
$$

which we may then express in matrix form as
$$
\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix}
$$
or
$$
\underset{(n \times n)}{A} \cdot \underset{(n \times 1)}{x} = \underset{(n \times 1)}{b}.
$$

The solution to the system of equations is given by
$$
x = A^{-1}b
$$
where A⁻¹A = I and I is the (n × n) identity matrix. If the number of equations is
greater than two, then we generally use numerical algorithms to find the elements in
A⁻¹.

Excellent treatments of portfolio theory using matrix algebra are given in Ingersoll
(1987), Huang and Litzenberger (1988) and Campbell, Lo and MacKinlay (1997).

3 Problems
To be completed

References
Campbell, J.Y., Lo, A.W., and MacKinlay, A.C. (1997). The Econometrics of
Financial Markets. Princeton, New Jersey: Princeton University Press.

Huang, C.-F., and Litzenberger, R.H. (1988). Foundations for Financial Eco-
nomics. New York: North-Holland.

Ingersoll, J.E. (1987). Theory of Financial Decision Making. Totowa, New Jersey:
Rowman & Littlefield.
