
Matrices Formula Sheet

This page will help you quickly revise the formulas and concepts of Matrices for various exams.

Matrices are rectangular arrays of numbers or functions arranged in rows and columns, used in various fields to represent and solve systems of linear equations, perform linear transformations, and handle data elements in a structured form.

Neetesh Kumar | May 05, 2024

1. Definition:

A rectangular array of mn numbers arranged in m horizontal lines (called rows) and n vertical lines (called columns) is called a matrix of order m by n, written as an m × n matrix.
In compact form, the matrix is represented by $A = [a_{ij}]_{m \times n}$.

2. Special Type of Matrices:

**(a) Row Matrix (Row vector):** $A = [a_{11}, a_{12}, \dots, a_{1n}]$, i.e. a matrix having exactly one row.

**(b) Column Matrix (Column vector):** $A = \begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix}$, i.e. a matrix having exactly one column.

**(c) Zero or Null Matrix:** $A = O_{m \times n}$, an m × n matrix all of whose entries are zero.

**(d) Horizontal Matrix:** A matrix of order m × n is a horizontal matrix if n > m.

**(e) Vertical Matrix:** A matrix of order m × n is a vertical matrix if m > n.

**(f) Square Matrix:** (Order n) If the number of rows equals the number of columns, the matrix is square.

**Note:**
**(i)** The pair of elements $a_{ij}$ and $a_{ji}$ are called conjugate elements.

**(ii)** The elements $a_{11}, a_{22}, a_{33}, \dots, a_{nn}$ are called diagonal elements. The line along which the diagonal elements lie is called the principal or leading diagonal. The quantity $\Sigma a_{ii}$ is called the trace of the matrix, written as $t_r(A)$.

3. Square Matrices:

Triangular Matrix

**(i) Upper Triangular:** $a_{ij} = 0 \ \forall \ i > j$. For example, $A = \begin{bmatrix} 1 & 3 & -2 \\ 0 & 2 & 4 \\ 0 & 0 & 5 \end{bmatrix}$

**(ii) Lower Triangular:** $a_{ij} = 0 \ \forall \ i < j$. For example, $A = \begin{bmatrix} 1 & 0 & 0 \\ 3 & 2 & 0 \\ -2 & 4 & 5 \end{bmatrix}$

A matrix that is both upper and lower triangular is called a Diagonal Matrix, denoted $A = \text{diag}(a_{11}, a_{22}, \dots, a_{nn})$, where $a_{ij} = 0$ for $i \ne j$.

**(iii) Scalar Matrix:** A diagonal matrix $\begin{bmatrix} a & 0 & 0 \\ 0 & a & 0 \\ 0 & 0 & a \end{bmatrix}$ in which $a_{11} = a_{22} = a_{33} = a$.

**(iv) Unit or Identity Matrix:** $a_{ij} = \begin{cases} 1 &\text{if } i = j \\ 0 &\text{if } i \ne j \end{cases}$, i.e. a scalar matrix with $a_{11} = a_{22} = a_{33} = 1$.

Note:

**(i)** Minimum number of zeros in a triangular matrix of order n = $\frac{n(n-1)}{2}$.
**(ii)** Minimum number of zeros in a diagonal matrix of order n = n(n − 1).
**(iii)** A null square matrix is also a diagonal matrix.
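As a quick illustration (NumPy is assumed here; it is not part of the sheet), the zero counts above can be checked on a hypothetical 4 × 4 matrix:

```python
import numpy as np

# Hypothetical 4x4 matrix with no zero entries, then its triangular/diagonal parts
n = 4
A = np.arange(1, n * n + 1, dtype=float).reshape(n, n)
U = np.triu(A)            # upper triangular: zeros strictly below the diagonal
D = np.diag(np.diag(A))   # diagonal matrix: zeros everywhere off the diagonal

zeros_in_U = np.count_nonzero(U == 0)  # n(n-1)/2 = 6 for n = 4
zeros_in_D = np.count_nonzero(D == 0)  # n(n-1) = 12 for n = 4
```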

4. Equality of Matrices:

Matrices A = $[a_{ij}]$ and B = $[b_{ij}]$ are equal if,
**(a)** both have the same order.
**(b)** $a_{ij} = b_{ij}$ for each pair of i and j.

5. Algebra of Matrices:

**(i) Addition:** A + B = $[a_{ij} + b_{ij}]$, where A and B are of the same order.

  • Addition of matrices is commutative: A + B = B + A
  • Matrix addition is associative : (A + B) + C = A + (B + C)
  • A + O = O + A = A (Additive identity)
  • A + (–A) = (–A) + A = O (Additive inverse)

**(ii) Multiplication of a Matrix by a Scalar:** If A = $\begin{bmatrix} a & b & c \\ b & c & a \\ c & a & b \end{bmatrix}$, then kA = $\begin{bmatrix} ka & kb & kc \\ kb & kc & ka \\ kc & ka & kb \end{bmatrix}$

**(iii) Multiplication of Matrices (Row by Column):** Let A be a matrix of order m × n and B a matrix of order p × q. The product AB is defined if and only if n = p.
Let $A_{m \times n} = [a_{ij}]$ and $B_{n \times p} = [b_{ij}]$; then the order of AB is m × p.
We can also write $(AB)_{ij} = \displaystyle\sum_{r=1}^n a_{ir}b_{rj}$
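The order condition and the entry formula can be sketched in NumPy (an illustration, not part of the sheet):

```python
import numpy as np

# A is 2x3 and B is 3x2: columns of A (n = 3) match rows of B (p = 3), so AB exists
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])

AB = A @ B   # order of AB is m x p = 2x2

# Entry-by-entry check of (AB)_ij = sum over r of a_ir * b_rj
manual_00 = sum(A[0, r] * B[r, 0] for r in range(3))   # 1*7 + 2*9 + 3*11 = 58
```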

**(iv) Properties of Matrix Multiplication:**

  • AB = O $\nRightarrow$ A = O or B = O (in general)
  • **Note:** If A and B are two non-zero matrices such that AB = O, then A and B are called divisors of zero. If A and B are two matrices such that
    **(i)** AB = BA, then A and B are said to commute.
    **(ii)** AB = −BA, then A and B are said to anti-commute.
  • Matrix multiplication is associative: if A, B and C are conformable for the products AB and BC, then (AB)C = A(BC).
  • Distributivity: A(B + C) = AB + AC and (A + B)C = AC + BC, provided A, B and C are conformable for the respective products.
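A concrete pair of divisors of zero, sketched in NumPy (illustrative only):

```python
import numpy as np

# Two non-zero matrices whose product is the zero matrix (divisors of zero)
A = np.array([[1, 0],
              [0, 0]])
B = np.array([[0, 0],
              [0, 1]])

AB = A @ B   # zero matrix, even though neither A nor B is O
```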

**(v) Integral Power of a Square Matrix**
**Case (i)** If the power is a positive integer:

  • $A^m A^n = A^{m+n}$
  • $(A^m)^n = A^{mn} = (A^n)^m$
  • $I^m = I$, m, n $\in$ N

**Case (ii)** If the power is a negative integer:

  • $A^{-n} = (A^n)^{-1}$, provided A is invertible.
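Both power laws can be checked with NumPy's `matrix_power` (an illustration; the matrix below is a made-up invertible example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])   # |A| = 2, so A is invertible

# A^m A^n = A^(m+n)
lhs = np.linalg.matrix_power(A, 2) @ np.linalg.matrix_power(A, 3)
rhs = np.linalg.matrix_power(A, 5)

# A^(-n) = (A^n)^(-1); matrix_power accepts negative exponents for invertible A
neg = np.linalg.matrix_power(A, -2)
inv_of_power = np.linalg.inv(np.linalg.matrix_power(A, 2))
```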

6. Characteristic Equation:

Let A be a square matrix. Then the polynomial in x, $|A - xI|$, is called the characteristic polynomial of A, and the equation $|A - xI| = 0$ is called the characteristic equation of A.

7. Cayley-Hamilton Theorem:

Every square matrix A satisfies its characteristic equation, i.e. if $a_0x^n + a_1x^{n-1} + \dots + a_{n-1}x + a_n = 0$ is the characteristic equation of matrix A, then
$a_0A^n + a_1A^{n-1} + \dots + a_{n-1}A + a_nI = O$
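For a 2 × 2 matrix the characteristic equation is $x^2 - t_r(A)\,x + |A| = 0$, so the theorem can be verified directly in NumPy (illustrative sketch):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
tr = np.trace(A)           # trace = 5
det = np.linalg.det(A)     # determinant = -2

# Cayley-Hamilton for 2x2: A^2 - tr(A) A + |A| I should be the zero matrix
residual = A @ A - tr * A + det * np.eye(2)
```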

8. Transpose of a Matrix:

**Changing rows and columns:** Let A be any matrix of order m × n. Then the transpose of A, written $A^T$ or A′, is of order n × m, and $(A^T)_{ij} = (A)_{ji}$.
**Properties of Transpose:** If $A^T$ and $B^T$ denote the transposes of A and B:

  • $(A + B)^T = A^T + B^T$; note that A and B must have the same order.
  • $(AB)^T = B^TA^T$ (reversal law); A and B must be conformable for the product AB.
  • $(A^T)^T = A$
  • $(kA)^T = kA^T$, where k is a scalar.
  • In general (reversal law for transpose): $(A_1 A_2 \dots A_n)^T = A_n^T \dots A_2^T A_1^T$
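The reversal law is easy to confirm numerically (NumPy sketch with made-up matrices):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

lhs = (A @ B).T      # (AB)^T
rhs = B.T @ A.T      # B^T A^T -- note the reversed order
```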

9. Orthogonal Matrix:

A square matrix A is said to be an orthogonal matrix if $A A^T = I$.
**Note:**
**(i)** The determinant of an orthogonal matrix is either 1 or −1. Hence, an orthogonal matrix is always invertible.
**(ii)** $A A^T = I = A^T A$; hence $A^{-1} = A^T$.
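A standard example of an orthogonal matrix is a 2D rotation matrix; both notes can be checked in NumPy (illustrative sketch):

```python
import numpy as np

# Rotation by 30 degrees: an orthogonal matrix
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

check = A @ A.T                 # should equal the identity
det = np.linalg.det(A)          # +1 for a rotation (-1 for a reflection)
inv_is_transpose = np.allclose(np.linalg.inv(A), A.T)
```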

10. Some Special Square Matrices:

(a) Idempotent Matrix:

A square matrix A is idempotent provided $A^2 = A$. The following properties also hold:

  • $A^n = A \ \forall \ n \in N$.
  • The determinant of an idempotent matrix is either 0 or 1.
  • If an idempotent matrix is invertible, it is the identity matrix I.
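A projection matrix is a common idempotent example; a NumPy sketch (the vector v is a made-up choice):

```python
import numpy as np

# Projection onto the line spanned by v: P = v v^T / (v^T v) is idempotent
v = np.array([[1.0], [2.0]])
P = (v @ v.T) / float(v.T @ v)

is_idempotent = np.allclose(P @ P, P)
det_P = np.linalg.det(P)   # 0 here, since P is idempotent but not the identity
```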

(b) Periodic Matrix:

For some positive integer k, a square matrix that satisfies the relation $A^{k+1} = A$ is periodic. The period of the matrix is the least value of k for which this holds.
Note that the period of an idempotent matrix is 1.

(c) Nilpotent Matrix:

A square matrix A is a nilpotent matrix of index m, m $\in$ N, if $A^m = O$ and $A^{m-1} \ne O$.
Note that a nilpotent matrix is never invertible.
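Strictly upper-triangular matrices are the standard nilpotent example; a NumPy sketch with a made-up 3 × 3 case:

```python
import numpy as np

# Strictly upper triangular, hence nilpotent (index 3 for a 3x3 matrix)
A = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

A2 = np.linalg.matrix_power(A, 2)   # not yet the zero matrix
A3 = np.linalg.matrix_power(A, 3)   # zero: A^3 = O while A^2 != O
det = np.linalg.det(A)              # 0, so A is not invertible
```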

(d) Involutory Matrix:

If $A^2 = I$, the matrix is said to be an involutory matrix.
Note that $A = A^{-1}$ for an involutory matrix.
If A and B are square matrices of the same order and AB = BA, then
$(A + B)^n = \binom{n}{0}A^n + \binom{n}{1}A^{n-1}B + \binom{n}{2}A^{n-2}B^2 + \dots + \binom{n}{n}B^n$

11. Symmetric & Skew-Symmetric Matrix:

(a) Symmetric Matrix:

For a symmetric matrix, $A = A^T$, i.e. $a_{ij} = a_{ji} \ \forall \ i, j$.
**Note:** The maximum number of distinct entries in any symmetric matrix of order n is $\frac{n(n+1)}{2}$.

(b) Skew-Symmetric Matrix:

A square matrix A = $[a_{ij}]$ is said to be skew-symmetric if $A^T = -A$, i.e. $a_{ij} = -a_{ji} \ \forall \ i, j$. Hence, if A is skew-symmetric, then $a_{ii} = -a_{ii} \Rightarrow a_{ii} = 0 \ \forall \ i$.
Thus the diagonal elements of a skew-symmetric matrix are all zero, but the converse need not hold.

(c) Properties of Symmetric & Skew-Symmetric Matrices:

**(i)** Let A be any square matrix; then $A + A^T$ is a symmetric matrix and $A - A^T$ is a skew-symmetric matrix.
**(ii)** The sum of two symmetric matrices is a symmetric matrix, and the sum of two skew-symmetric matrices is a skew-symmetric matrix.
**(iii)** If A and B are symmetric matrices, then

  • AB + BA is a symmetric matrix
  • AB − BA is a skew-symmetric matrix.

**(iv)** Every square matrix can be uniquely expressed as the sum of a symmetric and a skew-symmetric matrix.
A = Symmetric Matrix + Skew-Symmetric Matrix
$A = \underbrace{\tfrac{1}{2}(A + A^T)}_{\text{Symmetric}} + \underbrace{\tfrac{1}{2}(A - A^T)}_{\text{Skew-Symmetric}}$, equivalently $A = \tfrac{1}{2}(A^T + A) - \tfrac{1}{2}(A^T - A)$
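The decomposition above can be computed directly (NumPy sketch with a made-up matrix):

```python
import numpy as np

A = np.array([[1.0, 7.0],
              [3.0, 4.0]])

S = 0.5 * (A + A.T)   # symmetric part
K = 0.5 * (A - A.T)   # skew-symmetric part (zero diagonal)
```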

12. Adjoint of a Square Matrix:

Let A = $[a_{ij}] = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}$ be a square matrix, and let the matrix formed by the cofactors of $[a_{ij}]$ in the determinant |A| be $\begin{bmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{bmatrix}$.
Then (adj A) = transpose of the cofactor matrix = $\begin{bmatrix} C_{11} & C_{21} & C_{31} \\ C_{12} & C_{22} & C_{32} \\ C_{13} & C_{23} & C_{33} \end{bmatrix}$
**Note:**
If A is a square matrix of order n, then
**(i)** A·(adj A) = |A| $I_n$ = (adj A)·A
**(ii)** |adj A| = $|A|^{n-1}$, n $\ge$ 2
**(iii)** adj(adj A) = $|A|^{n-2}$A, |A| $\ne$ 0
**(iv)** adj(AB) = (adj B)(adj A)
**(v)** adj(kA) = $k^{n-1}$(adj A), where k is a scalar
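NumPy has no built-in adjugate, but for an invertible A the identity A·(adj A) = |A| I gives adj A = |A| A⁻¹, which lets us check the properties above (illustrative sketch; `adjugate` is a helper defined here, not a library function):

```python
import numpy as np

def adjugate(A):
    # Valid only for invertible A: adj(A) = |A| * A^{-1}
    return np.linalg.det(A) * np.linalg.inv(A)

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
adjA = adjugate(A)
detA = np.linalg.det(A)

product = A @ adjA      # should be |A| * I_n
det_adj = np.linalg.det(adjA)   # should be |A|^(n-1) = |A| for n = 2
```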

13. Inverse of a Matrix (Reciprocal Matrix):

A square matrix A is said to be invertible (non-singular) if there exists a matrix B such that
AB = I (note that AB = I $\Leftrightarrow$ BA = I). B is called the inverse (reciprocal) of A and is denoted by $A^{-1}$.
Thus $A^{-1} = B \Leftrightarrow AB = I = BA$
We have, A·(adj A) = $|A| I_n$
$\Rightarrow A^{-1} \cdot A \cdot (\text{adj } A) = A^{-1} |A| I_n$
$\Rightarrow I_n (\text{adj } A) = A^{-1} |A| I_n$
$\Rightarrow A^{-1} = \dfrac{\text{adj } A}{|A|}$

**Note:**

  • The necessary and sufficient condition for a square matrix A to be invertible is that |A| $\ne$ 0.
  • If A is an invertible matrix, then $A^T$ is also invertible and $(A^T)^{-1} = (A^{-1})^T$.
  • If A is invertible, then
    (a) $(A^{-1})^{-1} = A$
    (b) $(A^k)^{-1} = (A^{-1})^k = A^{-k}$; k $\in$ N
    (c) $|A^{-1}| = \frac{1}{|A|}$

**Theorem:** If A and B are invertible matrices of the same order, then $(AB)^{-1} = B^{-1}A^{-1}$.
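The reversal law for inverses can be verified numerically (NumPy sketch with made-up invertible matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])   # |A| = -1
B = np.array([[2.0, 0.0],
              [1.0, 3.0]])   # |B| = 6

lhs = np.linalg.inv(A @ B)                 # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)  # B^{-1} A^{-1} -- reversed order
```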

14. System of Equation & Criteria for Consistency:

Gauss-Jordan Method:

$a_1x + b_1y + c_1z = d_1$
$a_2x + b_2y + c_2z = d_2$
$a_3x + b_3y + c_3z = d_3$

$\Rightarrow \begin{bmatrix} a_1x + b_1y + c_1z \\ a_2x + b_2y + c_2z \\ a_3x + b_3y + c_3z \end{bmatrix} = \begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix} \Rightarrow \begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix}$

$\Rightarrow AX = B \Rightarrow A^{-1}AX = A^{-1}B$, if $|A| \ne 0$
$\Rightarrow X = A^{-1}B = \dfrac{\text{adj } A}{|A|} \cdot B$

**Note:**
**(i)** If |A| $\ne$ 0, the system is consistent and has a unique solution.
**(ii)** If |A| $\ne$ 0 and (adj A)·B $\ne$ O (null matrix), the system is consistent with a unique, non-trivial solution.
**(iii)** If |A| = 0 and (adj A)·B $\ne$ O (null matrix), the system is inconsistent (no solution).
**(iv)** If |A| = 0, the matrix method fails: $\begin{cases} \text{If } (\text{adj } A) \cdot B = O \text{ (null matrix)} \rightarrow \text{infinitely many or no solutions} \\ \text{If } (\text{adj } A) \cdot B \ne O \rightarrow \text{inconsistent (no solution)} \end{cases}$
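Solving AX = B when |A| ≠ 0 can be sketched in NumPy (the system below is a made-up example with a unique solution):

```python
import numpy as np

# Hypothetical system:
#   x +  y +  z =  6
#       2y + 5z = -4
#  2x + 5y -  z = 27
A = np.array([[1.0, 1.0,  1.0],
              [0.0, 2.0,  5.0],
              [2.0, 5.0, -1.0]])
B = np.array([6.0, -4.0, 27.0])

detA = np.linalg.det(A)     # non-zero, so the system is consistent with a unique solution
X = np.linalg.solve(A, B)   # computes X = A^{-1} B
```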

**Related Pages:**
Operation on Matrices
Determinants Formula Sheet