Matrices are rectangular arrays of numbers or functions arranged in rows and columns, used in various fields to represent and solve systems of linear equations, perform linear transformations, and handle data elements in a structured form.
A rectangular array of mn numbers in the form of m horizontal lines
(called rows) and n vertical lines (called columns), is called a matrix
of order m by n, written as m × n matrix.
In compact form, the matrix is represented by A = [aij]m×n
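The order and entry notation above can be checked with a quick NumPy sketch (NumPy is an assumption here, not part of the notes; it uses 0-based indexing, so a12 is `A[0, 1]`):

```python
import numpy as np

# A 2 x 3 matrix: m = 2 rows, n = 3 columns
A = np.array([[1, 2, 3],
              [4, 5, 6]])

m, n = A.shape      # order of the matrix, (m, n)
a_12 = A[0, 1]      # entry a12: row 1, column 2 (NumPy is 0-indexed)
```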
2. Special Type of Matrices:
(a) Row Matrix (Row vector): A = [a11, a12, ........, a1n], i.e. a matrix having exactly one row.
(b) Column Matrix (Column vector): A = [a11, a21, ......, am1]T, i.e. a matrix having exactly one column.
(c) Zero or Null Matrix: (A = Om×n) An m × n matrix whose entries are all zero.
(d) Horizontal Matrix: A matrix of order m × n is a horizontal matrix if n > m.
(e) Vertical Matrix: A matrix of order m × n is a vertical matrix if m > n.
(f) Square Matrix: (Order n) If the number of rows equals the number of columns, the matrix is square.
Note: (i) The pair of elements aij & aji are called Conjugate Elements.
(ii) The elements a11, a22, a33, ......, ann are called Diagonal Elements. The line along which the diagonal elements lie is called the principal or leading diagonal. The quantity Σaii = trace of the matrix, written as tr(A).
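A small NumPy check of the diagonal and trace definitions (NumPy is assumed for illustration):

```python
import numpy as np

A = np.array([[1, 7, 3],
              [2, 4, 8],
              [5, 6, 9]])

diagonal = np.diag(A)    # principal diagonal: a11, a22, a33
trace = np.trace(A)      # tr(A) = sum of the diagonal elements
```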
3. Square Matrices:
Triangular Matrix
(i) Upper Triangular: aij = 0 ∀ i > j. For example A = [[1, 3, −2], [0, 2, 4], [0, 0, 5]]
(ii) Lower Triangular: aij = 0 ∀ i < j. For example A = [[1, 0, 0], [3, 2, 0], [−2, 4, 5]]
A matrix which is both upper and lower triangular is a Diagonal Matrix, denoted A = diag(a11, a22, a33, ......, ann), where aij = 0 for i ≠ j.
(iii) Scalar Matrix: a diagonal matrix whose diagonal entries are all equal, i.e. a11 = a22 = a33 = a: [[a, 0, 0], [0, a, 0], [0, 0, a]]
(i) Minimum number of zeros in a triangular matrix of order n = n(n − 1)/2. (ii) Minimum number of zeros in a diagonal matrix of order n = n(n − 1). (iii) A null square matrix is also a diagonal matrix.
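The zero-count formula for a triangular matrix can be verified numerically (a quick sketch using NumPy, which is assumed here):

```python
import numpy as np

n = 4
# Upper triangular matrix of order 4: entries 1..16, then zero out below the diagonal
U = np.triu(np.arange(1, n * n + 1).reshape(n, n))

zeros_in_U = int(np.sum(U == 0))    # count of forced zeros below the diagonal
min_zeros = n * (n - 1) // 2        # formula: n(n - 1)/2
```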
4. Equality of Matrices:
Matrices A = [aij] & B = [bij] are equal if, (a) both have the same order. (b) aij = bij for each pair of i & j.
5. Algebra of Matrices:
(i)Addition: A + B = [aij + bij] where A & B are of the same
order.
(iii) Multiplication of Matrices (Row by Column):
Let A be a matrix of order m × n and B be a matrix of order p × q; then the product AB is defined if and only if n = p.
Let Am×n = [aij] and Bn×p = [bij]; then the order of AB is m × p,
with (AB)ij = Σ(r = 1 to n) air brj.
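The order rule and the row-by-column entry formula can be illustrated with NumPy (assumed for the example):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])          # order 2 x 3
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])           # order 3 x 2: n = p = 3, so AB is defined

C = A @ B                          # order of AB is 2 x 2

# Entry (1,1) from the row-by-column rule: sum over r of a1r * br1
c_11 = sum(A[0, r] * B[r, 0] for r in range(3))
```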
(iv) Properties of Matrix Multiplication:
AB = O ⇏ A = O or B = O (in general)
Note: If A and B are two non-zero matrices such that AB = O, then A and B are called the divisors of zero. If A and B are two matrices such that (i) AB = BA then A and B are said to commute (ii) AB = –BA then A and B are said to anti-commute.
Matrix Multiplication Is Associative : If A, B & C are conformable for the product AB & BC, then (AB)C = A(BC)
Distributivity: A(B + C) = AB + AC and (A + B)C = AC + BC, provided A, B & C are conformable for the respective products.
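The "divisors of zero" point above is easy to demonstrate: here is a NumPy sketch (NumPy assumed) of two non-zero matrices whose product is the zero matrix.

```python
import numpy as np

# Divisors of zero: AB = O even though A != O and B != O
A = np.array([[1, 1],
              [1, 1]])
B = np.array([[1, -1],
              [-1, 1]])

AB = A @ B    # the zero matrix, despite neither factor being zero
```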
6. Characteristic Equation:
Let A be a square matrix. Then the polynomial in x, |A – xI|, is called the characteristic polynomial of A and the equation |A – xI| = 0 is called the characteristic equation of A.
7. Cayley-Hamilton Theorem:
Every square matrix A satisfies its characteristic equation, i.e. if a0xn + a1xn–1 + ........ + an–1x + an = 0 is the characteristic equation of matrix A, then a0An + a1An–1 + ........ + an–1A + anI = O.
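For a 2 × 2 matrix the characteristic equation is x² − tr(A)x + det(A) = 0, so Cayley-Hamilton says A² − tr(A)A + det(A)I = O. A NumPy verification (NumPy assumed):

```python
import numpy as np

A = np.array([[2, 1],
              [1, 3]])

t = np.trace(A)                  # tr(A) = 5
d = round(np.linalg.det(A))      # det(A) = 5; round() removes float error

# Cayley-Hamilton for 2x2: A^2 - tr(A) A + det(A) I = O
residual = A @ A - t * A + d * np.eye(2, dtype=int)
```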
8. Transpose of a Matrix:
Changing rows and columns: Let A be any matrix of order m × n. Then the transpose of A, written AT or A′, is of order n × m with (AT)ij = (A)ji.
Properties of Transpose: If AT & BT denote the transposes of A and B, then:
(A+B)T=AT+BT; note that A & B have the same order.
(AB)T=BTAT (Reversal law) A & B are conformable for matrix product AB.
(AT)T = A
(kA)T = kAT, where k is a scalar.
In general (reversal law for transpose): (A1.A2.....An)T = AnT.....A2T.A1T
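The reversal law (AB)T = BT AT can be checked directly in NumPy (assumed for illustration):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

lhs = (A @ B).T
rhs = B.T @ A.T    # reversal law: (AB)^T = B^T A^T
```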
9. Orthogonal Matrix:
A square matrix A is said to be orthogonal if A.AT = I.
Note: (i) The determinant of an orthogonal matrix is either 1 or –1; hence an orthogonal matrix is always invertible. (ii) A.AT = I = AT.A, hence A−1 = AT.
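A rotation matrix is a standard example of an orthogonal matrix; this NumPy sketch (NumPy assumed) checks all three properties above:

```python
import numpy as np

theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation matrix is orthogonal

check = A @ A.T                                    # should be the identity
det = np.linalg.det(A)                             # must be +1 or -1
inverse_is_transpose = np.allclose(np.linalg.inv(A), A.T)
```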
10. Some Special Square Matrices:
(a) Idempotent Matrix:
A square matrix A is idempotent provided A2 = A. Note:
An = A ∀ n ∈ N.
The determinant value of the idempotent matrix is either 0 or 1
If the idempotent matrix is invertible, it will be an identity matrix, i.e., I.
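A NumPy check (NumPy assumed) on a standard non-identity idempotent matrix, confirming A² = A and that its determinant is 0:

```python
import numpy as np

# A non-identity idempotent matrix: A^2 = A
A = np.array([[ 2, -2, -4],
              [-1,  3,  4],
              [ 1, -2, -3]])

is_idempotent = np.array_equal(A @ A, A)
det = round(np.linalg.det(A))    # must be 0 or 1; here it is 0, so A is not invertible
```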
(b) Periodic Matrix:
A square matrix A is periodic if it satisfies the relation Ak+1 = A for some positive integer k. The period of the matrix is the least value of k for which this holds.
Note that the period of an idempotent matrix is 1.
(c) Nilpotent Matrix:
A square matrix A is a nilpotent matrix of order m, m ∈ N, if Am = O but Am−1 ≠ O.
Note that a nilpotent matrix will not be invertible.
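A strictly upper triangular matrix is a standard nilpotent example; this NumPy sketch (NumPy assumed) shows A³ = O while A² ≠ O, so A is nilpotent of order 3:

```python
import numpy as np

# A strictly upper triangular matrix is nilpotent
A = np.array([[0, 1, 2],
              [0, 0, 3],
              [0, 0, 0]])

A2 = A @ A     # not yet the zero matrix
A3 = A2 @ A    # A^3 = O, so A is nilpotent of order m = 3
```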
(d) Involutory Matrix:
If A2 = I, the matrix is said to be an involutory matrix.
Note that A = A–1 for an involutory matrix.
If A and B are square matrices of the same order and AB = BA, then
(A + B)n = nC0 An + nC1 An−1B + nC2 An−2B2 + ..... + nCn Bn
11. SYMMETRIC & SKEW SYMMETRIC MATRIX :
(a) Symmetric matrix :
For a symmetric matrix A = AT, i.e. aij = aji ∀ i, j. Note: The maximum number of distinct entries in any symmetric matrix of order n is n(n + 1)/2.
(b) Skew symmetric matrix :
Square matrix A = [aij] is said to be skew-symmetric if AT = –A, i.e., aij = –aji ∀ i & j. Hence if A is skew-symmetric, then aii = -aii⇒ aii = 0 ∀ i.
Thus the diagonal elements of a skew-symmetric matrix are all zero, but the converse need not hold.
(c) Properties of symmetric & skew-symmetric matrix :
(i) Let A be any square matrix then, A + AT is a symmetric matrix & A – AT is a skew-symmetric matrix. (ii) The sum of two symmetric matrices is a symmetric matrix, and the sum of two skew-symmetric matrices is a skew-symmetric matrix. (iii) If A & B are symmetric matrices then,
AB + BA is a symmetric matrix
AB - BA is a skew-symmetric matrix.
(iv) Every square matrix can be uniquely expressed as the sum of a symmetric and a skew-symmetric matrix:
A = Symmetric Matrix + Skew-Symmetric Matrix
A = (1/2)(A + AT) + (1/2)(A − AT), where (1/2)(A + AT) is symmetric and (1/2)(A − AT) is skew-symmetric.
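The decomposition A = ½(A + AT) + ½(A − AT) can be carried out directly in NumPy (assumed for illustration):

```python
import numpy as np

A = np.array([[1, 4, 0],
              [2, 5, 7],
              [6, 3, 8]])

P = (A + A.T) / 2    # symmetric part: P = P^T
Q = (A - A.T) / 2    # skew-symmetric part: Q = -Q^T
```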
12. Adjoint of a Matrix:
Let A = [aij] = [[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]] be a square matrix, and let the matrix formed by the cofactors of [aij] in the determinant |A| be [[C11, C12, C13], [C21, C22, C23], [C31, C32, C33]].
Then (adj A) = Transpose of the cofactor matrix = [[C11, C21, C31], [C12, C22, C32], [C13, C23, C33]]
Note:
If A is a square matrix of order n, then (i) A.(adj A) = |A| In = (adj A).A (ii) |adj A| = |A|n–1, n ≥ 2 (iii) adj(adj A) = |A|n–2A, |A| ≠ 0 (iv) adj(AB) = (adj B)(adj A) (v) adj(kA) = kn−1(adj A), where k is a scalar.
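A sketch of the cofactor construction of adj A and a check of property (i), A·(adj A) = |A| In, in NumPy (NumPy assumed; the helper `adjugate` is written for this example, it is not a library function):

```python
import numpy as np

def adjugate(A):
    """adj(A): transpose of the cofactor matrix, computed entrywise."""
    n = A.shape[0]
    C = np.zeros_like(A)
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, then take the determinant
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * round(np.linalg.det(minor))
    return C.T

A = np.array([[1, 2, 3],
              [0, 1, 4],
              [5, 6, 0]])

adjA = adjugate(A)
detA = round(np.linalg.det(A))    # |A| = 1 for this matrix
product = A @ adjA                # should equal |A| * I
```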
13. Inverse of a Matrix:
A square matrix A is said to be invertible (nonsingular) if there exists a matrix B such that AB = I (note that AB = I ⇔ BA = I). B is called the inverse (reciprocal) of A and is denoted by A–1.
Thus A–1 = B ⇔ AB = I = BA
We have A.(adj A) = |A| In ⇒ A–1.A.(adj A) = A–1 |A| In ⇒ In (adj A) = A–1 |A| In ⇒ A–1 = (adj A)/|A|
Note:
The necessary and sufficient condition for a square matrix A to be invertible is that |A| ≠ 0.
If A is an invertible matrix, then AT is also invertible & (AT)−1 = (A–1)T.
If A is invertible, then
(a) (A–1)−1 = A
(b) (Ak)−1 = (A–1)k = A–k; k ∈ N
(c) |A−1| = 1/|A|
Theorem: If A & B are invertible matrices of the same order, then (AB)−1 = B−1A−1.
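The reversal law for inverses can be checked numerically with NumPy (assumed for illustration):

```python
import numpy as np

A = np.array([[2, 1],
              [1, 1]])
B = np.array([[1, 2],
              [0, 1]])

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)    # reversal law: (AB)^-1 = B^-1 A^-1
```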
14. System of Equations & Criteria for Consistency:
AX = B ⇒ A−1AX = A−1B, if |A| ≠ 0 ⇒ X = A−1B = (adj A/|A|).B
Note: (i) If |A| ≠ 0, the system is consistent and has a unique solution. (ii) If |A| ≠ 0 & (adj A).B ≠ O (null matrix), the system is consistent and has a unique, non-trivial solution. (iii) If |A| ≠ 0 & (adj A).B = O (null matrix), the system is consistent and has the trivial solution. (iv) If |A| = 0, the matrix method fails: if (adj A).B = O (null matrix), there are infinite or no solutions; if (adj A).B ≠ O, the system is inconsistent (no solution).
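A worked example of the matrix method X = A−1B for a 2 × 2 system, sketched in NumPy (NumPy assumed). The system x + 2y = 5, 3x + 4y = 11 has |A| ≠ 0, so it falls under case (i) with a unique solution:

```python
import numpy as np

# Solve  x + 2y = 5,  3x + 4y = 11   written as AX = B
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5],
              [11]])

detA = np.linalg.det(A)      # -2: nonzero, so a unique solution exists
X = np.linalg.inv(A) @ B     # X = A^-1 B, giving x = 1, y = 2
```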