Revision Notes on Matrices & Determinants

  • Two matrices are said to be equal if they have the same order and each element of one is equal to the corresponding element of the other.

  • An m × n matrix A is said to be a square matrix if m = n, i.e. number of rows = number of columns.

  • In a square matrix, the diagonal from the upper left corner to the lower right corner is known as the leading diagonal or principal diagonal.

  • The sum of the elements of a square matrix A lying along the principal diagonal is called the trace of A, i.e. tr(A). Thus if A = [aij]n×n, then tr(A) = ∑i=1 to n aii = a11 + a22 + ...... + ann.

  • For a square matrix A = [aij]n×n, if all the elements other than those in the leading diagonal are zero, i.e. aij = 0 whenever i ≠ j, then A is said to be a diagonal matrix.

  • A matrix A = [aij]n×n is said to be a scalar matrix if aij = 0 for i ≠ j, and aij = m for i = j, where m ≠ 0.
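
A quick numerical illustration of the trace and the scalar-matrix condition (a minimal NumPy sketch; the entries are made up):

    import numpy as np

    A = np.array([[3, 1, 4],
                  [1, 5, 9],
                  [2, 6, 5]])

    # tr(A) = a11 + a22 + a33, the sum along the principal diagonal
    print(np.trace(A))                                               # 3 + 5 + 5 = 13

    # A scalar matrix: the same value m != 0 on the diagonal, zeros elsewhere
    S = 7 * np.eye(3)
    print(np.array_equal(S, S[0, 0] * np.eye(3)) and S[0, 0] != 0)   # True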

Properties of various types of matrices:

Given a square matrix A = [aij]n×n,

  • For upper triangular matrix, aij = 0, ∀ i > j

  • For lower triangular matrix, aij = 0, ∀ i < j

  • A diagonal matrix is both upper and lower triangular.

  • A triangular matrix A = [aij]n×n is called strictly triangular if aii = 0 for all 1 ≤ i ≤ n.
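
A minimal sketch of these conditions (NumPy; the entries are made up):

    import numpy as np

    U = np.array([[1, 2, 3],
                  [0, 4, 5],
                  [0, 0, 6]])

    # Upper triangular: aij = 0 for all i > j
    print(np.array_equal(U, np.triu(U)))      # True

    # Lower triangular: aij = 0 for all i < j
    print(np.array_equal(U, np.tril(U)))      # False for this U

    # A diagonal matrix is both upper and lower triangular
    D = np.diag([1, 2, 3])
    print(np.array_equal(D, np.triu(D)) and np.array_equal(D, np.tril(D)))   # True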

Transpose of a matrix and its properties: 

If A = [aij]m×n and the transpose of A is A' = [bij]n×m, then bij = aji, ∀ i, j.

  • (A')' = A

  • (A + B)' = A' + B', A and B being conformable matrices

  • (αA)' = αA', α being scalar

  • (AB)' = B'A', A and B being conformable for multiplication 
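
These identities are easy to verify numerically (a minimal NumPy sketch with made-up conformable matrices):

    import numpy as np

    A = np.array([[1, 2], [3, 4], [5, 6]])   # 3 x 2
    B = np.array([[7, 8, 9], [0, 1, 2]])     # 2 x 3

    print(np.array_equal(A.T.T, A))               # (A')' = A
    print(np.array_equal((3 * A).T, 3 * A.T))     # (alpha A)' = alpha A'
    print(np.array_equal((A @ B).T, B.T @ A.T))   # (AB)' = B'A'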

Properties of the conjugate of A, i.e. Ā:

1. The conjugate of Ā is A itself.

2. The conjugate of (A + B) is Ā + B̄.

3. The conjugate of (αA) is ᾱĀ, where α is any number, real or complex, and ᾱ is its conjugate.

4. The conjugate of (AB) is ĀB̄, where A and B are conformable for multiplication.

Properties of Transpose conjugate:

The transpose conjugate of A is the transpose of its conjugate, denoted by Aθ, i.e. Aθ = (Ā)'.

If A = [aij]m×n then Aθ = [bji]n×m, where bji = āij, i.e. the (j, i)th element of Aθ = the conjugate of the (i, j)th element of A.

1) (Aθ)θ = A

2) (A + B)θ = Aθ + Bθ

3) (kA)θ = k̄Aθ, k̄ being the conjugate of the number k

4) (AB)θ = BθAθ
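
A minimal numerical check of these four properties (NumPy; `theta` is a hypothetical helper for the transpose conjugate, and the entries are made up):

    import numpy as np

    A = np.array([[1 + 2j, 3], [4j, 5 - 1j]])
    B = np.array([[2, 1j], [1 - 1j, 0]])
    k = 2 - 3j

    theta = lambda M: M.conj().T   # transpose of the conjugate

    print(np.array_equal(theta(theta(A)), A))                   # (A^theta)^theta = A
    print(np.array_equal(theta(A + B), theta(A) + theta(B)))    # additivity
    print(np.allclose(theta(k * A), np.conj(k) * theta(A)))     # (kA)^theta = k-bar A^theta
    print(np.allclose(theta(A @ B), theta(B) @ theta(A)))       # reversal law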

Addition of matrices:

1) Only matrices of the same order can be added or subtracted.

2) Addition of matrices is commutative as well as associative.

3) Cancellation laws hold well in case of addition.

4) The equation A + X = 0 has a unique solution in the set of all m × n matrices.

5) All the laws of ordinary algebra hold for the addition and subtraction of matrices and their multiplication by scalars.

Matrix Multiplication:

1) Matrix multiplication need not be commutative, i.e., AB may or may not be equal to BA.

2) If AB = BA, then matrices A and B are called Commutative Matrices.

3) If AB = -BA, then matrices A and B are called Anti-Commutative Matrices.

4) Matrix multiplication is Associative

5) Matrix multiplication is Distributive over Matrix Addition.

6) Cancellation Laws need not hold good in case of matrix multiplication i.e., if AB = AC then B may or may not be equal to C even if A ≠ 0.

7) AB = 0 i.e., Null Matrix, does not necessarily imply that either A or B is a null matrix.
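
Points (1) and (7) are easy to demonstrate concretely (a minimal NumPy sketch; the matrices are made-up examples):

    import numpy as np

    A = np.array([[0, 1], [0, 0]])
    B = np.array([[1, 0], [0, 0]])

    print(np.array_equal(A @ B, B @ A))   # False: AB != BA in general

    # AB = 0 with neither factor being the null matrix
    C = np.array([[0, 0], [1, 0]])
    D = np.array([[0, 0], [0, 1]])
    print(C @ D)                          # [[0 0], [0 0]] -- the null matrix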

Special Matrices

  • A square matrix A = [aij] is said to be symmetric when aij = aji for all i and j.

  • If aij = -aji for all i and j (so that all the leading diagonal elements are zero), then the matrix is called a skew-symmetric matrix.

  • A square matrix A = [aij] is said to be Hermitian matrix if Aθ = A.

1) Every diagonal element of a Hermitian Matrix is real.

2) A Hermitian matrix over the set of real numbers is actually a real symmetric matrix.

  • A square matrix, A = [aij] is said to be a skew-Hermitian matrix if Aθ = -A.

1) If A is a skew-Hermitian matrix then the diagonal elements must be either purely imaginary or zero.

2) A skew-Hermitian matrix over the set of real numbers is actually a real skew-symmetric matrix. (A numerical check of these properties appears after this list.)

  • Any square matrix A of order n is said to be orthogonal if AA' = A'A = In.

  • A matrix such that A2 = I is called an involutory matrix.

  • Let A be a square matrix of order n. Then A(adj A) = |A| In = (adj A)A.

  • The adjoint of a square matrix of order 2 can be easily obtained by interchanging the diagonal elements and changing the signs of the off-diagonal elements.

  • A square matrix A of order n is invertible if there exists a square matrix B of the same order such that AB = In = BA; a square matrix is invertible if and only if it is non-singular.
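
A minimal numerical check of the Hermitian and skew-Hermitian properties and of the 2×2 adjoint rule (NumPy; all entries are made-up values):

    import numpy as np

    H = np.array([[2, 1 - 1j], [1 + 1j, 3]])     # Hermitian: H^theta = H
    print(np.array_equal(H.conj().T, H))         # True
    print(np.all(np.isreal(np.diag(H))))         # diagonal elements are real

    K = np.array([[0, 2 + 1j], [-2 + 1j, 1j]])   # skew-Hermitian: K^theta = -K
    print(np.array_equal(K.conj().T, -K))        # True

    # 2x2 adjoint: swap the diagonal entries, negate the off-diagonal ones
    A = np.array([[4., 7.], [2., 6.]])
    adj_A = np.array([[A[1, 1], -A[0, 1]],
                      [-A[1, 0], A[0, 0]]])
    det_A = np.linalg.det(A)                             # |A| = 4*6 - 7*2 = 10
    print(np.allclose(A @ adj_A, det_A * np.eye(2)))     # A(adj A) = |A| I
    print(np.allclose(np.linalg.inv(A), adj_A / det_A))  # B = adj A / |A| satisfies AB = I = BA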

Elementary row/column operations:

  • The following three operations can be applied to the rows or columns of a matrix:

1) Interchange of any two rows (columns)

2) Multiplying all elements of a row (column) of a matrix by a non-zero scalar. If the elements of ith row (column) are multiplied by non-zero scalar k, it will be denoted by Ri→Ri (k) [Ci→Ci (k)] or Ri→kRi [Ci→kCi].

3) Adding to the elements of a row (column), the corresponding elements of any other row (column) multiplied by any scalar k.

Rank of a matrix:

  • A number ‘r’ is called the rank of a matrix if:

1) Every square submatrix of order (r + 1) or more is singular, and

2) There exists at least one square submatrix of order r which is non-singular.

  • It also equals the number of non-zero rows in the row echelon form of the matrix.

  • The rank of the null matrix is not defined and the rank of every non null matrix is greater than or equal to 1.

  • Elementary transformations do not alter the rank of a matrix.
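
A minimal check of the last two points (NumPy's matrix_rank; entries made up):

    import numpy as np

    A = np.array([[1., 2., 3.],
                  [2., 4., 6.],    # row 2 = 2 * row 1, so the rows are dependent
                  [1., 0., 1.]])

    print(np.linalg.matrix_rank(A))   # 2

    # Elementary row operations leave the rank unchanged
    B = A.copy()
    B[1] = B[1] - 2 * B[0]            # R2 -> R2 - 2 R1
    print(np.linalg.matrix_rank(B))   # still 2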

Minors, Cofactors and Determinant:

  • The minor of the element at the ith row and jth column is the determinant obtained by deleting the ith row and the jth column.

  • The cofactor of this element is (-1)i+j (minor).

     

           Δ = | a1  b1  c1 |
               | a2  b2  c2 |  = a1A1 + b1B1 + c1C1,
               | a3  b3  c3 |

           where A1, B1 and C1 are the cofactors of a1, b1 and c1 respectively.

  • The determinant can be expanded along any row or column, i.e.

           Δ = a2A2 + b2B2 + c2C2 or Δ = a1A1 + a2A2 + a3A3 etc.

  • The following result holds true for determinants of any order:

           aiAj + biBj + ciCj = Δ  if i = j,

                              = 0  if i ≠ j.
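
A minimal sketch of minors, cofactors, expansion along any row or column, and the "alien cofactor" result above (NumPy; the `cofactor` helper is a hypothetical name, and the entries are made up):

    import numpy as np

    A = np.array([[1., 2., 3.],
                  [0., 4., 5.],
                  [1., 0., 6.]])

    def cofactor(M, i, j):
        # (-1)^(i+j) times the minor: delete row i and column j, take the determinant
        minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
        return (-1) ** (i + j) * np.linalg.det(minor)

    # Expansion along row 0 and along column 1 give the same value
    row0 = sum(A[0, j] * cofactor(A, 0, j) for j in range(3))
    col1 = sum(A[i, 1] * cofactor(A, i, 1) for i in range(3))
    print(row0, col1, np.linalg.det(A))    # all equal (det = 22)

    # Elements of one row with cofactors of another row sum to zero
    print(sum(A[0, j] * cofactor(A, 1, j) for j in range(3)))   # ~0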

Adjoint and Inverse of a matrix:

Let A = [aij] be a square matrix of order n and let Cij be cofactor of aij in A. Then the transpose of the matrix of cofactors of elements of A is called the adjoint of A and is denoted by adj A.

The inverse of A is given by A-1 = 1/|A| . adj A, provided |A| ≠ 0.

1) Every invertible matrix possesses a unique inverse.

2) If A and B are invertible matrices of the same order, then AB is invertible and (AB)-1 = B-1A-1. This is also termed as the reversal law.

3) In general, if A, B, C, ... are invertible matrices then (ABC....)-1 = ..... C-1 B-1 A-1.

4) If A is an invertible square matrix, then AT is also invertible and (AT)-1 = (A-1)T.

(5) If A is a non-singular square matrix of order n, then |adj A| = |A|n-1.

(6) If A and B are non-singular square matrices of the same order, then adj (AB) = (adj B) (adj A).

(7) If A is an invertible square matrix, then adj(AT) = (adj A)T.

(8) If A is a non-singular square matrix of order n, then adj(adj A) = |A|n-2A.
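
A minimal numerical check of properties (2), (4) and (5) (NumPy; made-up non-singular matrices, with adj A computed as |A|·A-1, which is valid only for non-singular A):

    import numpy as np

    A = np.array([[2., 1., 0.], [1., 3., 1.], [0., 1., 2.]])
    B = np.array([[1., 0., 1.], [2., 1., 0.], [0., 0., 3.]])

    inv, det = np.linalg.inv, np.linalg.det

    print(np.allclose(inv(A @ B), inv(B) @ inv(A)))     # reversal law
    print(np.allclose(inv(A.T), inv(A).T))              # (A^T)^-1 = (A^-1)^T

    adj = lambda M: inv(M) * det(M)                     # adj A = |A| A^-1 (A non-singular)
    n = A.shape[0]
    print(np.isclose(det(adj(A)), det(A) ** (n - 1)))   # |adj A| = |A|^(n-1)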

Important Properties:

  • If the rows be changed into columns and the columns into rows, the value of the determinant remains unaltered.

  • If any two rows (or columns) of a determinant are interchanged, the resulting determinant is the negative of the original determinant.

  • If two rows (or two columns) of a determinant have corresponding elements that are equal, the value of the determinant is zero.

  • If each element in a row (or column) of a determinant is written as the sum of two or more terms then the determinant can be written as the sum of two or more determinants.

  • If to each element of a line (row or column) of a determinant, some multiples of corresponding elements of one or more parallel lines are added, then the determinant remains unaltered.

  • If each element in any row (or any column) of a determinant is zero, then the value of the determinant is zero.

  • If a determinant D vanishes for x = a, then (x - a) is a factor of D. In other words, if two rows (or two columns) become identical for x = a, then (x - a) is a factor of D.

  • In general, if r rows (or r columns) become identical when a is substituted for x, then (x - a)r-1 is a factor of D.

  • The minor of an element of a determinant is again a determinant (of lesser order) formed by excluding the row and column of the element.

  • The determinant can be evaluated by multiplying the elements of a single row or a column with their respective co-factors and then adding them, i.e.

           Δ = ∑i=1 to m aij·Cij, for any one column j = 1, 2, ......, m

             = ∑j=1 to m aij·Cij, for any one row i = 1, 2, ......, m.
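
Several of these properties can be verified numerically (a minimal NumPy sketch with arbitrary made-up entries):

    import numpy as np

    A = np.array([[1., 4., 2.], [3., 0., 5.], [2., 1., 1.]])

    # Transposing leaves the determinant unchanged
    print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))   # True

    # Interchanging two rows flips the sign
    B = A[[1, 0, 2]]
    print(np.isclose(np.linalg.det(B), -np.linalg.det(A)))    # True

    # Adding a multiple of one row to another leaves the value unaltered
    C = A.copy()
    C[2] += 5 * C[0]
    print(np.isclose(np.linalg.det(C), np.linalg.det(A)))     # True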

  • Sarrus Rule: This is the rule for evaluating the determinant of order 3. The process is as given below:

  1. Write down the three rows of a determinant.

  2. Rewrite the first two rows below the third.

  3. The three diagonals sloping down to the right give the three positive terms and the three diagonals sloping down to the left give the three negative terms.

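A direct transcription of the rule (a minimal Python sketch; `sarrus` is a hypothetical helper name and the matrix is made up):

    import numpy as np

    def sarrus(m):
        # Determinant of a 3x3 matrix by the rule of Sarrus:
        # three "down-right" diagonals minus three "down-left" diagonals
        (a, b, c), (d, e, f), (g, h, i) = m
        return (a*e*i + d*h*c + g*b*f) - (c*e*g + f*h*a + i*b*d)

    A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 10]])
    print(sarrus(A), np.linalg.det(A))   # both -3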

Multiplication of two Determinants:

  1. Two determinants can be multiplied together only if they are of the same order.

  2. Take the first row of one determinant and multiply it successively with the 1st, 2nd and 3rd rows of the other determinant.

  3. The three expressions thus obtained become the elements of the 1st row of the resultant determinant. The elements of the 2nd and 3rd rows are obtained in the same manner.
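
In matrix terms, the row-by-row product described above is A·B' (each row of A dotted with each row of B), and its determinant equals det A · det B. A minimal check with made-up entries:

    import numpy as np

    A = np.array([[1., 2., 0.], [3., 1., 4.], [0., 2., 1.]])
    B = np.array([[2., 1., 1.], [0., 3., 1.], [1., 0., 2.]])

    rowbyrow = A @ B.T   # entry (i, k): row i of A dotted with row k of B
    print(np.isclose(np.linalg.det(rowbyrow),
                     np.linalg.det(A) * np.linalg.det(B)))   # True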

  • Symmetric determinant:

The elements situated at equal distances from the diagonal are equal both in magnitude and sign, e.g. Δ = |a h g; h b f; g f c|.

  • Skew-symmetric determinant:

In a skew-symmetric determinant, all the diagonal elements are zero and the elements situated at equal distances from the diagonal are equal in magnitude but opposite in sign. The value of a skew-symmetric determinant of odd order is zero.

  • If the elements of a determinant are in cyclic arrangement, such a determinant is termed as a Circulant determinant.

  • A system of equations AX = D is called a homogeneous system if D = O. Otherwise it is called a non-homogeneous system of equations.

  • If the system of equations has one or more solutions, then it is said to be a consistent system of equations, otherwise it is an inconsistent system of equations.

  • Let A be the co-efficient matrix of the linear system:    

    ax + by = e   &

    cx + dy = f.

    If det A ≠ 0, then the system has exactly one solution:

           x = (ed - bf)/(ad - bc),  y = (af - ec)/(ad - bc).
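
A quick numerical check of this formula with made-up values a = 2, b = 1, c = 1, d = 3, e = 5, f = 10 (a minimal NumPy sketch):

    import numpy as np

    a, b, c, d, e, f = 2., 1., 1., 3., 5., 10.
    det_A = a * d - b * c                    # 5

    x = (e * d - b * f) / det_A              # (15 - 10) / 5 = 1
    y = (a * f - e * c) / det_A              # (20 - 5) / 5 = 3
    print(x, y)
    print(np.linalg.solve([[a, b], [c, d]], [e, f]))   # [1. 3.]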

  • Let A be the co-efficient matrix of the linear system:    

          ax + by + cz = j,    

          dx + ey + fz = k, and   

          gx + hy + iz = l.

           If det A ≠ 0, then the system has exactly one solution:

           x = Δ1/Δ, y = Δ2/Δ, z = Δ3/Δ,

           where Δ = det A and Δ1, Δ2, Δ3 are the determinants obtained from Δ by replacing its first, second and third columns respectively by the column of constants (j, k, l).

We have the following two cases:

Case I. When Δ ≠ 0

In this case we have,

x = Δ1/Δ, y = Δ2/Δ, z = Δ3/Δ

Hence unique value of x, y, z will be obtained.

Case II: When Δ = 0

(a) When at least one of Δ1, Δ2 and Δ3 is non-zero, the system is inconsistent.

Let Δ1 ≠ 0. From Case I, Δ1 = xΔ; this cannot be satisfied for any value of x, because Δ = 0 while Δ1 ≠ 0, so no value of x is possible.

The argument is similar when Δ2 ≠ 0 (since Δ2 = yΔ) and when Δ3 ≠ 0.

(b) When Δ = 0 and Δ1 = Δ2 = Δ3 = 0, the relations

Δ1 = xΔ, Δ2 = yΔ and Δ3 = zΔ hold for all values of x, y and z. But then only two of x, y, z are independent and the third depends on the other two. Therefore, if Δ = Δ1 = Δ2 = Δ3 = 0, the system of equations is consistent and has infinitely many solutions (in degenerate cases it may still be inconsistent, which should be verified from the equations themselves).
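
A minimal sketch of Case I (Δ ≠ 0) with made-up coefficients, building Δ1, Δ2, Δ3 by column replacement and comparing with a direct solve:

    import numpy as np

    A = np.array([[2., 1., 1.], [1., 3., 2.], [1., 0., 0.]])
    D = np.array([4., 5., 6.])

    delta = np.linalg.det(A)
    if delta != 0:                        # Case I: unique solution
        sol = []
        for i in range(3):
            Ai = A.copy()
            Ai[:, i] = D                  # replace the i-th column by the constants
            sol.append(np.linalg.det(Ai) / delta)
        print(sol)                        # [6.0, 15.0, -23.0]
        print(np.linalg.solve(A, D))      # same answer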

  • If A is a non-singular matrix, then the system of equations given by AX = D has a unique solution given by X = A-1D.

  • If A is a singular matrix, and (adj A) D = O, then the system of equations given by AX = D may be consistent (with infinitely many solutions) or inconsistent.

  • If A is a singular matrix, and (adj A) D ≠ O, then the system of equations given by AX = D is inconsistent.

  • Let AX = O be a homogeneous system of n linear equations in n unknowns. If A is non-singular, the system has a unique solution, namely the trivial solution; if A is singular, the system has infinitely many solutions.
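
A minimal sketch of these cases using NumPy (the matrices are made up; the 2×2 adjoint is formed by the swap rule from earlier):

    import numpy as np

    A = np.array([[1., 2.], [2., 4.]])       # singular: |A| = 0
    D = np.array([3., 6.])                   # second equation = 2 * first

    adj_A = np.array([[A[1, 1], -A[0, 1]],   # 2x2 adjoint by the swap rule
                      [-A[1, 0], A[0, 0]]])
    print(np.linalg.det(A))    # 0.0 -> singular
    print(adj_A @ D)           # [0. 0.] -> (adj A)D = O: not inconsistent by this test

    A2 = np.array([[3., 1.], [1., 2.]])      # non-singular
    print(np.linalg.inv(A2) @ np.array([5., 5.]))   # unique solution X = A^-1 D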

  • Suppose we have the following system:

a11x1+a12x2+.... + a1nxn = b1

a21x1+a22x2+.... + a2nxn = b2

…....        ….....  …....   ….... 

am1x1+am2x2+.... + amnxn = bm

Then the system is consistent iff the coefficient matrix A and the augmented matrix (A|B) have the same rank. We then have the following cases:

Case 1: The system is consistent and m ≥ n

  1. If r(A) = r(A|B) = n, then the system has a unique solution.
  2. If r(A) = r(A|B) = k < n, then (n - k) unknowns can be assigned arbitrary values.

Case 2: The system is consistent and m < n

  1. If r(A) = r(A|B) = m, then (n - m) unknowns can be assigned arbitrary values.
  2. If r(A) = r(A|B) = k < m, then (n - k) unknowns can be assigned arbitrary values.

Note: here r denotes the rank.
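
A minimal sketch of the rank test (NumPy; made-up coefficients with m = 2 equations in n = 3 unknowns):

    import numpy as np

    A = np.array([[1., 1., 1.],
                  [2., 3., 1.]])            # m = 2 equations, n = 3 unknowns
    B = np.array([6., 11.])

    aug = np.column_stack([A, B])           # the augmented matrix (A|B)
    rA = np.linalg.matrix_rank(A)
    rAB = np.linalg.matrix_rank(aug)
    print(rA, rAB)                          # 2, 2 -> equal ranks, so consistent
    # r = m = 2 < n = 3, so n - m = 1 unknown can be assigned arbitrarily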

