Question
Using matrices, solve the following system of linear equations:
x+y−z=3;2x+3y+z=10;3x−y−7z=1.
Solution
First of all, write the equations given in the above problem in the form \(AX = B\), where X is the column matrix containing x, y and z, A is the \(3\times3\) matrix of the coefficients of x, y and z, and B is the column matrix containing the numbers 3, 10 and 1. After that, find the inverse of matrix A and pre-multiply both sides of \(AX = B\) by \(A^{-1}\). The inverse of matrix A is calculated using the formula \(A^{-1} = \dfrac{\text{adj}(A)}{|A|}\). The solutions of the above system of equations are then obtained from \(X = A^{-1}B\).
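If you have Python available, you can cross-check the whole method numerically before doing the hand computation. The snippet below is only an illustrative sketch using NumPy (it is not part of the required matrix working); it builds A and B and evaluates \(X = A^{-1}B\):

```python
import numpy as np

# Coefficient matrix A and constants column B for the system
# x + y - z = 3, 2x + 3y + z = 10, 3x - y - 7z = 1
A = np.array([[1, 1, -1],
              [2, 3, 1],
              [3, -1, -7]], dtype=float)
B = np.array([[3], [10], [1]], dtype=float)

# X = A^{-1} B; np.linalg.solve is the numerically preferred equivalent
X = np.linalg.inv(A) @ B
print(X.ravel())                       # solution column [x, y, z]
print(np.linalg.solve(A, B).ravel())   # same values, computed directly
```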
Complete step by step answer:
The system of equations given in the above problem is as follows:
x+y−z=3;2x+3y+z=10;3x−y−7z=1
We have to find the solutions of the above system of equations using matrices.
Let us reduce all the above equations to AX=B form in which A, X and B are shown below:
The matrix A is a \(3\times3\) matrix containing the coefficients of x, y and z; its first, second and third columns contain the coefficients of x, y and z respectively.
\[A = \begin{bmatrix} 1 & 1 & -1 \\ 2 & 3 & 1 \\ 3 & -1 & -7 \end{bmatrix}\]
The matrix X is a column matrix which contains three elements x, y and z.
\[X = \begin{bmatrix} x \\ y \\ z \end{bmatrix}\]
The matrix “B” is a column matrix containing the constants of the three equations such as 3, 10 and 1.
\[B = \begin{bmatrix} 3 \\ 10 \\ 1 \end{bmatrix}\]
Now, combine all the above elements A, X and B by substituting A, B and X in AX=B.
AX=B........ Eq. (1)
\[\begin{bmatrix} 1 & 1 & -1 \\ 2 & 3 & 1 \\ 3 & -1 & -7 \end{bmatrix}\begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 3 \\ 10 \\ 1 \end{bmatrix}\]
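As an optional check of this setup, a symbolic sketch with SymPy (not required for the method) confirms that multiplying A by the column of unknowns reproduces the left-hand sides of the three equations:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
A = sp.Matrix([[1, 1, -1],
               [2, 3, 1],
               [3, -1, -7]])
X = sp.Matrix([x, y, z])
B = sp.Matrix([3, 10, 1])

# A*X should reproduce the left-hand sides of the three equations
print(A * X)   # Matrix([[x + y - z], [2*x + 3*y + z], [3*x - y - 7*z]])
print(B)       # Matrix([[3], [10], [1]])
```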
Now, pre-multiplying both sides of eq. (1) by \(A^{-1}\) we get,
\(A^{-1}AX = A^{-1}B\)
We know that multiplying A by its inverse gives the identity matrix.
\(IX = A^{-1}B\)
Since multiplying X by the identity matrix leaves it unchanged, we can drop I and the equation becomes:
\(X = A^{-1}B\)
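As a quick numerical illustration of the step \(A^{-1}AX = IX = X\), the following NumPy sketch (assuming A is invertible, which we verify below from \(|A| \neq 0\)) checks that \(A^{-1}A\) is the identity matrix:

```python
import numpy as np

A = np.array([[1, 1, -1],
              [2, 3, 1],
              [3, -1, -7]], dtype=float)

# A^{-1} A should be (numerically) the 3x3 identity matrix I
product = np.linalg.inv(A) @ A
print(np.allclose(product, np.eye(3)))  # True
```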
We can find the inverse of A as follows:
\(A^{-1} = \dfrac{\text{adj}(A)}{|A|}\) ……… Eq. (2)
In the above formula, \(|A|\) and \(\text{adj}(A)\) are the determinant of A and the adjoint of A, so we will find both of them and then substitute their values in the above formula to get the inverse of A.
Finding ∣A∣ as follows:
\[A = \begin{bmatrix} 1 & 1 & -1 \\ 2 & 3 & 1 \\ 3 & -1 & -7 \end{bmatrix}\]
Taking determinant on both the sides we get,
\[|A| = \begin{vmatrix} 1 & 1 & -1 \\ 2 & 3 & 1 \\ 3 & -1 & -7 \end{vmatrix}\]
We are going to solve the above determinant by expanding the determinant along the first row.
\(|A| = 1\left(3(-7) - (-1)\right) - 1\left(2(-7) - 3\right) + (-1)\left(2(-1) - 3(3)\right)\)
\(\Rightarrow |A| = 1(-21 + 1) - 1(-14 - 3) - 1(-2 - 9)\)
\(\Rightarrow |A| = -20 - 1(-17) - 1(-11)\)
\(\Rightarrow |A| = -20 + 17 + 11\)
\(\Rightarrow |A| = 8\)
Hence, we have found the determinant of A as 8.
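The same first-row expansion can be spelled out in a few lines of Python as a sanity check; this is only an illustrative sketch, and np.linalg.det should agree with the hand computation:

```python
import numpy as np

A = np.array([[1, 1, -1],
              [2, 3, 1],
              [3, -1, -7]], dtype=float)

# Expand |A| along the first row: sum of a_1j * (-1)^(1+j) * M_1j
det = 0.0
for j in range(3):
    minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)  # drop row 1 and column j+1
    det += A[0, j] * (-1) ** j * np.linalg.det(minor)

print(int(round(det)))                 # 8
print(int(round(np.linalg.det(A))))    # 8, same value
```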
Now, we are going to find the adjoint of A.
The matrix A is equal to:
\[A = \begin{bmatrix} 1 & 1 & -1 \\ 2 & 3 & 1 \\ 3 & -1 & -7 \end{bmatrix}\]
To find the adjoint of any matrix, we first find its cofactor matrix and then take the transpose of that cofactor matrix.
The cofactor matrix of A contains the cofactor of each element, labelled by its row and column position (first row and first column, first row and second column, and so on). Below, we show how to compute one of these cofactors as an example.
The cofactor of the element lying at the \(i\)-th row and \(j\)-th column is given below:
\(C_{ij} = (-1)^{i+j} M_{ij}\) ……….. Eq. (3)
In the above formula, \(M_{ij}\) is the minor corresponding to the \((i, j)\) position. Now, we are going to find the minor for the element 2 in the following matrix A.
\[A = \begin{bmatrix} 1 & 1 & -1 \\ 2 & 3 & 1 \\ 3 & -1 & -7 \end{bmatrix}\]
To find the minor for 2, delete the row and the column which contain 2, that is, the second row and the first column of A.
The remaining elements form the following \(2\times2\) matrix:
\[\begin{bmatrix} 1 & -1 \\ -1 & -7 \end{bmatrix}\]
Now, multiplying 1 by \(-7\) and then subtracting from it the product of \(-1\) and \(-1\), we get,
\(1(-7) - (-1)(-1) = -7 - 1 = -8\)
Now, the \((i, j)\) position corresponding to this element 2 is \((2, 1)\), so substituting the values of i, j and the minor in eq. (3) we get,
\(C_{21} = (-1)^{2+1}(M_{21}) \Rightarrow C_{21} = -1(-8) \Rightarrow C_{21} = 8\)
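The minor-and-cofactor recipe of eq. (3) can also be written as two small helper functions; the names minor and cofactor below are our own illustrative choices, not standard library calls:

```python
import numpy as np

A = np.array([[1, 1, -1],
              [2, 3, 1],
              [3, -1, -7]], dtype=float)

def minor(M, i, j):
    """Determinant of M with row i and column j removed (1-based i, j)."""
    sub = np.delete(np.delete(M, i - 1, axis=0), j - 1, axis=1)
    return np.linalg.det(sub)

def cofactor(M, i, j):
    """C_ij = (-1)^(i+j) * M_ij, as in eq. (3)."""
    return (-1) ** (i + j) * minor(M, i, j)

print(int(round(minor(A, 2, 1))))     # -8, the minor of the element 2
print(int(round(cofactor(A, 2, 1))))  # 8, i.e. C_21
```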
Similarly, you can find the cofactors of all the elements of matrix A. Below, we show the calculation of the minors of the remaining elements:
Finding \(M_{11}\): delete the first row and the first column of A and take the determinant of the remaining \(2\times2\) matrix:
\((3)(-7) - (1)(-1) = -21 + 1 = -20\)
Hence, the value of \(M_{11}\) is \(-20\).
Below, we find the values of \(M_{12}, M_{13}, M_{21}, M_{22}, M_{23}, M_{31}, M_{32}, M_{33}\) in the same way:
\(M_{12} = 2(-7) - 1(3) = -14 - 3 = -17\)
\(M_{13} = 2(-1) - 3(3) = -2 - 9 = -11\)
\(M_{21} = 1(-7) - (-1)(-1) = -7 - 1 = -8\)
\(M_{22} = 1(-7) - (-1)(3) = -7 + 3 = -4\)
\(M_{23} = 1(-1) - 1(3) = -1 - 3 = -4\)
\(M_{31} = 1(1) - (-1)(3) = 1 + 3 = 4\)
\(M_{32} = 1(1) - (-1)(2) = 1 + 2 = 3\)
\(M_{33} = 1(3) - 1(2) = 3 - 2 = 1\)
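All nine minors can be generated with one loop using the same row-and-column deletion idea; the following sketch should reproduce the values listed above:

```python
import numpy as np

A = np.array([[1, 1, -1],
              [2, 3, 1],
              [3, -1, -7]], dtype=float)

# minors[i][j] is the determinant of A with row i+1 and column j+1 removed
minors = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
        minors[i, j] = np.linalg.det(sub)

print(np.round(minors).astype(int))
# [[-20 -17 -11]
#  [ -8  -4  -4]
#  [  4   3   1]]
```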
Now, you can substitute these values of the minors into the cofactor formula (eq. (3)) and find the values of the cofactors.
\(C_{11} = (-1)^{1+1}(-20) \Rightarrow C_{11} = -20\)
\(C_{12} = (-1)^{1+2}(-17) \Rightarrow C_{12} = 17\)
\(C_{13} = (-1)^{1+3}(-11) \Rightarrow C_{13} = -11\)
\(C_{21} = (-1)^{2+1}(-8) \Rightarrow C_{21} = 8\)
\(C_{22} = (-1)^{2+2}(-4) \Rightarrow C_{22} = -4\)
\(C_{23} = (-1)^{2+3}(-4) \Rightarrow C_{23} = 4\)
\(C_{31} = (-1)^{3+1}(4) \Rightarrow C_{31} = 4\)
\(C_{32} = (-1)^{3+2}(3) \Rightarrow C_{32} = -3\)
\(C_{33} = (-1)^{3+3}(1) \Rightarrow C_{33} = 1\)
Now, writing the cofactor matrix of A we get,
\[\begin{bmatrix} -20 & 17 & -11 \\ 8 & -4 & 4 \\ 4 & -3 & 1 \end{bmatrix}\]
Now, taking transpose of the above matrix we get,
\[\begin{bmatrix} -20 & 17 & -11 \\ 8 & -4 & 4 \\ 4 & -3 & 1 \end{bmatrix}^{T} = \begin{bmatrix} -20 & 8 & 4 \\ 17 & -4 & -3 \\ -11 & 4 & 1 \end{bmatrix}\]
Hence, we have got the adjoint of the matrix A as:
\[\text{adj}(A) = \begin{bmatrix} -20 & 8 & 4 \\ 17 & -4 & -3 \\ -11 & 4 & 1 \end{bmatrix}\]
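The cofactor matrix and its transpose (the adjoint) can likewise be computed programmatically; the sketch below applies the sign \((-1)^{i+j}\) to each minor and should match the adj(A) found above:

```python
import numpy as np

A = np.array([[1, 1, -1],
              [2, 3, 1],
              [3, -1, -7]], dtype=float)

# Build the cofactor matrix: cof[i][j] = (-1)^(i+j) * minor at (i+1, j+1)
cof = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
        cof[i, j] = (-1) ** (i + j) * np.linalg.det(sub)

adj = cof.T  # adjoint = transpose of the cofactor matrix
print(np.round(adj).astype(int))
# [[-20   8   4]
#  [ 17  -4  -3]
#  [-11   4   1]]

# Cross-check: adj(A) should equal |A| * A^{-1}
print(np.allclose(adj, np.linalg.det(A) * np.linalg.inv(A)))  # True
```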
Now, substituting the value of determinant of A and adjoint of A in eq. (2) we get,
\(A^{-1} = \dfrac{\text{adj}(A)}{|A|}\)
\[A^{-1} = \dfrac{1}{8}\begin{bmatrix} -20 & 8 & 4 \\ 17 & -4 & -3 \\ -11 & 4 & 1 \end{bmatrix}\]
Now, multiplying every element of the above matrix by \(\dfrac{1}{8}\) we get,