
Question


Let $\omega \ne 1$ be a cube root of unity and S be the set of all non-singular matrices of the form
$$\left( {\begin{array}{*{20}{c}} 1&a&b \\ \omega &1&c \\ {{\omega ^2}}&\omega &1 \end{array}} \right),$$
where each of a, b, and c is either $\omega$ or ${\omega ^2}$. Then, the number of distinct matrices in the set S is
A) 2
B) 6
C) 4
D) 8


Solution

A root of unity is a complex number that, when raised to a positive integer power, results in 1. Roots of unity have connections to many areas of mathematics, including the geometry of regular polygons, group theory, and number theory.
Unity means 1. The symbol $\omega$ denotes a complex cube root of '1' (with $\omega \ne 1$).
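For reference (a standard fact, added here because it is used throughout the solution), the three cube roots of 1 can be written explicitly as
$$1,\quad \omega = \frac{ - 1 + i\sqrt 3 }{2},\quad {\omega ^2} = \frac{ - 1 - i\sqrt 3 }{2},$$
and they satisfy the two identities used repeatedly below:
$${\omega ^3} = 1, \qquad 1 + \omega + {\omega ^2} = 0.$$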

Complete step-by-step answer:
Since the given matrix $\left( {\begin{array}{*{20}{c}} 1&a&b \\ \omega &1&c \\ {{\omega ^2}}&\omega &1 \end{array}} \right)$ is non-singular, its determinant must be non-zero:
$$\Delta = \left| {\begin{array}{*{20}{c}} 1&a&b \\ \omega &1&c \\ {{\omega ^2}}&\omega &1 \end{array}} \right| \ne 0$$
Here are the steps for computing the determinant by cofactor expansion.
Pick any row or column of the matrix. It does not matter which row or column you use; the answer will be the same for every choice, though some rows or columns make the arithmetic easier than others.
Multiply every element in that row or column by its cofactor and add the results. The sum is the determinant.
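To make the rule concrete, here is a minimal Python sketch of cofactor expansion along the first row of a 3×3 matrix (our own illustration; helper names like det3 are hypothetical, not from the solution):

```python
# Cofactor expansion of a 3x3 determinant along the first row.
# Illustrative sketch only; det2, minor, det3 are made-up helper names.

def det2(m):
    # Determinant of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c.
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def minor(m, i, j):
    # The 2x2 matrix left after deleting row i and column j of a 3x3 matrix.
    return [[m[r][s] for s in range(3) if s != j] for r in range(3) if r != i]

def det3(m):
    # Sum of element times cofactor across row 0; the sign alternates as (-1)**j.
    return sum((-1) ** j * m[0][j] * det2(minor(m, 0, j)) for j in range(3))

w = complex(-0.5, 3**0.5 / 2)  # a primitive cube root of unity
print(det3([[1, w, w], [w, 1, w], [w * w, w, 1]]))  # non-zero: non-singular
```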
Let's expand our matrix along the first row.

$$ \Rightarrow 1\left| {\begin{array}{*{20}{c}} 1&c \\ \omega &1 \end{array}} \right| - a\left| {\begin{array}{*{20}{c}} \omega &c \\ {{\omega ^2}}&1 \end{array}} \right| + b\left| {\begin{array}{*{20}{c}} \omega &1 \\ {{\omega ^2}}&\omega \end{array}} \right| \ne 0$$
$$ \Rightarrow 1(1 - \omega c) - a(\omega - {\omega ^2}c) + b({\omega ^2} - {\omega ^2}) \ne 0$$
Open the brackets and simplify.
$$ \Rightarrow 1 - \omega c - a\omega + a{\omega ^2}c + b(0) \ne 0$$
Taking $\omega$ common from the second and third terms,
$$ \Rightarrow 1 - \omega (a + c) + {\omega ^2}ac \ne 0$$
Notice that $b$ has dropped out, so the determinant does not depend on $b$. Since each of $a$ and $c$ is either $\omega$ or ${\omega ^2}$ (as given in the question), we test the four possible pairs $(a, c)$ against this condition, using ${\omega ^3} = 1$ and the fact that the sum of the three cube roots of unity is zero, i.e., $1 + \omega + {\omega ^2} = 0$.
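As a quick aside (our own addition, not part of the original solution), this simplification, and the fact that $b$ cancels, can be confirmed symbolically with SymPy:

```python
# Symbolically confirm that the 3x3 determinant expands to
# 1 - w*(a + c) + w**2 * a * c, with b absent. An illustrative aside.
import sympy as sp

a, b, c, w = sp.symbols('a b c w')
M = sp.Matrix([[1,    a, b],
               [w,    1, c],
               [w**2, w, 1]])
expected = 1 - w*(a + c) + w**2 * a * c
assert sp.expand(M.det() - expected) == 0  # identical as polynomials
```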

If $a = {\omega ^2}$ and $c = {\omega ^2}$, then $1 - \omega (a + c) + {\omega ^2}ac \ne 0$ becomes
$$ \Rightarrow 1 - \omega ({\omega ^2} + {\omega ^2}) + {\omega ^2} \cdot {\omega ^2} \cdot {\omega ^2} \ne 0$$
$$ \Rightarrow 1 - 2{\omega ^3} + {\omega ^6} \ne 0$$

As we know ${\omega ^3} = 1$ and ${\omega ^6} = 1$, this gives $1 - 2 + 1 = 0$, so the condition $\Delta \ne 0$ fails: this choice is not possible.

Similarly, if $a = {\omega ^2}$ and $c = \omega$, then $1 - \omega (a + c) + {\omega ^2}ac \ne 0$ becomes

$$ \Rightarrow 1 - \omega ({\omega ^2} + \omega ) + {\omega ^2} \cdot {\omega ^2} \cdot \omega \ne 0$$
$$ \Rightarrow 1 - {\omega ^2} - {\omega ^3} + {\omega ^5} \ne 0$$

As we know ${\omega ^3} = 1$ and ${\omega ^5} = {\omega ^3} \cdot {\omega ^2} = {\omega ^2}$, this gives $1 - {\omega ^2} - 1 + {\omega ^2} = 0$, so this choice is not possible either.

Similarly, if $a = \omega$ and $c = {\omega ^2}$, then $1 - \omega (a + c) + {\omega ^2}ac \ne 0$ becomes

$$ \Rightarrow 1 - \omega (\omega + {\omega ^2}) + {\omega ^2} \cdot \omega \cdot {\omega ^2} \ne 0$$
$$ \Rightarrow 1 - {\omega ^2} - {\omega ^3} + {\omega ^5} \ne 0$$

Again ${\omega ^3} = 1$ and ${\omega ^5} = {\omega ^2}$, so this gives $1 - {\omega ^2} - 1 + {\omega ^2} = 0$, and this choice is not possible as well.

At last, if $a = \omega$ and $c = \omega$, then $1 - \omega (a + c) + {\omega ^2}ac \ne 0$ becomes

$$ \Rightarrow 1 - \omega (\omega + \omega ) + {\omega ^2} \cdot \omega \cdot \omega \ne 0$$
$$ \Rightarrow 1 - 2{\omega ^2} + {\omega ^4} \ne 0$$

As we know ${\omega ^4} = {\omega ^3} \cdot \omega = \omega$, and $1 + \omega = - {\omega ^2}$ (from $1 + \omega + {\omega ^2} = 0$), this gives $1 + \omega - 2{\omega ^2} = - {\omega ^2} - 2{\omega ^2} = - 3{\omega ^2} \ne 0$. So this choice is possible: the matrix is non-singular exactly when $a = \omega$ and $c = \omega$.

Since the determinant value is independent of $b$, $b$ can be $\omega$ or ${\omega ^2}$. Hence $c = \omega$, $a = \omega$, and $b = \omega$ or ${\omega ^2}$: $a$ and $c$ can each take only one value, namely $\omega$, while $b$ can take 2 values, $\omega$ or ${\omega ^2}$.

Therefore, the total number of distinct matrices is $1 \times 1 \times 2 = 2$.

**So, option (A) is the correct answer.**

**Note:** Be very careful when substituting the values into the right places in the cofactor expansion; common errors occur when students become careless during this initial substitution step. In addition, take your time to make sure your arithmetic with ${\omega ^3} = 1$ and $1 + \omega + {\omega ^2} = 0$ is correct; otherwise, a single error somewhere in the calculation will yield a wrong answer in the end.
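As an optional cross-check, here is a short Python sketch (our own addition, not part of the original solution; the variable names are ours) that enumerates all eight matrices and counts the non-singular ones numerically:

```python
# Enumerate every choice of a, b, c in {w, w^2} and count non-singular matrices.
# This is an illustrative cross-check, not part of the original solution.
import itertools
import numpy as np

w = np.exp(2j * np.pi / 3)  # a primitive cube root of unity, so w != 1
choices = [w, w**2]

count = 0
for a, b, c in itertools.product(choices, repeat=3):
    M = np.array([[1,    a, b],
                  [w,    1, c],
                  [w**2, w, 1]])
    if abs(np.linalg.det(M)) > 1e-9:  # non-singular up to floating-point noise
        count += 1

print(count)  # prints 2, matching option (A)
```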