Question

If $a$, $b$ and $c$ are the cube roots of unity, then
$$\left| \begin{matrix} e^{a} & e^{2a} & e^{3a} \\ e^{b} & e^{2b} & e^{3b} \\ e^{c} & e^{2c} & e^{3c} \end{matrix} \right| - \left| \begin{matrix} e^{a} & e^{2a} & 1 \\ e^{b} & e^{2b} & 1 \\ e^{c} & e^{2c} & 1 \end{matrix} \right| =$$

1. $0$
2. $e$
3. $e^{2}$
4. $e^{3}$
Solution
Here $a$, $b$ and $c$ are the cube roots of unity, i.e. $1$, $\omega$ and $\omega^{2}$, whose sum is zero: $a+b+c = 1+\omega+\omega^{2} = 0$. In the first step we take the common factors $e^{a}$, $e^{b}$ and $e^{c}$ out of the rows of the first determinant:
$$\left| \begin{matrix} e^{a} & e^{2a} & e^{3a} \\ e^{b} & e^{2b} & e^{3b} \\ e^{c} & e^{2c} & e^{3c} \end{matrix} \right| = e^{a+b+c}\left| \begin{matrix} 1 & e^{a} & e^{2a} \\ 1 & e^{b} & e^{2b} \\ 1 & e^{c} & e^{2c} \end{matrix} \right| = \left| \begin{matrix} 1 & e^{a} & e^{2a} \\ 1 & e^{b} & e^{2b} \\ 1 & e^{c} & e^{2c} \end{matrix} \right|,$$
since $e^{a+b+c} = e^{0} = 1$. By the property of determinants that interchanging columns twice leaves the value unchanged, moving the column of $1$s in the second determinant from the third position to the first shows that the two determinants are equal, so their difference is $0$ (option 1).
Complete step by step answer:
Let $\Delta_{1}$ and $\Delta_{2}$ denote the first and second determinants, so the required value is $\Delta_{1}-\Delta_{2}$.
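The argument above can be sanity-checked numerically. The sketch below (not part of the original solution; it assumes NumPy is available) builds both determinants with $a=1$, $b=\omega$, $c=\omega^{2}$ and confirms that their difference is zero up to floating-point error:

```python
import numpy as np

# The three cube roots of unity: 1, ω, ω², where ω = e^(2πi/3)
w = np.exp(2j * np.pi / 3)
roots = [1, w, w**2]

# First determinant: rows (e^r, e^{2r}, e^{3r}) for r in {1, ω, ω²}
D1 = np.linalg.det(np.array([[np.exp(r), np.exp(2 * r), np.exp(3 * r)]
                             for r in roots]))

# Second determinant: rows (e^r, e^{2r}, 1)
D2 = np.linalg.det(np.array([[np.exp(r), np.exp(2 * r), 1]
                             for r in roots]))

diff = D1 - D2
print(abs(diff))  # should be ~0 (up to floating-point rounding)
```

Any labelling of the roots works, since permuting the rows changes both determinants by the same sign.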