
If \[A = \frac{1}{3}\begin{bmatrix}1 & 1 & 2 \\ 2 & 1 & - 2 \\ x & 2 & y\end{bmatrix}\] is Orthogonal, Then x + y = - Mathematics


Question

If \[A = \frac{1}{3}\begin{bmatrix}1 & 1 & 2 \\ 2 & 1 & - 2 \\ x & 2 & y\end{bmatrix}\] is orthogonal, then x + y =

(a) 3
(b) 0
(c) − 3
(d) 1


Solution

None of these

\[\text{ We have, }A = \frac{1}{3}\begin{bmatrix}1 & 1 & 2 \\ 2 & 1 & - 2 \\ x & 2 & y\end{bmatrix}\]
\[ \Rightarrow A^T = \frac{1}{3}\begin{bmatrix}1 & 2 & x \\ 1 & 1 & 2 \\ 2 & - 2 & y\end{bmatrix}\]
\[\text{Now, } A^T A = I \Rightarrow \left( 3A \right)^T \left( 3A \right) = 9I\]
\[ \Rightarrow \begin{bmatrix}x^2 + 5 & 2x + 3 & xy - 2 \\ 3 + 2x & 6 & 2y \\ xy - 2 & 2y & y^2 + 8\end{bmatrix} = \begin{bmatrix}9 & 0 & 0 \\ 0 & 9 & 0 \\ 0 & 0 & 9\end{bmatrix}\]
Comparing the corresponding elements, the (2, 2) entry requires 6 = 9, which is impossible for any values of x and y.
Thus, the matrix A cannot be orthogonal, and the answer is "None of these".
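The contradiction can also be checked numerically. A minimal sketch in Python (standard library only); the values chosen for x and y below are arbitrary illustrations, since the decisive (2, 2) entry of MᵀM does not depend on them:

```python
from fractions import Fraction as F

def mat_t_mat(m):
    """Compute M^T M for a square matrix given as a list of rows."""
    cols = list(zip(*m))
    return [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]

# M = 3A; x and y are sample values for illustration, not a claimed solution.
x, y = F(-2), F(-1)
M = [[F(1), F(1), F(2)],
     [F(2), F(1), F(-2)],
     [x,    F(2), y]]

MtM = mat_t_mat(M)
# Orthogonality of A = M/3 requires M^T M = 9I, but the (2, 2) entry of
# M^T M is 1^2 + 1^2 + 2^2 = 6 for every choice of x and y.
print(MtM[1][1])       # 6
print(MtM[1][1] == 9)  # False: A cannot be orthogonal
```

Changing x and y leaves the printed entry at 6, confirming that no choice of x and y satisfies AᵀA = I.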

Chapter 7: Adjoint and Inverse of a Matrix - Exercise 7.4 [Page 38]

APPEARS IN

RD Sharma Mathematics [English] Class 12
Chapter 7 Adjoint and Inverse of a Matrix
Exercise 7.4 | Q 25 | Page 38

RELATED QUESTIONS

Find the adjoint of the matrices.

`[(1,2),(3,4)]`


Find the adjoint of the matrices.

`[(1,-1,2),(2,3,5),(-2,0,1)]`


Find the inverse of the matrices (if it exists).

`[(1,-1,2),(0,2,-3),(3,-2,4)]`


Find the inverse of the matrices (if it exists).

`[(1,0,0),(0, cos alpha, sin alpha),(0, sin alpha, -cos alpha)]`


For the matrix A = `[(1,1,1),(1,2,-3),(2,-1,3)]` show that A³ − 6A² + 5A + 11I = O. Hence, find A⁻¹.


Compute the adjoint of the following matrix:

\[\begin{bmatrix}2 & - 1 & 3 \\ 4 & 2 & 5 \\ 0 & 4 & - 1\end{bmatrix}\]

Verify that (adj A) A = |A| I = A (adj A) for the above matrix.


Find A (adj A) for the matrix  \[A = \begin{bmatrix}1 & - 2 & 3 \\ 0 & 2 & - 1 \\ - 4 & 5 & 2\end{bmatrix} .\]


Find the inverse of the following matrix:

\[\begin{bmatrix}a & b \\ c & \frac{1 + bc}{a}\end{bmatrix}\]

Find the inverse of the following matrix:

\[\begin{bmatrix}2 & 5 \\ - 3 & 1\end{bmatrix}\]

Find the inverse of the following matrix.

\[\begin{bmatrix}1 & 2 & 5 \\ 1 & - 1 & - 1 \\ 2 & 3 & - 1\end{bmatrix}\]

Find the inverse of the following matrix.

\[\begin{bmatrix}1 & 0 & 0 \\ 0 & \cos \alpha & \sin \alpha \\ 0 & \sin \alpha & - \cos \alpha\end{bmatrix}\]

For the following pair of matrix verify that \[\left( AB \right)^{- 1} = B^{- 1} A^{- 1} :\]

\[A = \begin{bmatrix}3 & 2 \\ 7 & 5\end{bmatrix}\text{ and }B = \begin{bmatrix}4 & 6 \\ 3 & 2\end{bmatrix}\]


Given \[A = \begin{bmatrix}2 & - 3 \\ - 4 & 7\end{bmatrix}\], compute A⁻¹ and show that \[2 A^{- 1} = 9I - A .\]


Let
\[F \left( \alpha \right) = \begin{bmatrix}\cos \alpha & - \sin \alpha & 0 \\ \sin \alpha & \cos \alpha & 0 \\ 0 & 0 & 1\end{bmatrix}\text{ and }G\left( \beta \right) = \begin{bmatrix}\cos \beta & 0 & \sin \beta \\ 0 & 1 & 0 \\ - \sin \beta & 0 & \cos \beta\end{bmatrix}\]

Show that

(i) \[\left[ F \left( \alpha \right) \right]^{- 1} = F \left( - \alpha \right)\]
(ii) \[\left[ G \left( \beta \right) \right]^{- 1} = G \left( - \beta \right)\]
(iii) \[\left[ F \left( \alpha \right)G \left( \beta \right) \right]^{- 1} = G \left( - \beta \right)F \left( - \alpha \right)\]

Show that \[A = \begin{bmatrix}6 & 5 \\ 7 & 6\end{bmatrix}\] satisfies the equation \[x^2 - 12x + 1 = O\]. Thus, find A⁻¹.


For the matrix \[A = \begin{bmatrix}1 & 1 & 1 \\ 1 & 2 & - 3 \\ 2 & - 1 & 3\end{bmatrix}\], show that

\[A^3 - 6 A^2 + 5A + 11 I_3 = O\]. Hence, find A⁻¹.

Find the matrix X satisfying the equation 

\[\begin{bmatrix}2 & 1 \\ 5 & 3\end{bmatrix} X \begin{bmatrix}5 & 3 \\ 3 & 2\end{bmatrix} = \begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix} .\]

Find the inverse by using elementary row transformations:

\[\begin{bmatrix}7 & 1 \\ 4 & - 3\end{bmatrix}\]


Find the inverse by using elementary row transformations:

\[\begin{bmatrix}5 & 2 \\ 2 & 1\end{bmatrix}\]


Find the inverse by using elementary row transformations:

\[\begin{bmatrix}3 & 10 \\ 2 & 7\end{bmatrix}\]


Find the inverse by using elementary row transformations:

\[\begin{bmatrix}- 1 & 1 & 2 \\ 1 & 2 & 3 \\ 3 & 1 & 1\end{bmatrix}\]


If \[A = \begin{bmatrix}2 & 3 \\ 5 & - 2\end{bmatrix}\] , write  \[A^{- 1}\] in terms of A.


If A is an invertible matrix of order 3, then which of the following is not true ?


If A is a singular matrix, then adj A is ______.


If A, B are two n × n non-singular matrices, then __________ .


If B is a non-singular matrix and A is a square matrix, then det (B−1 AB) is equal to ___________ .


For any 2 × 2 matrix, if \[A \left( adj A \right) = \begin{bmatrix}10 & 0 \\ 0 & 10\end{bmatrix}\] , then |A| is equal to ______ .


If \[\begin{bmatrix}1 & - \tan \theta \\ \tan \theta & 1\end{bmatrix} \begin{bmatrix}1 & \tan \theta \\ - \tan \theta & 1\end{bmatrix}^{- 1} = \begin{bmatrix}a & - b \\ b & a\end{bmatrix}\], then _______________ .


If A = `[(x, 5, 2),(2, y, 3),(1, 1, z)]`, xyz = 80, 3x + 2y + 10z = 20, then A · adj A = `[(81, 0, 0),(0, 81, 0),(0, 0, 81)]`


Find x, if `[(1,2,"x"),(1,1,1),(2,1,-1)]` is singular


For matrix A = `[(2,5),(-11,7)]` (adj A)' is equal to:


If A = `[(0, 1),(0, 0)]`, then A²⁰²³ is equal to ______.


If for a square matrix A, A² – A + I = O, then A⁻¹ equals ______.


Given that A is a square matrix of order 3 and |A| = –2, then |adj(2A)| is equal to ______.

