Some Interesting Linear Algebra Problems
In this blog post, I collect some theorems, results, and problems in linear algebra that I find interesting. It's mainly for my own reference and for fun! I'll update them and the related proofs frequently ;).
- For a given matrix $A \in \mathbb{C}^{m\times n}$, its row rank equals its column rank. There are several ways of showing this relation.
Reduced form: Since elementary row/column operations do not change the rank of a matrix, we can apply them to $A$ to get the reduced form: \begin{equation} PAQ = \begin{bmatrix} \boldsymbol{I}_{r\times r} &\\ & {\mathbf{0}}_{(m-r) \times (n-r)} \end{bmatrix}, \end{equation} where $P$ and $Q$ encode the row and column operations respectively (both invertible). We can immediately see that the row and column ranks of $A$ equal those of $PAQ$, and hence those of $\boldsymbol{I}_{r \times r}$, which are both $r$.
Row/Column Space: Assume the column rank of $A$ is $r$, and let $c_1, c_2, ..., c_r$ be a basis for the column space of $A$. We can then write $A = C_{m\times r}R_{r \times n}$, where $C = [c_1, c_2, ..., c_r]$ and $R$ contains the coefficients when expressing each column of $A$ in this basis. On the other hand, $A=CR$ also implies that each row of $A$ is a linear combination of the rows of $R$. Therefore, the row rank of $A$ cannot exceed the row rank of $R$, which is at most $r$; that is, the row rank is less than or equal to the column rank. Applying the same argument to $A^T$ gives the reverse inequality, which completes the proof that the row rank equals the column rank.
- For a square upper-triangular matrix, its inverse is also upper-triangular. A similar result holds for lower-triangular matrices.
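Both the rank equality and the triangular-inverse fact above are easy to sanity-check numerically. A minimal numpy sketch (the test matrices are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Row rank equals column rank: rank(A) == rank(A^T).
A = rng.standard_normal((6, 4)) @ rng.standard_normal((4, 9))  # rank 4 (generically)
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)

# The inverse of an upper-triangular matrix is again upper-triangular.
U = np.triu(rng.standard_normal((5, 5))) + 5.0 * np.eye(5)  # keep it well-conditioned
U_inv = np.linalg.inv(U)
assert np.allclose(U_inv, np.triu(U_inv))
```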
- The nonzero singular values of $A$ are the square roots of the nonzero eigenvalues of $A^TA$ or $AA^T$.
- If $A = A^T$, then the singular values of $A$ are the absolute values of the eigenvalues of $A$.
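Both statements about singular values are easy to verify numerically. A minimal numpy sketch (random matrices of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Singular values of A vs. square roots of the eigenvalues of A^T A.
sv = np.linalg.svd(A, compute_uv=False)           # descending order
ev = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # reorder to match
assert np.allclose(sv, np.sqrt(np.clip(ev, 0.0, None)))

# For symmetric A, the singular values are the absolute eigenvalues.
S = A[:3, :] + A[:3, :].T                         # build a symmetric matrix
assert np.allclose(np.linalg.svd(S, compute_uv=False),
                   np.sort(np.abs(np.linalg.eigvalsh(S)))[::-1])
```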
- If $P$ is an orthogonal projector, then $I-2P$ is unitary.
- Any nonzero projector $P$ (i.e., $P^2 = P$) has a 2-norm greater than or equal to 1, and the equality holds exactly when $P$ is an orthogonal projector.
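Here is a small numpy sketch of both projector facts; the oblique projector formula $P = A(B^TA)^{-1}B^T$ (projection onto $\mathrm{range}(A)$ along $\mathrm{null}(B^T)$) is a standard construction I use purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthogonal projector onto a random 2-D subspace of R^5.
Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))
P = Q @ Q.T
assert np.allclose(P @ P, P)               # idempotent
U = np.eye(5) - 2.0 * P                    # I - 2P reflects range(P)
assert np.allclose(U.T @ U, np.eye(5))     # unitary (here: orthogonal)

# An oblique projector is idempotent but generically has 2-norm > 1.
A, B = rng.standard_normal((5, 2)), rng.standard_normal((5, 2))
P_obl = A @ np.linalg.inv(B.T @ A) @ B.T
assert np.allclose(P_obl @ P_obl, P_obl)
print(np.linalg.norm(P, 2), np.linalg.norm(P_obl, 2))  # ~1.0 vs. > 1
```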
- Hadamard's inequality: Let $A$ be an $m\times m$ matrix and let $a_j$ be its $j^{\text{th}}$ column; then
\begin{equation*} |\det A| \le \prod_{j=1}^m \|a_j\|_2. \end{equation*} Proof: If $A$ is not full-rank, then $\det A = 0$ and the inequality is trivial. If $A$ is full-rank, the inequality can be shown via the $QR$ decomposition of $A$, \begin{equation} A = QR = \left[ \begin{array}{c|c|c|c|c} & & & & \\ q_1 & q_2 & q_3 & \cdots & q_m\\ & & & & \end{array}\right] \left[ \begin{array}{ccccc} r_{11} & r_{12} & r_{13} & \cdots & r_{1m}\\ & r_{22} & r_{23} & \cdots & r_{2m}\\ & & r_{33} & \cdots & r_{3m}\\ & & & \ddots & \vdots\\ & & & & r_{mm} \end{array} \right]. \end{equation} Since $Q$ is unitary, $|\det{A}| = |\det{R}| = \prod |r_{ii}|$. Choosing the convention $r_{ii} > 0$, the diagonal entries of $R$ are uniquely determined by \begin{equation} r_{ii} = \left\| a_i - \sum_{j=1}^{i-1} (q_j^Ta_i) q_j \right \|_2 = \|a_i - b_i \|_2, \end{equation} where $b_i = \sum_{j=1}^{i-1} (q_j^T a_i) q_j$. Note that $a_i-b_i \perp b_i$.
Hence, for each $i$, we have \begin{equation} r_{ii}^2 = \|a_i - b_i\|_2^2 \le \|a_i - b_i\|^2_2 + \|b_i\|^2_2 = \|a_i\|_2^2 \Rightarrow r_{ii} \le \|a_i\|_2, \end{equation} where $\|a_i - b_i\|_2^2 + \|b_i\|_2^2 = \|a_i\|_2^2$ follows from $a_i - b_i \perp b_i$ (Pythagoras), and equality holds only when $b_i = 0$, i.e., when $a_i \perp q_j$ for all $j<i$. Taking the product over all $i$ completes the proof of the original inequality; equality in Hadamard's inequality holds exactly when the columns of $A$ are mutually orthogonal. A numerical check follows below.
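A minimal numpy check of the inequality and its equality case (the random matrix is an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))

# |det A| <= product of the column 2-norms.
assert abs(np.linalg.det(A)) <= np.prod(np.linalg.norm(A, axis=0))

# Equality when the columns are mutually orthogonal (here: orthonormal).
Q, _ = np.linalg.qr(A)
assert np.isclose(abs(np.linalg.det(Q)),
                  np.prod(np.linalg.norm(Q, axis=0)))
```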
For a Hermitian positive-definite matrix $A$, Hadamard's inequality can be tightened to \begin{equation} \det A \le \prod_{i=1}^m a_{ii}. \end{equation} This can be proved via the Cholesky decomposition of $A$.
- Householder reflections preserve orthogonality better (numerically) than Gram-Schmidt orthogonalization; MATLAB, I believe, uses Householder reflections in its QR factorization. See the sketch below.
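To see the difference, the sketch below runs classical Gram-Schmidt against numpy's LAPACK-backed `np.linalg.qr` (which uses Householder reflections) on an ill-conditioned matrix of my own construction, and compares the loss of orthogonality $\|Q^TQ - I\|$:

```python
import numpy as np

def classical_gram_schmidt(A):
    """Classical Gram-Schmidt; numerically fragile when A is ill-conditioned."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        v = A[:, j] - Q[:, :j] @ (Q[:, :j].T @ A[:, j])
        Q[:, j] = v / np.linalg.norm(v)
    return Q

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((50, 50)))
V, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = U @ np.diag(np.logspace(0, -12, 50)) @ V   # condition number ~1e12

Q_gs = classical_gram_schmidt(A)
Q_hh, _ = np.linalg.qr(A)                      # Householder-based (LAPACK)
loss = lambda Q: np.linalg.norm(Q.T @ Q - np.eye(50))
print(f"Gram-Schmidt:   {loss(Q_gs):.2e}")     # large
print(f"Householder QR: {loss(Q_hh):.2e}")     # ~machine precision
```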
- Gerschgorin's theorem: every eigenvalue of $A \in \mathbb{C}^{m\times m}$ lies in at least one of the discs $\{z \in \mathbb{C} : |z - a_{ii}| \le \sum_{j\neq i} |a_{ij}|\}$.
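A minimal numpy check that every eigenvalue lies in the union of the Gerschgorin discs:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))

centers = np.diag(A)
radii = np.abs(A).sum(axis=1) - np.abs(centers)   # off-diagonal row sums
for lam in np.linalg.eigvals(A):
    assert np.any(np.abs(lam - centers) <= radii + 1e-12)
```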
- Bauer-Fike theorem: if $A = V\Lambda V^{-1}$ is diagonalizable and $\mu$ is an eigenvalue of the perturbed matrix $A + \delta A$, then $\min_{\lambda_i} |\mu - \lambda_i| \le \kappa(V)\,\|\delta A\|$ in any induced $p$-norm.
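And a corresponding numpy check of the Bauer-Fike bound on a randomly perturbed (generically diagonalizable) matrix of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))            # generically diagonalizable
E = 1e-6 * rng.standard_normal((6, 6))     # small perturbation

lam, V = np.linalg.eig(A)                  # A = V diag(lam) V^{-1}
bound = np.linalg.cond(V, 2) * np.linalg.norm(E, 2)
for mu in np.linalg.eigvals(A + E):
    assert np.min(np.abs(mu - lam)) <= bound
```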