The applied side of approximation theory is like black magic to me: even if I don’t want to do research in the area, at least I respect it enough to find it interesting. So, I recommend bit-player’s post on Nick Trefethen’s Chebfun package. Especially check out the links (Trefethen’s an engaging author).
Verify these matrix properties (easy and fun)
I’m looking at products like \(\mat{G}^t \mat{A} \mat{G}\) where the columns of \(\mat{G}\) are (nonisotropic) normal vectors. Specifically, I’d like to know the distribution of the eigenvalues/singular values of this product. Surprisingly, I was unable to find any results in the literature on this, so I started reading Gupta and Nagar’s Matrix Variate Distributions to see if I could work it out using Jacobian magic.
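For concreteness, here’s a minimal numerical sketch of the setup I have in mind (the dimensions, the column covariance \(\mat{\Sigma}\), and the number of draws are arbitrary choices for illustration, not part of any result):

```python
import numpy as np

rng = np.random.default_rng(0)

p, k = 6, 3                          # G is p x k; sizes chosen arbitrarily
A = rng.standard_normal((p, p))
A = A @ A.T + p * np.eye(p)          # a positive definite A

Sigma = rng.standard_normal((p, p))
Sigma = Sigma @ Sigma.T + np.eye(p)  # nonisotropic column covariance
L = np.linalg.cholesky(Sigma)

# Sample the spectrum of G^t A G over many draws of G,
# where each column of G is N(0, Sigma).
eigs = []
for _ in range(2000):
    G = L @ rng.standard_normal((p, k))
    M = G.T @ A @ G
    eigs.append(np.linalg.eigvalsh(M))
eigs = np.array(eigs)

print("mean (ordered) eigenvalues of G^t A G:", eigs.mean(axis=0))
```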
In the preliminary material, they list some basic matrix algebra facts. Your mission, should you choose to accept it, is to prove the following:
- If \(\mat{A} > \mat{0},\) \(\mat{B} > \mat{0},\) and \(\mat{A} - \mat{B} > \mat{0},\) then \(\mat{B}^{-1} - \mat{A}^{-1} > \mat{0}.\)
- If \(\mat{A}>\mat{0}\) and \(\mat{B} > \mat{0}\) (and both are at least \(2 \times 2\)), then \(\det(\mat{A}+\mat{B}) > \det(\mat{A}) + \det(\mat{B}).\)
The second’s easy, but for the first I found it necessary to use the fact that every positive definite matrix has a unique positive definite square root.
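In the spirit of the title, here’s a quick NumPy sanity check before you try the proofs. It only spot-checks random instances, so it proves nothing, but it’s a cheap way to convince yourself the claims are plausible:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

def random_pd(scale=1.0):
    """Random symmetric positive definite n x n matrix."""
    X = rng.standard_normal((n, n))
    return X @ X.T + scale * np.eye(n)

def is_pd(M):
    """Check positive definiteness via the eigenvalues."""
    return np.all(np.linalg.eigvalsh(M) > 0)

B = random_pd()
A = B + random_pd()          # guarantees A - B > 0

# First claim: B^{-1} - A^{-1} > 0.
print(is_pd(np.linalg.inv(B) - np.linalg.inv(A)))

# Second claim: det(A + B) > det(A) + det(B).
print(np.linalg.det(A + B) > np.linalg.det(A) + np.linalg.det(B))
```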
Some more, this time on Kronecker products:
- If \(\mat{A}\) has eigenvalues \(\{\alpha_i\}\) and \(\mat{B}\) has eigenvalues \(\{\beta_j\},\) then \(\mat{A} \otimes \mat{B}\) has eigenvalues \(\{\alpha_i \beta_j\}.\)
- If \(\mat{A}\) is \(m \times m\) and \(\mat{B}\) is \(n \times n\), then \(\det(\mat{A} \otimes \mat{B}) = \det(\mat{A})^n \det(\mat{B})^m\)
It’s useful here to note that
\[
(\mat{A} \otimes \mat{B}) (\mat{C} \otimes \mat{D}) = \mat{AC} \otimes \mat{BD},
\]
which has some obvious consequences; for instance, the Kronecker product of two orthogonal matrices is orthogonal. To be clear, the first Kronecker question can be addressed without any tedious calculations.
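The same kind of NumPy spot-check works for the Kronecker facts and the mixed-product identity above (again, random instances only; `np.kron` computes the Kronecker product):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 4
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
C = rng.standard_normal((m, m))
D = rng.standard_normal((n, n))

# Eigenvalues of A (x) B are the products alpha_i * beta_j.
alpha = np.linalg.eigvals(A)
beta = np.linalg.eigvals(B)
products = np.outer(alpha, beta).ravel()
kron_eigs = np.linalg.eigvals(np.kron(A, B))
# Check that every product shows up among the computed eigenvalues.
print(all(np.min(np.abs(kron_eigs - z)) < 1e-8 for z in products))

# det(A (x) B) = det(A)^n * det(B)^m.
print(np.isclose(np.linalg.det(np.kron(A, B)),
                 np.linalg.det(A) ** n * np.linalg.det(B) ** m))

# Mixed-product property: (A (x) B)(C (x) D) = (AC) (x) (BD).
print(np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D)))

# Hence the Kronecker product of orthogonal matrices is orthogonal.
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
R, _ = np.linalg.qr(rng.standard_normal((n, n)))
K = np.kron(Q, R)
print(np.allclose(K.T @ K, np.eye(m * n)))
```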