Well damn. It’s been a long while. Time for another post about my not having posted in a while.
I’m wrapping up my internship here at IBM and preparing the exit talk on my research, entitled “New results on the accuracy of randomized spectral methods.” I’ll put up the talk after we’ve wrapped up the paper and submitted it to arXiv, in a month or so, but here’s a synopsis. Motivated by the fact that a lot of applications of randomized projection-based low-rank approximations use them to approximate subspaces rather than matrices (the canonical example being spectral clustering), we investigated the accuracy of the subspaces so obtained. It turns out that the error has a pretty nice dependence on the number of power-method steps you take and on the amount of oversampling; the error bounds are almost usable. That’s a coup for machine learning applications! However, I think a sharper analysis (ours is as simple as possible) might be able to make the bounds tight enough for practical use. Along the way, we (unintentionally) revisited the analysis of the classical subspace iteration method, and found a geometric interpretation of that “random space interaction with singular spaces” term that shows up in the analysis of random projection-based low-rank approximation error bounds.
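For the curious, here’s a minimal sketch of the kind of algorithm I mean: standard randomized subspace iteration in the style of Halko, Martinsson, and Tropp. This isn’t necessarily the exact variant we analyze in the paper; the function name and the default oversampling and power-step parameters are just for illustration.

```python
import numpy as np

def randomized_subspace(A, k, p=10, q=2, seed=None):
    """Approximate the dominant k-dimensional left singular subspace of A.

    Illustrative sketch of randomized subspace iteration: a Gaussian
    sketch with oversampling p, followed by q power-iteration steps,
    re-orthonormalizing between multiplications for numerical stability.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Gaussian test matrix with k + p columns (p is the oversampling).
    Omega = rng.standard_normal((n, k + p))
    Q, _ = np.linalg.qr(A @ Omega)
    # Power iteration: each multiplication by A or A^T sharpens the
    # spectral gap; the intermediate QR factorizations keep the basis
    # well-conditioned instead of collapsing onto the top direction.
    for _ in range(q):
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    # Project down to the (k + p)-dimensional subspace and extract the
    # top-k left singular directions.
    U, _, _ = np.linalg.svd(Q.T @ A, full_matrices=False)
    return Q @ U[:, :k]
```

The point of our analysis is exactly how the accuracy of the subspace returned here (as opposed to the accuracy of the low-rank approximation of the matrix itself) depends on q and p.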
I think the results are nice enough for a conference paper, but I’m (as usual) a little leery of claiming a strong original contribution, since a lot of what we did turns out to be only a few steps removed from others’ work. But them’s the breaks when you work in numerical linear algebra!