“Some estimates of norms of random matrices” and “Tensor sparsification via a bound on the spectral norm of random tensors”

Every now and again, I get it in my head to revisit Latala’s paper “Some estimates of norms of random matrices” where he gives a sharp bound on the expected spectral norm of a mean zero random matrix with independent entries.
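
For reference, here is the bound in question (stated from memory, so the exact form and the constant should be checked against the paper): for a random matrix $A = (a_{ij})$ with independent mean zero entries,

\[
\mathbb{E}\,\|A\| \;\le\; C \left( \max_i \Big( \sum_j \mathbb{E}\, a_{ij}^2 \Big)^{1/2} + \max_j \Big( \sum_i \mathbb{E}\, a_{ij}^2 \Big)^{1/2} + \Big( \sum_{i,j} \mathbb{E}\, a_{ij}^4 \Big)^{1/4} \right),
\]

where $\|\cdot\|$ denotes the spectral norm and $C$ is an unspecified universal constant. The first two terms control the largest row and column norms in expectation, while the fourth-moment term penalizes heavy-tailed entries.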

His proof is damn near incomprehensible (certainly so for me), results in a bound with an unspecified universal constant, and doesn’t give moment bounds. But it’s clear that all of these problems can be worked around. It looks like he’s using what Rudelson and Vershynin call an entropy-concentration tradeoff (roughly: spread-out vectors on the sphere concentrate strongly but require a large net, while peaky vectors need only a small net but concentrate more weakly), but his proof is so convoluted that it’s hard to pin down exactly what’s going on.

But whenever I get it in my head to attempt to clarify and extend his proof, I remember that Nguyen, Drineas, and Tran have already done so. They’ve actually extended it to bound the expected norm of mean zero random tensors with independent entries. Unfortunately, they don’t seem to give Latala his full share of credit: they claim to be using a new approach (viz., the entropy-concentration tradeoff) due to Rudelson and Vershynin, when it seems to me that they’re revising Latala’s proof to make it more readable and making the obvious extensions. This isn’t to say that the NDT paper isn’t worthwhile: I think it’s a great example of the power of Gaussian symmetrization techniques and a very approachable demonstration of the entropy-concentration tradeoff. I just think Latala should be given more credit for observing this tradeoff in the first place.
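
To make the symmetrization remark concrete, here is the standard step I have in mind (my own gloss, not a quote from either paper): if $A = (a_{ij})$ has independent mean zero entries, then symmetrization followed by Rademacher-to-Gaussian comparison gives

\[
\mathbb{E}\,\|A\| \;\le\; 2\, \mathbb{E}\, \Big\| \sum_{i,j} \varepsilon_{ij}\, a_{ij}\, e_i e_j^T \Big\| \;\le\; \sqrt{2\pi}\; \mathbb{E}\, \Big\| \sum_{i,j} g_{ij}\, a_{ij}\, e_i e_j^T \Big\|,
\]

where the $\varepsilon_{ij}$ are independent Rademacher signs and the $g_{ij}$ are independent standard Gaussians, both drawn independently of $A$. Conditioning on the entries, this reduces the problem to bounding the supremum of a Gaussian process over the sphere, which is exactly where the entropy-concentration argument enters.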

This case serves as a supporting point in my argument that it’s not enough to just publish results: your exposition should make clear the intuition and techniques being used, not simply present a chain of technical arguments. If Latala had spent more time refining his exposition, the NDT result would clearly be seen as a descendant of his work.

Initio

This site’s purpose is to track and encourage my mathematical reading while giving me a central location to record my observations and comments on whatever I’m reading. I hope that others will be motivated to share their own thoughts and comments, or suggest relevant readings.

For now the blog’s name is “Readings in modern applied math.” Modern as opposed to classical: I’m NOT interested in classical applied math (PDEs and related numerical methods). I’m interested in the more modern ‘informatic aspects’ of applied math and the associated theory. What I mean by that should become clear as I post (actually, here’s a précis of the research I’ve done so far, and some things I’m interested in working on in the future), but essentially, for now, think of the type of work being done by the likes of:

(applied people) Christos Boutsidis, Petros Drineas, Michael Mahoney, Benjamin Recht, Joel Tropp, Emmanuel Candes, Daniel Hsu, Mark Tygert, Nam Nguyen

(more theoretical) Daniel Spielman, Mark Rudelson, Roman Vershynin, Pascal Massart, Michel Ledoux, Michel Talagrand, Sara van de Geer

So, if you’re interested, subscribe to the blog to get updates, and comment to let me know of any work or researchers you think I may find interesting.