Norms of sums of random vectors sampled without replacement

Let \(\mathbb{P}_{WOR(\ell)}\) denote probability under the model where a set \(I\) of \(\ell\) indices is sampled without replacement from \(\{1,2,\ldots,n\}\), and let \(\mathbb{P}_{WR(\ell)}\) denote probability under the model where the \(\ell\) indices are chosen with replacement. Is it the case that, for any collection \(\vec{v}_i\) of \(n\) vectors in a normed vector space,
\[
\mathbb{P}_{WOR(\ell)}\left(\left\| \sum\nolimits_{i \in I} \vec{v}_i \right\| \geq t \right) \leq \mathbb{P}_{WR(\ell)}\left(\left\| \sum\nolimits_{i \in I} \vec{v}_i \right\| \geq t \right)
\]
up to constants?
In general, the \(WOR(\ell)\) model is harder to work with, because the terms in the resulting sum are dependent, whereas the terms in the \(WR(\ell)\) sum are independent. So if this inequality held, it would make life easier.
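Before getting into the argument, here is a minimal Monte Carlo sketch of the question (the population, dimension, sample size, and threshold grid below are arbitrary illustrative choices of mine, not anything canonical): it compares the empirical tails of the norm of the sum under the two sampling models.

```python
# Minimal Monte Carlo comparison of the two sampling models.
# Population, dimension, sample size, and thresholds are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, ell, d, trials = 50, 10, 5, 20000

# A fixed, arbitrary population of n vectors in R^d.
v = rng.standard_normal((n, d))

def tail_probs(with_replacement, ts):
    """Empirical P(||sum_{i in I} v_i|| >= t) over repeated index draws."""
    norms = np.empty(trials)
    for k in range(trials):
        I = rng.choice(n, size=ell, replace=with_replacement)
        norms[k] = np.linalg.norm(v[I].sum(axis=0))
    return [(norms >= t).mean() for t in ts]

ts = np.linspace(0.0, 15.0, 16)
for t, p_wor, p_wr in zip(ts, tail_probs(False, ts), tail_probs(True, ts)):
    print(f"t = {t:5.2f}   WOR: {p_wor:.4f}   WR: {p_wr:.4f}")
```

A run of this on a few random populations is only anecdotal evidence either way, of course; the question is whether the domination holds in general.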

I don’t know if this inequality is true in general, but there are classical exponential tail bounds for the rhs, and I believe that those bounds hold up to constants for the lhs. Here’s an outline of the argument. The key fact is that
\[
\E_{WOR(\ell)}f\left(\sum\nolimits_{i \in I} \vec{v}_i\right) \leq \E_{WR(\ell)}f\left(\sum\nolimits_{i \in I} \vec{v}_i\right)
\]
for any convex function \(f.\) This was originally observed by Hoeffding, but Gross provides a more readable account. Take \(f(\vec{x}) = \exp(\lambda \|\vec{x}\|)\) with \(\lambda \geq 0\), which is convex, to see that the moment generating function of the norm of the sum under the \(WOR(\ell)\) model is dominated by the one under the \(WR(\ell)\) model.
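As a quick sanity check on this domination (a standard fact, not part of the argument): for scalar \(v_i\) with population mean \(\mu\), taking \(f(x) = (x - \ell\mu)^2\) recovers the finite population correction,
\[
\mathrm{Var}_{WOR(\ell)}\left(\sum\nolimits_{i \in I} v_i\right) = \ell \sigma^2 \cdot \frac{n - \ell}{n - 1} \leq \ell \sigma^2 = \mathrm{Var}_{WR(\ell)}\left(\sum\nolimits_{i \in I} v_i\right),
\]
where \(\sigma^2 = \frac{1}{n}\sum_{i=1}^n (v_i - \mu)^2\): sampling without replacement has the smaller variance.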

Next, we know that under the \(WR(\ell)\) model the deviation of the norm of the sum from its expectation satisfies a Bennett inequality (see Ledoux and Talagrand, chapter 6):
\[
\mathbb{P}_{WR(\ell)}\left( \left|\left\|\sum\nolimits_{i \in I} \vec{v}_i\right\| - \E_{WR(\ell)} \left\|\sum\nolimits_{i \in I} \vec{v}_i \right\| \right| \geq t \right) \leq \exp\left[\frac{t}{2a} - \left(\frac{t}{2a} + \frac{b^2}{4a^2} \right) \log\left(1 + \frac{2 a t}{b^2} \right) \right].
\]
Here \(a \geq \max_i \|\vec{v}_i\|\) is a uniform bound on the norms of the individual summands, and \(b^2 = (\ell/n) \sum_{i=1}^n \|\vec{v}_i\|^2\) is the natural variance proxy under the \(WR(\ell)\) model.
By the standard equivalence between tail bounds, moment bounds, and mgf bounds, this tail bound yields a bound on the mgf of the deviation under the \(WR(\ell)\) model, and that mgf bound is strong enough to recover the Bennett inequality through the usual Chernoff argument (up to changes in the constants).
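Concretely, the direction needed here converts a tail bound into an mgf bound by integration: writing \(Z\) for the quantity inside the absolute value above, for any \(\lambda > 0\) Fubini's theorem gives
\[
\E_{WR(\ell)} e^{\lambda |Z|} = 1 + \lambda \int_0^\infty e^{\lambda t}\, \mathbb{P}_{WR(\ell)}\left(|Z| \geq t\right) dt,
\]
and substituting the Bennett tail bound into the integral bounds the mgf.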

(Alternatively, as I later realized, you can just look for a proof of this or a similar inequality that proceeds by bounding the mgf, and use that bound directly rather than reverse-engineering a similar but looser one. The catch is that you have to find a proof of the inequality that works through the mgf. I was not able to do this, but I didn't try too hard, and there are similar results in chapter 1 of Ledoux and Talagrand which are good enough for my purposes.)

Now we do some algebra to bound an mgf involving the norm of the sum formed under the \(WOR(\ell)\) model:
\[
\E_{WOR(\ell)} \exp\left(\lambda \left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\right) \leq \E_{WR(\ell)} \exp\left(\lambda \left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\right)
\]
for any \(\lambda \geq 0\), by our first observation (note that \(\vec{x} \mapsto \exp(\lambda\|\vec{x}\|)\) is convex), and the right-hand side is at most
\[
\E_{WR(\ell)} \exp\left(\lambda \bigg|\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\| - \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\bigg| \right) \exp\left(\lambda \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\right),
\]
so dividing both sides by the second factor, \(\exp\left(\lambda \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\right)\), gives
\[
\E_{WOR(\ell)} \exp\left(\lambda \left[\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\| - \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\right] \right) \leq \E_{WR(\ell)} \exp\left(\lambda \bigg|\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\| - \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\bigg| \right).
\]

Recall that the Bennett inequality, through the tail/mgf connection above, gives a bound on the rhs. Thus we have a bound on the mgf of the deviation of the norm of the \(WOR(\ell)\) sum above the expected norm of the \(WR(\ell)\) sum, and this bound is strong enough to generate a Bennett inequality. So, with the caveat that the constants may change (I have not done the calculations), we have
\[
\mathbb{P}_{WOR(\ell)}\left(\left\| \sum\nolimits_{i \in I} \vec{v}_i \right\| \geq t + \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\| \right) \leq \exp\left[\frac{t}{2a} - \left(\frac{t}{2a} + \frac{b^2}{4a^2} \right) \log\left(1 + \frac{2 a t}{b^2} \right) \right].
\]
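For completeness, the final step is the standard Chernoff argument: for any \(\lambda \geq 0\), Markov's inequality together with the mgf bound above gives
\[
\mathbb{P}_{WOR(\ell)}\left(\left\| \sum\nolimits_{i \in I} \vec{v}_i \right\| \geq t + \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\| \right) \leq e^{-\lambda t}\, \E_{WR(\ell)} \exp\left(\lambda \bigg|\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\| - \E_{WR(\ell)}\left\|\sum\nolimits_{i \in I} \vec{v}_i \right\|\bigg| \right),
\]
and optimizing over \(\lambda\) against the Bennett-type mgf bound produces the displayed tail inequality, with possibly worse constants.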