This has been the (to me very surprising) result of my enquiry into the following beautiful problem, posed to the author by Prof. Ben Green. I am recording my findings primarily for my own reference, and would urge any reader not familiar with the problem to at least try to attack it themselves before reading on.

Question: If points (if you like, vectors) $x,y$ are selected randomly and independently with uniform probability from $[0,1]^d$, show that there exists $c>0$ such that $\mathbb{P}(||x|-|y|| \leq 1) \geq c$ for all $d$.
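Before diving in, the claim is easy to check empirically by Monte Carlo (the sampling scheme, trial count and dimensions below are my own arbitrary choices, not part of the problem):

```python
import random

def estimate_prob(d, trials=2000, seed=0):
    """Monte Carlo estimate of P(| |x| - |y| | <= 1) for x, y uniform on [0,1]^d."""
    rng = random.Random(seed)
    norm = lambda: sum(rng.random() ** 2 for _ in range(d)) ** 0.5
    hits = sum(abs(norm() - norm()) <= 1 for _ in range(trials))
    return hits / trials

for d in (2, 10, 100, 500):
    # the estimate stays well away from 0 as d grows
    print(d, estimate_prob(d))
```

Running this suggests the probability not only stays bounded below but is in fact close to $1$ for large $d$, which is a first hint of the concentration phenomenon discussed below.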

My initial attempts to solve this problem were not particularly successful, since trying to relate $|x|$ and $|y|$ as they both varied over their distributions seemed tricky. My initial heuristic was along the lines of: “The random variable $|x|$ ranges over the interval $0 \leq |x| \leq \sqrt{d}$, so as $d$ grows the distribution must spread itself across a greater distance, and hence be smaller at any given point (after a suitable continuous transform keeping the distribution ‘appropriate’, with its mean at the correct, also growing, place). Thus, while the distribution does peak near the middle, it does not seem likely that a given $O(1)$-length neighbourhood can have $\Omega(1)$ probability as $d \rightarrow \infty$.”

With this heuristic in mind, the only method of approach which seemed likely to work was to take some kind of sum or integral over various (and a number growing with $d$) disjoint intervals on the distribution to get a spread-out enough estimate to eliminate the lack of an $\Omega(1)$ probability on any given interval. However, actually going ahead and doing something like this proved to be very tricky, especially for one such as myself with little formal probabilistic training.

So I decided to leave the above heuristic aside and simply see how far from $\Omega(1)$ the probability looked around a natural-looking modal point. I arrived at the following surprising result.

Theorem 1: For any fixed constant $\alpha > 0$ (with $\alpha$ small enough that $2\alpha\sqrt{d/3} > \alpha^2$),

$\mathbb{P}(||x|-\sqrt{\frac{d}{3}}| \leq \alpha) \geq 1 - (\sqrt{15}\alpha-\frac{3\sqrt{5}\alpha^2}{2\sqrt{d}})^{-2}$.

In particular, the above question follows from the case $\alpha=\frac{1}{2}$: if both $|x|$ and $|y|$ lie within $\frac{1}{2}$ of $\sqrt{d/3}$, then $||x|-|y|| \leq 1$, and by independence this happens with probability at least the square of the bound in Theorem 1, which is bounded below by a positive constant for all $d$.

Owing to the nature of the Euclidean metric, it seemed natural to set up variables as follows:

Let $X_1,X_2,...,X_d \in [0,1]$ be selected randomly and uniformly, and set $F = \sum_i X_i^2$. A simple application of the linearity of expectation showed that $\mathbb{E}(F) = d/3$, and as $F = |x|^2$ seemed a much easier object to work with than $|x|$ I decided to take a look at $\mathbb{P}(||x|-\sqrt{d/3}| \leq \alpha)$ for some constant $\alpha >0$ to be played with later.

This is equal to $\mathbb{P}(|x| \in [\sqrt{d/3}-\alpha, \sqrt{d/3}+\alpha])$

$= \mathbb{P}(F \in [d/3-2\alpha\sqrt{d/3}+\alpha^2, d/3+2\alpha\sqrt{d/3} + \alpha^2])$

which is then clearly $\geq \mathbb{P}(|F-d/3| \leq 2\alpha\sqrt{d/3} - \alpha^2)$.

This looks like the kind of quantity whose complement one can bound above by Chebyshev’s inequality, yielding a lower bound of around $\Omega(d^{1/2}\text{Var}(F)^{-1/2})$ on the probability. The initial heuristic relied on the ‘near uniformity’ assumption that this variance would be of magnitude just a little less than $d^2$, but here I would be very pleasantly surprised.

$\mathbb{E}(F^2) = \sum_i \mathbb{E}(X_i^4) + \sum_{i\not=j} \mathbb{E}(X_i^2X_j^2) = \frac{d}{5} + \frac{d(d-1)}{9}$

But this implies a variance far smaller than expected: $\text{Var}(F) = \mathbb{E}(F^2) - \mathbb{E}(F)^2 = \frac{d}{5} + \frac{d(d-1)}{9} - \frac{d^2}{9} = \frac{d}{5}-\frac{d}{9} = \frac{4d}{45}$. In other words, the heuristic based on assuming $F$ is roughly smoothly distributed was fallacious. Instead, the distribution of $F$ has a ‘spike’ at its expected value and is almost zero away from it.
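These moment computations are easy to verify exactly, since for $X$ uniform on $[0,1]$ every moment is $\mathbb{E}(X^k) = \frac{1}{k+1}$. A short exact-arithmetic check (my own, using Python's `fractions` module):

```python
from fractions import Fraction

def moment(k):
    # E[X^k] = integral of x^k over [0,1] = 1/(k+1) for X uniform on [0,1]
    return Fraction(1, k + 1)

def var_F(d):
    # F = sum of d i.i.d. copies of X^2
    EF = d * moment(2)                                   # linearity: E(F) = d/3
    EF2 = d * moment(4) + d * (d - 1) * moment(2) ** 2   # E(F^2) = d/5 + d(d-1)/9
    return EF2 - EF ** 2

for d in (1, 10, 1000):
    # Var(F) = 4d/45: linear in d, not quadratic
    assert var_F(d) == Fraction(4 * d, 45)
```

The key point the assertion confirms is that the $d^2$ terms in $\mathbb{E}(F^2)$ and $\mathbb{E}(F)^2$ cancel, leaving a variance that grows only linearly in $d$.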

So $\mathbb{P}(|x| \in [\sqrt{d/3}-\alpha, \sqrt{d/3}+\alpha]) \geq 1 - \mathbb{P}(|F-d/3| > 2\alpha\sqrt{d/3} - \alpha^2)$.

Writing $2\alpha\sqrt{d/3} - \alpha^2 = (\sqrt{15}\alpha-\frac{3\sqrt{5}\alpha^2}{2\sqrt{d}})\text{Var}^{1/2}(F)$, we can apply Chebyshev’s inequality to obtain the lower bound $\mathbb{P}(|x| \in [\sqrt{d/3}-\alpha, \sqrt{d/3}+\alpha]) \geq 1 - (\sqrt{15}\alpha-\frac{3\sqrt{5}\alpha^2}{2\sqrt{d}})^{-2}$, which is precisely Theorem 1.

So my heuristic has been torn to pieces: instead, as $d$ gets large, the ‘amount of space’ in the $d$-dimensional unit cube whose distance from the origin is within a certain (small) distance of $\sqrt{d/3}$ must actually be rather large. For example, take $\alpha=\frac{1}{2}$. Then $\mathbb{P}(||x|-\sqrt{d/3}| \leq \frac{1}{2}) \geq 1-(\frac{\sqrt{15}}{2}-O(d^{-1/2}))^{-2}$, so as $d$ tends to infinity the bound approaches $1 - \frac{4}{15} = \frac{11}{15}$. In other words, more than two thirds of the points in a unit cube of high dimension lie (to within error $\frac{1}{2}$) at roughly distance $\sqrt{\frac{d}{3}}$ from the origin (or, by symmetry, from any other vertex).
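To see how conservative Theorem 1's Chebyshev bound is in practice, one can compare it against a direct simulation at $\alpha = \frac{1}{2}$ (the dimensions, trial count and seed below are arbitrary choices of mine):

```python
import random

def theorem_bound(d, alpha=0.5):
    # Lower bound from Theorem 1: 1 - (sqrt(15) a - 3 sqrt(5) a^2 / (2 sqrt(d)))^(-2)
    k = 15 ** 0.5 * alpha - 3 * 5 ** 0.5 * alpha ** 2 / (2 * d ** 0.5)
    return 1 - k ** -2

def empirical(d, alpha=0.5, trials=2000, seed=1):
    # Empirical P(| |x| - sqrt(d/3) | <= alpha) for x uniform on [0,1]^d
    rng = random.Random(seed)
    target = (d / 3) ** 0.5
    hits = sum(
        abs(sum(rng.random() ** 2 for _ in range(d)) ** 0.5 - target) <= alpha
        for _ in range(trials)
    )
    return hits / trials

for d in (50, 200, 800):
    # Chebyshev is lossy: the true probability comfortably exceeds the bound
    print(d, round(theorem_bound(d), 3), empirical(d))
```

The simulation suggests the true probability is substantially higher than the $\frac{11}{15}$ limit of the bound, which is unsurprising given that Chebyshev's inequality uses only the second moment.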

I found this result initially very surprising (and sent Prof. Green a semi-apologetic email in which I acknowledged that I could just be having an evening of insanity, and might the next morning discover my estimates were based on false assumptions). The basic reason the expected smooth distribution ended up being a spike is that in the variance calculation the quadratic terms cancelled out, leaving only lower-order terms. In another post in the next few days I shall try to investigate whether this phenomenon (where the quadratic parts of $\mathbb{E}(X^2)$ and $\mathbb{E}(X)^2$ are equal) can be found in any interesting general families of situations, as this now seems likely and should be investigated to inform my intuition in future.