Equal-slices distribution for DHJ(k)


For [math]k \in {\mathbb N}[/math], define [math]\nu_k[/math] to be the uniform distribution on the [math]k[/math]-simplex [math]\{(p_1, \dots, p_k) : p_i \geq 0\ \forall i, \sum p_i = 1\}[/math]. It will be useful to think of [math]\nu_k[/math] as first drawing [math]k[/math] i.i.d. [math]\mathrm{Exponential}(1)[/math] random variables [math]Y_1, \dots, Y_k[/math], and then setting [math]p_j = Y_j/(\sum Y_i)[/math].
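
To make this concrete, here is a minimal sampling sketch (in NumPy; the function name sample_nu_k is just for this illustration):

<pre>
import numpy as np

def sample_nu_k(k, rng):
    """Draw one point from nu_k via k i.i.d. Exponential(1)'s."""
    y = rng.exponential(scale=1.0, size=k)   # Y_1, ..., Y_k, i.i.d. Exponential(1)
    return y / y.sum()                       # p_j = Y_j / (Y_1 + ... + Y_k)

rng = np.random.default_rng(0)
p = sample_nu_k(5, rng)                      # a uniformly random (p_1, ..., p_5) summing to 1
</pre>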

Some comments:

1. The fact that [math]\nu_k[/math] can be thought of this way is because the density function of [math](Y_1, \dots, Y_k)[/math] is [math]\exp(-y_1)\cdots\exp(-y_k) = \exp(-(y_1 + \cdots + y_k))[/math], which only depends on [math]s = y_1 + \cdots + y_k[/math]; i.e., it's constant on each simplex [math]y_1 + \cdots + y_k = s[/math].

2. One can also think of the [math]Y_j[/math]'s as interarrival times in a Poisson Process. By a well-known fact about the location of events in a Poisson Process, it follows that [math]\nu_k[/math] is also equivalent to picking [math]k-1[/math] independent uniformly random points in [math][0,1][/math] and then letting [math]p_j[/math] be the length of the [math]j[/math]th segment formed.

3. If we draw [math](p_1, \dots, p_k)[/math] from [math]\nu_k[/math], and then draw a string in [math][k]^n[/math] according to the product distribution defined by [math](p_1, \dots, p_k)[/math], then we get an "equal-slices" draw from [math][k]^n[/math]. (You can take this as the definition of "equal-slices" if you like.)

4. Given that we are going to make such a product-distribution draw anyway, it's not really essential to divide all of the [math]Y_i[/math]'s by their sum. We can be more relaxed and just say that [math]\nu_k[/math] determines [math](Y_1, \dots, Y_k)[/math], and then we make a draw from [math][k]^n[/math] according to the product distribution in which [math]j \in [k][/math] is chosen with probability proportional to [math]Y_j[/math].

5. Summarizing, we can draw from equal-slices on [math][k]^n[/math] as follows: pick [math](Y_1, \dots, Y_k)[/math] as i.i.d. [math]\mathrm{Exponential}(1)[/math]'s; then draw from the product distribution "proportional to [math](Y_1, \dots, Y_k)[/math]".
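
To summarize comments 3–5 in code, here is a minimal sketch of one equal-slices draw from [math][k]^n[/math] (again NumPy; the function name equal_slices_draw is illustrative):

<pre>
import numpy as np

def equal_slices_draw(k, n, rng):
    """Draw one string from [k]^n under the equal-slices distribution (comment 5)."""
    y = rng.exponential(scale=1.0, size=k)        # Y_1, ..., Y_k, i.i.d. Exponential(1)
    p = y / y.sum()                               # product distribution proportional to the Y_j's
    # each of the n coordinates is drawn independently, taking value j with probability p_j
    return rng.choice(np.arange(1, k + 1), size=n, p=p)

rng = np.random.default_rng(0)
x = equal_slices_draw(3, 10, rng)                 # a random string in [3]^10
</pre>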


The point of this article is the following:


Goal: To define a distribution on combinatorial lines [math](x^1, \dots, x^k, z) \in [k+1]^n[/math] in such a way that: (i) [math]z[/math] is distributed according to equal-slices on [math][k+1]^n[/math]; (ii) each [math]x^j[/math] is distributed according to equal-slices on [math][k]^n[/math]. Please note that [math]x^j[/math] will not necessarily be the point in the line where the wildcards are [math]j[/math]'s.


Let us record an easy-to-see fact:

Proposition 1 Suppose we first draw a string in [math][k+1]^n[/math] from the product distribution [math]\mu[/math] proportional to [math](y_1, \dots, y_{k+1})[/math], and then we change all the [math](k+1)[/math]'s in the string to [math]j[/math]'s, where [math]j \in [k][/math]. Then the resulting string is distributed according to the product distribution [math]\mu_j[/math] on [math][k]^n[/math] proportional to [math](y_1, \dots, y_{j-1}, y_j + y_{k+1}, y_{j+1}, \dots, y_k)[/math].
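
As a quick sanity check of Proposition 1, here is a Monte Carlo sketch for a single coordinate (the particular weights and the choice [math]j = 2[/math] are made up for illustration):

<pre>
import numpy as np

rng = np.random.default_rng(1)
y = np.array([1.0, 2.0, 0.5, 1.5])             # weights (y_1, ..., y_{k+1}) with k = 3
j = 2                                          # the (k+1)'s get changed into j's
draws = rng.choice([1, 2, 3, 4], size=200_000, p=y / y.sum())
draws[draws == 4] = j                          # change all (k+1)'s to j's
empirical = np.bincount(draws, minlength=5)[1:4] / len(draws)
predicted = np.array([y[0], y[1] + y[3], y[2]]) / y.sum()   # mu_j proportional to (y_1, y_2 + y_4, y_3)
print(empirical)
print(predicted)                               # should agree up to sampling error
</pre>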


The following proposition achieves the essence of the goal:

Proposition 2 Suppose [math]\lambda[/math] is a distribution on the [math]k[/math]-simplex defined as follows. First we draw i.i.d. Exponentials [math](Y_1, \dots, Y_{k+1})[/math] as in [math]\nu_{k+1}[/math]. Then we set [math]W^{(j)} = (Y_1, \dots, Y_{j-1}, Y_j + Y_{k+1}, Y_{j+1}, \dots, Y_k)[/math] (as in Proposition 1). Next, we choose [math]J \in [k][/math] uniformly at random. Following this, we define [math]\vec{W} = W^{(J)}[/math]. Finally, we scale [math]\vec{W}[/math] so as to get a point [math](p_1, \dots, p_k)[/math] on the [math]k[/math]-simplex.
Then [math]\lambda[/math] is identical to [math]\nu_k[/math] (even though the components of [math]\vec{W}[/math] are not i.i.d. Exponentials).
Proof: It suffices to check that the density function [math]f(w_1, \dots, w_k)[/math] of the vector [math]\vec{W}[/math] depends only on [math]w_1 + \cdots + w_k[/math]; then, exactly as in comment 1, scaling [math]\vec{W}[/math] onto the simplex yields the uniform distribution [math]\nu_k[/math]. Now [math]\vec{W}[/math] has a mixture distribution: a uniform mixture of the [math]W^{(j)}[/math] distributions over [math]j \in [k][/math]. The density of [math]W^{(j)}[/math] at [math](w_1, \dots, w_k)[/math] is

[math]\displaystyle \exp(-w_1)\cdots\exp(-w_{j-1}) \cdot \left(w_j \exp(-w_j)\right) \cdot \exp(-w_{j+1}) \cdots \exp(-w_k) \qquad [/math]

[math]\displaystyle \qquad = w_j \exp(-(w_1 + \cdots + w_k)). [/math]

This is because the density of a sum of two independent [math]\mathrm{Exponential}(1)[/math]'s is [math]t\exp(-t)[/math]. Averaging these [math]k[/math] densities over the uniformly random choice of [math]J[/math], the density of [math]\vec{W}[/math] at [math](w_1, \dots, w_k)[/math] is

[math]\displaystyle \frac{w_1 + \cdots + w_k}{k} \exp(-(w_1 + \cdots + w_k)), [/math]

which indeed depends only on [math]w_1 + \cdots + w_k[/math]. [math]\Box[/math]
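
Proposition 2 can also be checked numerically; here is a minimal Monte Carlo sketch (the choice [math]k = 4[/math] and the quantile comparison of the first coordinate are just illustrative):

<pre>
import numpy as np

rng = np.random.default_rng(2)
k, N = 4, 200_000

# lambda: draw (Y_1, ..., Y_{k+1}), add Y_{k+1} onto a uniformly random coordinate J, then scale.
Y = rng.exponential(size=(N, k + 1))
W = Y[:, :k].copy()
J = rng.integers(k, size=N)
W[np.arange(N), J] += Y[:, k]
lam_p1 = (W / W.sum(axis=1, keepdims=True))[:, 0]

# nu_k: scale k i.i.d. Exponential(1)'s directly.
Z = rng.exponential(size=(N, k))
nu_p1 = (Z / Z.sum(axis=1, keepdims=True))[:, 0]

qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(lam_p1, qs))
print(np.quantile(nu_p1, qs))                  # the two rows should agree up to sampling error
</pre>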


We can now achieve the goal by combining the previous two propositions:

Achieving the Goal: Draw [math]z \in [k+1]^n[/math] according to equal-slices on [math][k+1]^n[/math]. Next, for each [math]j \in [k][/math], let [math]v^j \in [k]^n[/math] be the string formed by changing all the [math](k+1)[/math]'s in [math]z[/math] to [math]j[/math]'s. Finally, pick a uniformly random permutation [math]\pi[/math] of [math][k][/math] and define [math]x^{i} = v^{\pi(i)}[/math].

Proof: We can think of [math]z[/math] as being drawn from the product distribution proportional to [math](Y_1, \dots, Y_{k+1})[/math], where [math](Y_1, \dots, Y_{k+1})[/math] is drawn as in [math]\nu_{k+1}[/math]. The strings [math](v^1, \dots, v^k, z)[/math] form a combinatorial line ("in order"), so clearly [math](x^1, \dots, x^k, z)[/math] form a combinatorial line (possibly "out of order"). By Proposition 1, [math]v^j[/math] is distributed according to the product distribution proportional to [math]W^{(j)}[/math]. Thus [math]x^i = v^{\pi(i)}[/math] is distributed according to the product distribution proportional to [math]W^{(\pi(i))}[/math]; since [math]\pi(i)[/math] is uniformly distributed on [math][k][/math], this random weight vector (after scaling) is distributed as [math]\lambda[/math], which equals [math]\nu_k[/math] by Proposition 2. Hence [math]x^i[/math] is equal-slices-distributed on [math][k]^n[/math]. [math]\Box[/math]
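
Finally, putting the two propositions together, here is a minimal sketch of the whole construction (the function name random_combinatorial_line and the parameters in the usage line are illustrative):

<pre>
import numpy as np

def random_combinatorial_line(k, n, rng):
    """Sketch of "Achieving the Goal": returns (x^1, ..., x^k) and z, with z equal-slices
    on [k+1]^n and each x^i equal-slices on [k]^n (the line may be listed "out of order")."""
    y = rng.exponential(scale=1.0, size=k + 1)
    p = y / y.sum()
    z = rng.choice(np.arange(1, k + 2), size=n, p=p)             # equal-slices draw from [k+1]^n
    v = [np.where(z == k + 1, j, z) for j in range(1, k + 1)]    # v^j: change the (k+1)'s to j's
    pi = rng.permutation(k)                                      # uniformly random permutation of [k]
    x = [v[pi[i]] for i in range(k)]                             # x^i = v^{pi(i)}
    return x, z

rng = np.random.default_rng(3)
x, z = random_combinatorial_line(3, 8, rng)
</pre>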