A second Fourier decomposition related to Sperner's theorem

Introduction

It seems as though it should be possible to give a clean "purely Fourier" proof of Sperner's theorem that exploits positivity, if one uses equal-slices measure. Here we present a Fourier decomposition that should achieve this, but we have not yet managed to prove the positivity (which could turn out to be a hard problem---at this stage it is not clear).

We use Ryan's formulation of equal-slices measure on $[2]^n$: the density of a set $\mathcal{A}$ is $\mathbb{E}_p\mu_p(\mathcal{A}),$ where $\mu_p$ is the standard $p$-weighted measure on the cube: if $A$ is a subset of $[n],$ then $\mu_p(A)=p^{|A|}(1-p)^{n-|A|}.$ A useful way of thinking about $\mu_p$ is that $\mu_p(\mathcal{A})$ is the probability that $(X_1,\dots,X_n)\in\mathcal{A}$ when the $X_i$ are independent Bernoulli random variables with probability $p$ of equalling 1.
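As a sanity check on this definition, here is a minimal sketch (the function names are ours, not from any source) verifying numerically that the equal-slices measure of a single point whose 1-set has size $k$ is $\int_0^1 p^k(1-p)^{n-k}\,dp = 1/\big((n+1)\binom{n}{k}\big),$ the standard Beta-integral evaluation:

```python
from math import comb

def mu_p_point(p, n, k):
    """mu_p-measure of a single point of [2]^n whose 1-set
    has size k: p^k (1-p)^(n-k)."""
    return p**k * (1 - p)**(n - k)

def equal_slices_point(n, k, steps=200_000):
    """Equal-slices measure of that point: the average of mu_p
    over p uniform in [0,1], approximated by the midpoint rule."""
    return sum(mu_p_point((i + 0.5) / steps, n, k) for i in range(steps)) / steps

n, k = 6, 2
numeric = equal_slices_point(n, k)
exact = 1 / ((n + 1) * comb(n, k))  # Beta integral: k!(n-k)!/(n+1)!
assert abs(numeric - exact) < 1e-6
```

Summing this exact value over the points of $\mathcal{A}$ recovers the equal-slices density of $\mathcal{A}.$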

We also need to define a measure on the space of all combinatorial lines. The natural measure seems to be the following. Define two sequences $(X_1,\dots,X_n)$ and $(Y_1,\dots,Y_n)$ of Bernoulli random variables as follows. The joint distribution of $(X_i,Y_i)$ is that it equals $(0,0),$ $(0,1)$ and $(1,1)$ with probabilities $1-q,$ $q-p$ and $p,$ respectively, and different pairs $(X_i,Y_i)$ are independent. Then the $X_i$ are independent Bernoulli with parameter $p,$ as are the $Y_i$ with parameter $q,$ but they are coupled in a way that guarantees that $X_i\leq Y_i$ for every $i,$ so that $X$ and $Y$ form a combinatorial line.
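The coupling above can be sampled directly; this is a sketch under our own naming (nothing here is from the text beyond the three probabilities), checking the structural guarantee $X_i\leq Y_i$:

```python
import random

def sample_line_pair(n, p, q, rng):
    """Sample (X, Y) coordinatewise: (X_i, Y_i) equals (1,1), (0,1), (0,0)
    with probabilities p, q - p, 1 - q respectively (requires p <= q)."""
    xs, ys = [], []
    for _ in range(n):
        u = rng.random()
        if u < p:
            xs.append(1); ys.append(1)
        elif u < q:
            xs.append(0); ys.append(1)
        else:
            xs.append(0); ys.append(0)
    return xs, ys

rng = random.Random(0)
n, p, q = 10, 0.3, 0.7
for _ in range(1000):
    x, y = sample_line_pair(n, p, q, rng)
    # X <= Y coordinatewise, so X and Y are the two endpoints of a
    # combinatorial line (wildcard set = the coordinates where they differ)
    assert all(a <= b for a, b in zip(x, y))
```

The marginals of $X$ and $Y$ are $\mu_p$ and $\mu_q$ respectively, since $\mathbb{P}(X_i=1)=p$ and $\mathbb{P}(Y_i=1)=(q-p)+p=q.$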

We shall now be interested in the quantity $\mathbb{E}f(X)f(Y),$ where $f$ is some function and $X=(X_1,\dots,X_n), Y=(Y_1,\dots,Y_n).$ But ultimately what will interest us is the average of this quantity over all pairs $0\leq p\leq q\leq 1,$ which we can write as $\mathbb{E}_{p\leq q}\mathbb{E}f(X_{p,q})f(Y_{p,q}).$

A proof in physical space

Let us first prove positivity in physical space. To do this, we shall define the following equivalent model for the joint distribution of $X_{p,q}$ and $Y_{p,q}.$ We first pick a random variable $T=(t_1,\dots,t_n)$ uniformly from $[0,1]^n,$ and then we let $(X_{p,q})_i$ be $0$ if $p\leq t_i$ and $1$ if $p>t_i.$ Similarly, we let $(Y_{p,q})_i$ be $0$ if $q\leq t_i$ and $1$ if $q>t_i.$ This model has the nice property that it makes perfectly good sense even if $p>q,$ and since the expression $f(X_{p,q})f(Y_{p,q})$ is symmetric in $p$ and $q$ (swapping them exchanges $X$ and $Y$), the average over the triangle $p\leq q$ equals the average over the whole square, giving us the equation $\mathbb{E}_{p\leq q}\mathbb{E}f(X_{p,q})f(Y_{p,q})=\mathbb{E}_{t\in T}\mathbb{E}_{p,q}f(X_{p,q})f(Y_{p,q}).$
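One can check empirically that the $t$-model reproduces the per-coordinate joint distribution of the original coupling. The following sketch (our own code, with hypothetical names) estimates the cell probabilities by Monte Carlo and compares them with $p,$ $q-p$ and $1-q$:

```python
import random

def t_model_coordinate(p, q, t):
    """One coordinate of the t-model: X_i = 1 iff p > t_i, Y_i = 1 iff q > t_i."""
    return (1 if p > t else 0, 1 if q > t else 0)

rng = random.Random(0)
p, q, trials = 0.3, 0.7, 200_000
counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
for _ in range(trials):
    counts[t_model_coordinate(p, q, rng.random())] += 1

# (1,0) is impossible when p <= q, and the other three cells match the
# probabilities p, q - p, 1 - q of the direct construction.
assert counts[(1, 0)] == 0
assert abs(counts[(1, 1)] / trials - p) < 0.01
assert abs(counts[(0, 1)] / trials - (q - p)) < 0.01
assert abs(counts[(0, 0)] / trials - (1 - q)) < 0.01
```

Indeed, $(X_i,Y_i)=(1,1)$ exactly when $t_i<p$ (probability $p$), $(0,1)$ when $p\leq t_i<q$ (probability $q-p$), and $(0,0)$ when $t_i\geq q$ (probability $1-q$).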

Now $X_{p,q}$ is just a function of $p$ and $t$ (it does not depend on $q$), and $Y_{p,q}$ is the same function applied to $q$ and $t.$ Therefore, the right-hand side is equal to $\mathbb{E}_{t\in T}(\mathbb{E}_pf(X_{p,q}))^2,$ which is non-negative.

We can say more. By Cauchy-Schwarz, the expectation is at least $(\mathbb{E}_{t\in T}\mathbb{E}_pf(X_{p,q}))^2,$ which is the square of the equal-slices integral of $f.$ In particular, if $f$ is the characteristic function of a set $\mathcal{A},$ then the combinatorial-line density is at least the square of the equal-slices density of $\mathcal{A}.$
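Here is a numerical illustration of this chain of inequalities for a small random set $\mathcal{A}$ (a sketch under our own naming conventions). For each sampled $t$ the inner average $\mathbb{E}_pf(X_{p,q})$ is computed exactly, since as $p$ sweeps $[0,1]$ the point $X_{p,q}$ changes only when $p$ crosses one of the $t_i$; the final inequality then holds sample-by-sample, being Cauchy-Schwarz applied to the empirical averages:

```python
import random

n = 5
rng = random.Random(0)
# f: indicator function of a random subset A of the cube {0,1}^n
A = {tuple(rng.randrange(2) for _ in range(n)) for _ in range(12)}
f = lambda x: 1 if x in A else 0

def Ep_f(t):
    """E_p f(X_{p,t}), computed exactly for fixed t: on the interval
    between consecutive sorted t_i's, X_{p,t} is constant, with 1s at
    the coordinates whose t_i lie below p."""
    order = sorted(range(n), key=lambda i: t[i])
    s = [0.0] + sorted(t) + [1.0]
    total, x = 0.0, [0] * n
    for k in range(n + 1):
        total += (s[k + 1] - s[k]) * f(tuple(x))
        if k < n:
            x[order[k]] = 1  # p crosses the next-smallest t_i
    return total

samples = [Ep_f([rng.random() for _ in range(n)]) for _ in range(20_000)]
line_density = sum(v * v for v in samples) / len(samples)  # E_t (E_p f)^2
density = sum(samples) / len(samples)                      # E_t E_p f
# mean of squares dominates square of mean (small epsilon for float error)
assert line_density >= density**2 - 1e-12
```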

A Fourier translation of the problem

To be continued.