Influence of variables
Let f be a Boolean function, that is, a function from <math>\{0,1\}^n</math> to <math>\{0,1\}</math>. Let us write a typical point of <math>\{0,1\}^n</math> as <math>(x_1,\dots,x_n)</math>. The influence of the variable <math>x_i</math> on the function f is defined to be the probability that the value of f changes if you change the value of <math>x_i</math> (keeping the values of the other <math>x_j</math> fixed), where the point <math>(x_1,\dots,x_n)</math> is chosen uniformly at random.
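In symbols, writing <math>e_i</math> for the point with a 1 in coordinate <math>i</math> and 0s elsewhere, and <math>\oplus</math> for coordinatewise addition mod 2 (this notation is standard but is not used elsewhere on this page):

:<math>\operatorname{Inf}_i(f) \;=\; \mathbb{P}_{x \in \{0,1\}^n}\bigl[f(x) \neq f(x \oplus e_i)\bigr],</math>

where <math>x</math> is chosen uniformly at random from <math>\{0,1\}^n</math>.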

== Examples ==

=== The parity function ===

The parity function takes a sequence x to the parity of the number of 1s in that sequence. Clearly, if you change the value of any variable, the value of f changes. Therefore, the influence of every variable is 1.

=== The majority function ===

Let f(x) be 1 if there are more 1s than 0s and 0 otherwise. Then changing the value of <math>x_i</math> from 0 to 1 changes the value of f(x) if and only if the number of 1s in the remaining n-1 variables is exactly <math>\lfloor n/2\rfloor</math>. The probability of this is proportional to <math>n^{-1/2}</math>, so in this case (after doing a similar calculation concerning changing the value of <math>x_i</math> from 1 to 0) we find that every variable has an influence proportional to <math>n^{-1/2}</math>.
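For the record (a routine binomial estimate, not spelled out in the original text): the number of 1s among the other n-1 variables equals <math>\lfloor n/2\rfloor</math> with probability

:<math>\binom{n-1}{\lfloor n/2\rfloor}\,2^{-(n-1)} \;\approx\; \sqrt{\frac{2}{\pi n}}</math>

by Stirling's formula, which is indeed proportional to <math>n^{-1/2}</math>.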

=== The value of the first coordinate ===

Let <math>f(x)=x_1</math>. Then the influence of <math>x_1</math> is 1 and the influence of all the other variables is 0.
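As a quick sanity check on the three examples above (an illustrative sketch, not part of the original article; any reasonable small value of n works), the influences can be computed by brute force in Python:

<pre>
from itertools import product

def influence(f, n, i):
    """Fraction of points x in {0,1}^n at which flipping x_i changes f(x)."""
    changed = 0
    for x in product((0, 1), repeat=n):
        y = list(x)
        y[i] = 1 - y[i]  # flip the i-th coordinate, keeping the others fixed
        if f(x) != f(tuple(y)):
            changed += 1
    return changed / 2 ** n

n = 9  # a small odd n, so "more 1s than 0s" is never a tie
parity = lambda x: sum(x) % 2             # parity of the number of 1s
majority = lambda x: int(sum(x) > n / 2)  # 1 if more 1s than 0s, else 0
first = lambda x: x[0]                    # the value of the first coordinate

for name, f in [("parity", parity), ("majority", majority), ("first coordinate", first)]:
    print(name, [round(influence(f, n, j), 3) for j in range(n)])
</pre>

For n = 9 this prints influence 1.0 for every variable of the parity function, about 0.273 (i.e. <math>\binom{8}{4}/2^{8}</math>) for every variable of the majority function, and 1, 0, ..., 0 for the first-coordinate function.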

== Motivation ==

== Relevance to density Hales-Jewett ==

If a dense set <math>A \subset [2]^n</math> has very low average influence (say less than <math>\varepsilon</math>), then it is an easy matter to locate [[combinatorial subspace]]s inside A of fairly large dimension (about <math>\log 1/\varepsilon</math>), simply by taking a random element of A and flipping about <math>\log 1/\varepsilon</math> of the bits randomly. One can adapt this observation to <math>[3]^n</math>: if there is very low influence with respect to flipping 0 to 1, 1 to 2, 2 to 0, or vice versa, then one can start building combinatorial subspaces in A easily.
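A rough heuristic for why this works (a back-of-the-envelope union bound, not in the original; here <math>\delta</math> denotes the density of A, and the intermediate points along each chain of flips are treated as if they were uniformly distributed): fix a random <math>x \in A</math> together with <math>m</math> random coordinates. Each of the <math>2^m</math> points of the resulting combinatorial subspace can be reached from <math>x</math> by at most <math>m</math> single bit flips, and each single flip at a uniform point changes membership in A with probability less than <math>\varepsilon</math>, so roughly

:<math>\mathbb{P}\bigl[\text{some point of the subspace lies outside } A\bigr] \;\lesssim\; \frac{m\,2^m\,\varepsilon}{\delta},</math>

which is small (for fixed <math>\delta</math>) as long as <math>m</math> is a little smaller than <math>\log_2(1/\varepsilon)</math>.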

If one only has very low influence with respect to interchanging 0 and 1 (i.e. A exhibits "01 insensitivity"), then one can essentially "identify" 0 and 1 and reduce [[DHJ(3)]] to DHJ(2).

One hope is that a lack of [[quasirandomness]] for the DHJ problem will somehow induce something like a non-trivial reduction in the influence of A; probably not enough of a reduction to apply the above argument directly, but perhaps the argument could be iterated until this is the case. (Details are very fuzzy at this point, though.)

It also seems of interest to use influence machinery to try to prove [[Sperner's theorem]].