Furstenberg correspondence principle
Latest revision as of 22:42, 19 February 2009
The Furstenberg correspondence principle equates density Ramsey theorems in combinatorics with recurrence theorems in ergodic theory. In our context, the principle equates the following two statements:
'''DHJ(3), Version 1'''. For every <math>\delta > 0</math> there exists n such that every subset <math>A \subset [3]^n</math> of density at least <math>\delta</math> contains a combinatorial line.
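To make the finitary statement concrete, here is a small brute-force sketch (the function names are ours, not from the source) that tests whether a subset of <math>[3]^n</math> contains a combinatorial line; a line arises from a template in <math>\{1,2,3,*\}^n</math> with at least one wildcard by substituting each of 1, 2, 3 for the wildcards.

```python
from itertools import product

def combinatorial_lines(n):
    """Yield every combinatorial line in [3]^n as a triple of strings.

    Each line comes from a template in {1,2,3,*}^n with at least one
    wildcard *; its three points substitute 1, 2, 3 for every *.
    """
    for template in product("123*", repeat=n):
        if "*" not in template:
            continue
        yield tuple("".join(digit if c == "*" else c for c in template)
                    for digit in "123")

def contains_line(A, n):
    """Test whether the subset A of [3]^n contains a combinatorial line."""
    A = set(A)
    return any(all(point in A for point in line)
               for line in combinatorial_lines(n))
```

For instance, <code>contains_line({"11", "21", "31"}, 2)</code> holds (the line with template <code>*1</code>), while any set with fewer than three points is necessarily line-free, since the three points of a line are distinct.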
'''DHJ(3), Version 6'''. If the free group <math>F_3</math> on three generators 1, 2, 3, acts in a measure-preserving fashion <math>(T_w: X \to X)_{w \in F_3}</math> on a probability space X, and E is a set of positive measure in X, then there exists a combinatorial line <math>w^1,w^2,w^3</math> such that <math>T_{w^1}^{-1}(E) \cap T_{w^2}^{-1}(E) \cap T_{w^3}^{-1}(E)</math> has positive measure.
To prove this equivalence, it is convenient to introduce an intermediate stage, in which the group action is replaced by a groupoid action:
'''DHJ(3), Version 6a'''. Let <math>(X_n)_{n \in {\Bbb Z}}</math> be a family of probability spaces, and for each <math>w \in F_3</math> of length <math>|w| \in {\Bbb Z}</math> (where the inverse generators <math>1^{-1}, 2^{-1}, 3^{-1}</math> are considered to have length -1) and <math>n \in {\Bbb Z}</math>, let <math>T_{w, n+|w|}: X_{n+|w|} \to X_n</math> be an invertible measure-preserving map obeying the groupoid law
:<math>T_{vw, n+|vw|} = T_{v,n+|v|} T_{w,n+|vw|}</math> (1)
for all integers n and group elements <math>v,w \in F_3</math>. If <math>E_0</math> is a set of positive measure in <math>X_0</math>, then there exists a combinatorial line <math>w^1,w^2,w^3 \in [3]^n</math> such that <math>T_{w^1,n}^{-1}(E_0) \cap T_{w^2,n}^{-1}(E_0) \cap T_{w^3,n}^{-1}(E_0)</math> has positive measure in <math>X_n</math>.
== Version 1 implies Version 6 ==
This is easy. Let <math>\delta</math> be the measure of E, and let n be as in Version 1. From the first moment method and the measure-preserving nature of T, we see that the expected density of the set <math>\{ w \in [3]^n: x \in T_w^{-1}(E) \}</math>, as x ranges over X, is exactly <math>\delta</math>. Thus for a set of x of positive measure, this set has density at least <math>\delta</math>, and hence by Version 1 contains a combinatorial line. Since <math>[3]^n</math> contains only finitely many combinatorial lines, the pigeonhole principle gives a single line <math>w^1,w^2,w^3</math> such that <math>T_{w^1}^{-1}(E) \cap T_{w^2}^{-1}(E) \cap T_{w^3}^{-1}(E)</math> has positive measure, as claimed.
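The first moment computation can be spelled out as follows, writing <math>W(x) := \{ w \in [3]^n : x \in T_w^{-1}(E) \}</math> for the random set above:

```latex
\mathbf{E}_{x \sim \mu}\, \frac{|W(x)|}{3^n}
  = \frac{1}{3^n} \sum_{w \in [3]^n} \mu\bigl( T_w^{-1}(E) \bigr)
  % Fubini: swap the average over x with the sum over w
  = \frac{1}{3^n} \sum_{w \in [3]^n} \mu(E)
  % each T_w preserves \mu, so \mu(T_w^{-1}(E)) = \mu(E)
  = \delta.
```

Since the mean of the density <math>|W(x)|/3^n</math> is exactly <math>\delta</math>, this density must be at least <math>\delta</math> on a set of x of positive measure.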
== Version 6 implies Version 6a ==
This is also easy. Let the setup be as in Version 6a. We define the product space <math>X := \prod_{n \in {\Bbb Z}} X_n</math>, and give it the action <math>T_w: X \to X</math> of the free group <math>F_3</math> by the formula
:<math>T_w ( x_n )_{n \in {\Bbb Z}} := ( T_{w,n+|w|} x_{n+|w|} )_{n \in {\Bbb Z}};</math>
the groupoid law (1) ensures that this is an action, and the measure-preserving nature of the <math>T_{w,n+|w|}</math> ensures that this is measure-preserving. The claim now follows by applying Version 6 to the cylinder set <math>E_0 \times \prod_{n \neq 0} X_n</math>.
== Version 6a implies Version 1 ==
This is the hard direction. We argue by contradiction. If Version 1 failed, then there exists <math>\delta > 0</math>, a sequence of n going off to infinity, and a sequence of line-free sets <math>A^{(n)} \subset [3]^n</math>, each of density at least <math>\delta</math>.
We let <math>m = m^{(n)}</math> be an integer growing slowly with n. For each <math>i \geq 0</math>, let <math>B_i</math> be the ball of radius i in <math>F_3</math> (i.e. the set of all reduced words of length at most i formed from the generators <math>1,2,3,1^{-1},2^{-1},3^{-1}</math>). For any d, define a ''simple'' d-dimensional [[combinatorial subspace]] to be a d-dimensional combinatorial subspace whose d wildcards <math>*_1,\ldots,*_d</math> each appear in just a single position <math>a_1,\ldots,a_d \in [n]</math>; we refer to these positions as the ''directions'' of the simple subspace. We say that a simple (d+1)-dimensional subspace is a ''j-extension'' of a simple d-dimensional subspace for some j=1,2,3 if the latter subspace can be obtained from the former by replacing the final wildcard <math>*_{d+1}</math> with j; we then refer to the latter subspace as the ''j-slice'' of the former.
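These slice and extension operations can be modelled directly in code. The following is a minimal sketch with a representation of our own choosing: a simple subspace is stored as a template, a tuple whose entries are the frozen digits '1', '2', '3' or integer wildcard indices 1..d.

```python
def j_slice(template, j):
    """Replace the final wildcard *_d by the digit j, dropping one dimension.

    Assumes the template has at least one wildcard (an integer entry).
    """
    d = max(c for c in template if isinstance(c, int))
    return tuple(j if c == d else c for c in template)

def j_extensions(template, j):
    """All simple (d+1)-dimensional templates whose j-slice is `template`:
    each one promotes a frozen coordinate equal to j to the wildcard *_{d+1}.
    """
    d = max((c for c in template if isinstance(c, int)), default=0)
    return [template[:pos] + (d + 1,) + template[pos + 1:]
            for pos, c in enumerate(template) if c == j]
```

For example, the 1-dimensional template <code>('1', 1, '2')</code> in <math>[3]^3</math> has 3-slice <code>('1', '3', '2')</code>, and exactly one 1-extension, obtained by promoting the frozen '1' in the first position; taking the 1-slice of that extension recovers the original template.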
We will assign a random simple <math>m+|w|</math>-dimensional subspace <math>V_w = \phi_w( [3]^{m+|w|} ) \subset [3]^n</math> to each word <math>w \in B_m</math>, by the following recursive procedure:
* To the identity element id we assign <math>V_{id}</math>, chosen uniformly at random among all m-dimensional simple combinatorial subspaces in <math>[3]^n</math> such that the digits 1, 2, 3 each appear at least r times amongst the frozen coordinates.
* Now suppose recursively that <math>V_v</math> has been chosen for all <math>v \in B_i</math>, for some <math>0 \leq i < m</math>.
* If <math>v \in B_i \backslash B_{i-1}</math> and <math>j \in \{1,2,3\}</math>, we define <math>V_{vj^{-1}}</math> to be the j-slice of <math>V_v</math>.
* If <math>v \in B_i \backslash B_{i-1}</math> and <math>j \in \{1,2,3\}</math>, we draw <math>V_{vj}</math> uniformly and independently at random from all j-extensions of <math>V_v</math>.
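The whole recursion can be sketched as runnable code, under assumptions of our own: group elements are encoded as tuples of signed generators (with <math>-j</math> standing for <math>j^{-1}</math>), templates are tuples with integer wildcard indices as above, the parameter r is left implicit, and n is taken large enough that every digit remains available for extension.

```python
import random

GENS = (1, 2, 3, -1, -2, -3)  # -j encodes the inverse generator j^{-1}

def ball(radius):
    """All reduced words of F_3 of length at most `radius`."""
    words, frontier = {()}, {()}
    for _ in range(radius):
        frontier = {w + (g,) for w in frontier for g in GENS
                    if not (w and w[-1] == -g)}
        words |= frontier
    return words

def j_slice(t, j):
    """Freeze the final wildcard of template t to the digit j."""
    d = max(c for c in t if isinstance(c, int))
    return tuple(j if c == d else c for c in t)

def random_j_extension(t, j, rng):
    """Promote a random frozen digit j of template t to a new wildcard."""
    d = max((c for c in t if isinstance(c, int)), default=0)
    pos = rng.choice([i for i, c in enumerate(t) if c == j])
    return t[:pos] + (d + 1,) + t[pos + 1:]

def assign_subspaces(n, m, rng):
    """Assign a random simple (m+|w|)-dimensional subspace V_w of [3]^n
    to every word w in the ball B_m, following the recursion above."""
    # V_id: m random wildcard positions, every other coordinate a random digit
    t = [rng.choice("123") for _ in range(n)]
    for k, pos in enumerate(rng.sample(range(n), m)):
        t[pos] = k + 1
    V = {(): tuple(t)}
    for i in range(m):
        for v in (w for w in ball(i) if len(w) == i):
            for j in (1, 2, 3):
                if not (v and v[-1] == j):    # v j^{-1} is a reduced word
                    V[v + (-j,)] = j_slice(V[v], str(j))
                if not (v and v[-1] == -j):   # v j is a reduced word
                    V[v + (j,)] = random_j_extension(V[v], str(j), rng)
    return V
```

Each reduced word of length i+1 has a unique parent in <math>B_i</math> (drop the last letter), so the recursion assigns every <math>V_w</math> exactly once, and the number of wildcards of <math>V_w</math> equals m plus the signed length of w, as one can check on small instances.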
For each integer s, let <math>\Gamma_s := \{w \in F_3: |w| = s \}</math>, and let <math>X_s := 2^{\Gamma_{-s}}</math> be the power set of the countable set <math>\Gamma_{-s}</math>. We have the maps <math>T_{w,s+|w|}: X_{s+|w|} \to X_s</math> defined by
:<math>T_{w,s+|w|}(B) := w B</math>;
these are invertible and obey the groupoid relation (1).
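As a sanity check on relation (1), one can model these maps on finite sets of words, since <math>T_{w,\cdot}</math> is just left multiplication by w (the encoding is ours: words are tuples of signed generators, with -j standing for <math>j^{-1}</math>):

```python
def mul(v, w):
    """Multiply two reduced words of F_3, cancelling adjacent inverses.

    Words are tuples over +-1, +-2, +-3, with -j encoding j^{-1}.
    """
    out = list(v)
    for g in w:
        if out and out[-1] == -g:
            out.pop()           # cancellation: g^{-1} g = id
        else:
            out.append(g)
    return tuple(out)

def T(w, B):
    """The map T_{w, s+|w|}: B -> wB, acting on a finite set of words."""
    return {mul(w, b) for b in B}
```

Because left multiplication is associative, <code>T(mul(v, w), B)</code> always agrees with <code>T(v, T(w, B))</code>, which is exactly the groupoid law (1).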
We can define a random element <math>x_0^{(n)} \in X_0</math> by the formula
:<math>x_0^{(n)} := \{ w \in B_m \cap \Gamma_0: \phi_w( 1^m ) \in A^{(n)} \}.</math>
The distribution of this random element gives a probability measure <math>\mu_0^{(n)}</math>.
Let <math>E_0 \subset X_0</math> be the cylinder set <math>E_0 := \{ B \subset \Gamma_0: id \in B \}.</math>
'''Lemma 1''' (Density): <math>\mu_0^{(n)}(E_0) \geq \delta + o(1)</math> as <math>n \to \infty</math>.
'''Proof'''. The quantity <math>\mu_0^{(n)}(E_0)</math> is the probability that <math>\phi_{id}(1^m)</math> lies in <math>A^{(n)}</math>. The point <math>\phi_{id}(1^m)</math> is essentially uniformly distributed in <math>[3]^n</math> (up to an error of o(1) in total variation), and since <math>A^{(n)}</math> has density at least <math>\delta</math>, the claim follows. <math>\Box</math>
'''Lemma 2''' (Asymptotic stationarity): If <math>w \in \Gamma_0</math> and E is a cylinder set in <math>X_0</math>, then <math>\mu_0^{(n)}( T_{w,0}^{-1}(E) ) = \mu_0^{(n)}( E ) + o(1)</math> as <math>n \to \infty</math>.
'''Proof'''. The cylinder set E can be defined in terms of a finite number of events <math>\{ B \subset \Gamma_0: w_i \in B \}</math> for <math>i=1,\ldots,k</math> (say). It then suffices to show that the joint distributions of <math>(\phi_{w_1}(1^m),\ldots,\phi_{w_k}(1^m))</math> and <math>(\phi_{w w_1}(1^m),\ldots,\phi_{w w_k}(1^m))</math> on <math>([3]^n)^k</math> differ by o(1) in total variation; this follows from a tedious but routine calculation, which we omit here. <math>\Box</math>
'''Lemma 3''' (Line-free-ness): For any combinatorial line <math>w^1,w^2,w^3</math>, the set <math>E_0 \cap T_{w^2 (w^1)^{-1},0}(E_0) \cap T_{w^3 (w^1)^{-1},0}(E_0)</math> has zero measure in <math>\mu_0^{(n)}</math>.
'''Proof'''. By construction, <math>\phi_{id}(1^m), \phi_{w^2 (w^1)^{-1}}(1^m), \phi_{w^3 (w^1)^{-1}}(1^m)</math> form a combinatorial line in <math>[3]^n</math>, and thus cannot all lie in the line-free set <math>A^{(n)}</math>; the claim follows. <math>\Box</math>
Now let <math>\mu_0</math> be a vague subsequential limit of the <math>\mu_0^{(n)}</math>. By Lemma 1, <math>E_0</math> has positive measure with respect to <math>\mu_0</math>, and by Lemma 2 the probability measure <math>\mu_0</math> is invariant with respect to <math>T_{w,0}</math> whenever <math>w \in \Gamma_0</math>. We can then uniquely define a probability measure <math>\mu_s</math> on <math>X_s</math> for each s by requiring that <math>T_{w,s}: X_s \to X_0</math> map <math>\mu_s</math> to <math>\mu_0</math> for all <math>w \in \Gamma_s</math>. The hypotheses of Version 6a are now satisfied, and so there exists a combinatorial line <math>w^1,w^2,w^3 \in [3]^s</math> such that the set <math>E_0 \cap T_{w^2 (w^1)^{-1},0}(E_0) \cap T_{w^3 (w^1)^{-1},0}(E_0)</math> has positive measure with respect to <math>\mu_0</math>. But this set is a cylinder set, so its <math>\mu_0</math>-measure is the limit of its <math>\mu_0^{(n)}</math>-measures, which vanish by Lemma 3; this contradiction completes the proof.