Selberg sieve variational problem

From Polymath1Wiki


Revision as of 06:07, 13 December 2013

Let Mk be the quantity

\displaystyle M_k := \sup_F \frac{\sum_{m=1}^k J_k^{(m)}(F)}{I_k(F)}

where F ranges over square-integrable functions on the simplex

\displaystyle {\mathcal R}_k := \{ (t_1,\ldots,t_k) \in [0,+\infty)^k: t_1+\ldots+t_k \leq 1 \}

with I_k, J_k^{(m)} being the quadratic forms

\displaystyle I_k(F) := \int_{{\mathcal R}_k} F(t_1,\ldots,t_k)^2\ dt_1 \ldots dt_k

and

\displaystyle J_k^{(m)}(F) := \int_{{\mathcal R}_{k-1}} (\int_0^{1-\sum_{i \neq m} t_i} F(t_1,\ldots,t_k)\ dt_m)^2 dt_1 \ldots dt_{m-1} dt_{m+1} \ldots dt_k.
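As a sanity check on these definitions (an illustration, not part of the original page), for the constant function F ≡ 1 both forms can be evaluated in closed form using the Dirichlet integral \int_{{\mathcal R}_n} (1-t_1-\ldots-t_n)^a\ dt = a!/(n+a)!: one gets I_k(1) = 1/k! and J_k^{(m)}(1) = 2/(k+1)!, so \sum_m J_k^{(m)}(1)/I_k(1) = 2k/(k+1), a weak lower bound on M_k. A minimal Python sketch:

```python
from math import factorial

def ratio_constant_F(k):
    # For F = 1 on the simplex R_k:
    #   I_k(1) = vol(R_k) = 1/k!
    #   J_k^(m)(1) = integral over R_{k-1} of (1 - t_1 - ... - t_{k-1})^2 = 2/(k+1)!
    # so sum_m J_k^(m)(1) / I_k(1) = k * (2/(k+1)!) * k! = 2k/(k+1).
    I = 1.0 / factorial(k)
    J = 2.0 / factorial(k + 1)
    return k * J / I

for k in (4, 5, 10):
    print(k, ratio_constant_F(k))  # equals 2k/(k+1)
```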

It is known that DHL[k,m + 1] holds whenever EH[θ] holds and M_k > \frac{2m}{\theta}. Thus for instance, M_k > 2 implies DHL[k,2] on the Elliott-Halberstam conjecture (θ = 1), and M_k > 4 implies DHL[k,2] unconditionally (taking θ arbitrarily close to 1/2, as allowed by the Bombieri-Vinogradov theorem).


Upper bounds

We have the upper bound

\displaystyle M_k \leq \frac{k}{k-1} \log k (1)

which may be proven as follows.

The key estimate is

\displaystyle (\int_0^{1-t_2-\ldots-t_k} F(t_1,\ldots,t_k)\ dt_1)^2 \leq \frac{\log k}{k-1} \int_0^{1-t_2-\ldots-t_k} F(t_1,\ldots,t_k)^2 (1 - t_1-\ldots-t_k+ kt_1)\ dt_1. (2)

Assuming this estimate, we may integrate in t_2,\ldots,t_k to conclude that

\displaystyle J_k^{(1)}(F) \leq \frac{\log k}{k-1} \int_{{\mathcal R}_k} F^2 (1-t_1-\ldots-t_k+kt_1)\ dt_1 \ldots dt_k

which symmetrises (using \sum_{m=1}^k (1 - t_1-\ldots-t_k + kt_m) = k) to

\displaystyle \sum_{m=1}^k J_k^{(m)}(F) \leq k \frac{\log k}{k-1} \int_{{\mathcal R}_k} F^2\ dt_1 \ldots dt_k

giving the desired upper bound (1).

It remains to prove (2). By Cauchy-Schwarz, it suffices to show that

\displaystyle \int_0^{1-t_2-\ldots-t_k} \frac{dt_1}{1 - t_1-\ldots-t_k+ kt_1} \leq \frac{\log k}{k-1}.

But writing s = t_2+\ldots+t_k, the integrand becomes \frac{1}{1-s+(k-1)t_1}, and so the left-hand side evaluates to

\frac{1}{k-1} (\log (k(1-s)) - \log (1-s) ) = \frac{\log k}{k-1}

as required.
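This evaluation can be checked numerically. The sketch below (an illustrative addition, not part of the original argument) approximates \int_0^{1-s} dt_1/(1-s+(k-1)t_1) by the midpoint rule and compares it to \log k/(k-1); note the answer is independent of s.

```python
from math import log

def lhs_integral(k, s, n=200000):
    # Midpoint-rule approximation of the integral over [0, 1-s] of
    #   dt_1 / (1 - s + (k-1) t_1),
    # since 1 - t_1 - ... - t_k + k t_1 = 1 - s + (k-1) t_1 with s = t_2 + ... + t_k.
    a, b = 0.0, 1.0 - s
    h = (b - a) / n
    return h * sum(1.0 / (1.0 - s + (k - 1) * (a + (i + 0.5) * h)) for i in range(n))

for k in (4, 10, 59):
    for s in (0.0, 0.3, 0.9):
        assert abs(lhs_integral(k, s) - log(k) / (k - 1)) < 1e-6
```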

Lower bounds

...

World records

k    Lower bound   Upper bound
4    1.845         1.848
5    2.001162      2.011797
10   2.53          2.55842
20   3.05          3.1534
30   3.34          3.51848
40   3.52          3.783466
50   3.66          3.99186
59   3.95608       4.1479398

All upper bounds come from (1).
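For reference, the bound (1) is straightforward to tabulate; this short script (an illustrative addition) reproduces the upper-bound column above.

```python
from math import log

def upper_bound(k):
    # The bound (1): M_k <= (k/(k-1)) * log k.
    return k / (k - 1) * log(k)

for k in (4, 5, 10, 20, 30, 40, 50, 59):
    print(k, round(upper_bound(k), 6))
```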

More general variational problems

It appears that for the purposes of establishing DHL-type theorems, one can enlarge the class of F over which one takes the supremum (extending the range of integration in the definition of J_k^{(m)}(F) accordingly). Firstly, one can enlarge the simplex {\mathcal R}_k to the larger region

{\mathcal R}'_k = \{ (t_1,\ldots,t_k) \in [0,1]^k: t_1+\ldots+t_k \leq 1 + \min(t_1,\ldots,t_k) \}

provided that one works with a generalisation of EH[θ] which controls more general Dirichlet convolutions than the von Mangoldt function (a precise assertion in this regard may be found in BFI). In fact one should be able to work in any larger region R for which

R + R \subset \{ (t_1,\ldots,t_k) \in [0,2/\theta]^k: t_1+\ldots+t_k \leq 2 + \max(t_1,\ldots,t_k) \} \cup \frac{2}{\theta} \cdot {\mathcal R}_k

provided that all the marginal distributions of F are supported on {\mathcal R}_{k-1}, thus (assuming F is symmetric)

\int_0^\infty F(t_1,\ldots,t_{k-1},t_k)\ dt_k = 0 when t_1+\ldots+t_{k-1} > 1.

For instance, one can take R = \frac{1}{\theta} \cdot {\mathcal R}_k, or one can take R = \{ (t_1,\ldots,t_k) \in [0,1/\theta]^k: t_1 +\ldots +t_{k-1} \leq 1 \} (although the latter option breaks the symmetry for F). Perhaps other choices are also possible.
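As an illustration (not in the original), one can probe the containment condition on R + R by random sampling. The sketch below draws points of {\mathcal R}'_k by rejection and checks that their pairwise sums land in the prescribed union, for θ = 1; indeed x, y \in {\mathcal R}'_k gives t_1+\ldots+t_k \leq 2 + \min x + \min y \leq 2 + \max(t_1,\ldots,t_k) for t = x + y, so the first set in the union already suffices.

```python
import random

def sample_Rprime(k, rng):
    # Rejection-sample a point of R'_k = {t in [0,1]^k : sum t <= 1 + min t}.
    while True:
        t = [rng.random() for _ in range(k)]
        if sum(t) <= 1 + min(t):
            return t

def in_target(t, theta=1.0):
    # Membership in {t in [0,2/theta]^k : sum t <= 2 + max t}  union  (2/theta)*R_k.
    if all(0 <= ti <= 2 / theta for ti in t) and sum(t) <= 2 + max(t):
        return True
    return sum(t) <= 2 / theta  # (2/theta)*R_k, given t >= 0

rng = random.Random(0)
k = 4
for _ in range(1000):
    x, y = sample_Rprime(k, rng), sample_Rprime(k, rng)
    assert in_target([a + b for a, b in zip(x, y)])
```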
