Estimating a sum


For any [math]\displaystyle{ \sigma, t\gt 0 }[/math] and natural number [math]\displaystyle{ N }[/math], introduce the sum

[math]\displaystyle{ F_{\sigma,t}(N) := \sum_{n=1}^N \frac{1}{n^{\sigma + \frac{t}{4} \log\frac{N^2}{n}}}. }[/math]

This sum appears a number of times in the Polymath15 project, and it is therefore of interest to estimate it.
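
For concreteness, the sum can be evaluated directly by brute force for moderate [math]\displaystyle{ N }[/math]. The following is a minimal Python sketch (the function name F_sum is just for illustration, and ordinary double precision is used rather than interval arithmetic):

```python
import math

def F_sum(sigma, t, N):
    """Direct evaluation of F_{sigma,t}(N) = sum_{n=1}^N n^{-(sigma + (t/4) log(N^2/n))}."""
    return sum(n ** (-(sigma + (t / 4) * math.log(N * N / n))) for n in range(1, N + 1))
```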

Lemma 1 Let [math]\displaystyle{ \sigma,t\gt 0 }[/math] and [math]\displaystyle{ N \geq N_0 \geq 1 }[/math]. Then

[math]\displaystyle{ \sum_{n=1}^{N_0} \frac{1}{n^{\sigma + \frac{t}{4} \log\frac{N^2}{n}}} \leq F_{\sigma,t}(N) \leq \sum_{n=1}^{N_0} \frac{1}{n^{\sigma + \frac{t}{4} \log\frac{N^2}{n}}} + \max( N_0^{1-\sigma - \frac{t}{4} \log \frac{N^2}{N_0}}, N^{1-\sigma - \frac{t}{4} \log N} ) \log \frac{N}{N_0}. }[/math]

Proof The left-hand inequality is obvious. To prove the right-hand inequality, observe (from writing the summand as [math]\displaystyle{ \frac{\exp( \frac{t}{4} (\log N - \log n)^2 )}{n^\sigma \exp(\frac{t}{4} \log^2 N)} }[/math]) that the summand is decreasing for [math]\displaystyle{ 1 \leq n \leq N }[/math], hence by the integral test one has

[math]\displaystyle{ F_{\sigma,t}(N) \leq \sum_{n=1}^{N_0} \frac{1}{n^{\sigma + \frac{t}{4} \log\frac{N^2}{n}}} + \int_{N_0}^N \frac{1}{a^{\sigma + \frac{t}{4} \log \frac{N^2}{a}}}\ da. }[/math]

Making the change of variables [math]\displaystyle{ a = e^u }[/math], the right-hand side becomes

[math]\displaystyle{ \sum_{n=1}^{N_0} \frac{1}{n^{\sigma + \frac{t}{4} \log\frac{N^2}{n}}} + \int_{\log N_0}^{\log N} \exp( (1-\sigma) u + \frac{t}{4} (u^2 - 2u \log N) )\ du. }[/math]

The expression [math]\displaystyle{ (1-\sigma) u + \frac{t}{4} (u^2 - 2u \log N) }[/math] is convex in [math]\displaystyle{ u }[/math], and is thus bounded by the maximum of its values at the endpoints [math]\displaystyle{ u = \log N_0, \log N }[/math]; thus

[math]\displaystyle{ \exp( (1-\sigma) u + \frac{t}{4} (u^2 - 2u \log N) ) \leq \max( N_0^{1-\sigma - \frac{t}{4} \log \frac{N^2}{N_0}}, N^{1-\sigma - \frac{t}{4} \log N} ). }[/math]

Since the interval of integration has length [math]\displaystyle{ \log \frac{N}{N_0} }[/math], the claim follows. [math]\displaystyle{ \Box }[/math]
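
The upper bound of Lemma 1 is cheap to evaluate, since it only requires the first [math]\displaystyle{ N_0 }[/math] terms. A minimal sketch, in the same spirit as the helper above (plain floating point, so this is a sanity check rather than a rigorous bound; the function name is illustrative):

```python
import math

def lemma1_upper(sigma, t, N, N0):
    """Upper bound from Lemma 1: the truncated sum up to N0 plus the tail term
    max(N0^{1-sigma-(t/4)log(N^2/N0)}, N^{1-sigma-(t/4)log N}) * log(N/N0)."""
    head = sum(n ** (-(sigma + (t / 4) * math.log(N * N / n))) for n in range(1, N0 + 1))
    tail = max(N0 ** (1 - sigma - (t / 4) * math.log(N * N / N0)),
               N ** (1 - sigma - (t / 4) * math.log(N))) * math.log(N / N0)
    return head + tail
```

Note that the truncated sum here is not [math]\displaystyle{ F_{\sigma,t}(N_0) }[/math], since the exponent still involves [math]\displaystyle{ \log \frac{N^2}{n} }[/math] rather than [math]\displaystyle{ \log \frac{N_0^2}{n} }[/math].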

Thus for instance if [math]\displaystyle{ \sigma = 0.7 }[/math], [math]\displaystyle{ t = 0.4 }[/math], and [math]\displaystyle{ N \geq N_0 = 2000 }[/math], one has

[math]\displaystyle{ F_{0.7, 0.4}(N) \leq \sum_{n=1}^{2000} \frac{1}{n^{0.7 + 0.1 \log \frac{N^2}{n}}} + \max( 2000^{0.3 - 0.1 \log \frac{N^2}{2000}}, N^{0.3 - 0.1 \log N}) \log \frac{N}{2000}. }[/math]

One can compute numerically that the second term on the RHS is at most 0.0087, thus

[math]\displaystyle{ F_{0.7, 0.4}(N) \leq \sum_{n=1}^{2000} \frac{1}{n^{0.7 + 0.1 \log \frac{N^2}{n}}} + 0.0087 }[/math]

for all [math]\displaystyle{ N \geq 2000 }[/math]. In particular

[math]\displaystyle{ F_{0.7, 0.4}(N) \leq F_{0.7, 0.4}(2000) + 0.0087 \leq 1.706. }[/math]
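
These values can be reproduced (non-rigorously, in double precision) by scanning the second term over a range of [math]\displaystyle{ N \geq 2000 }[/math] and evaluating the truncated sum at [math]\displaystyle{ N_0 = 2000 }[/math]; a sketch, reusing the helpers above:

```python
def second_term(sigma, t, N, N0=2000):
    """The tail term from Lemma 1 with the stated choice of N0."""
    return max(N0 ** (1 - sigma - (t / 4) * math.log(N * N / N0)),
               N ** (1 - sigma - (t / 4) * math.log(N))) * math.log(N / N0)

# Both expressions in the maximum tend to 0 as N -> infinity, so a sufficiently long
# finite scan locates the maximum; the value observed should be at most 0.0087.
print(max(second_term(0.7, 0.4, N) for N in range(2000, 200000)))
# F_{0.7,0.4}(2000) itself; together with 0.0087 this should stay below 1.706.
print(F_sum(0.7, 0.4, 2000))
```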

Similarly one has

[math]\displaystyle{ F_{0.3, 0.4}(N) \leq \sum_{n=1}^{2000} \frac{1}{n^{0.3 + 0.1 \log \frac{N^2}{n}}} + \max( 2000^{0.7 - 0.1 \log \frac{N^2}{2000}}, N^{0.7 - 0.1 \log N}) \log \frac{N}{2000}. }[/math]

The second term can be shown to be at most [math]\displaystyle{ 0.253 }[/math], thus

[math]\displaystyle{ F_{0.3, 0.4}(N) \leq F_{0.3, 0.4}(2000) + 0.253 \leq 3.469 }[/math]

for all [math]\displaystyle{ N \geq 2000 }[/math].
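
The same scan with [math]\displaystyle{ \sigma = 0.3 }[/math] recovers the bound on the second term and the truncated sum in this case (again only as a floating point sanity check, reusing second_term and F_sum from above):

```python
# The maximum of the tail term over N >= 2000 should be at most 0.253.
print(max(second_term(0.3, 0.4, N) for N in range(2000, 200000)))
# F_{0.3,0.4}(2000); together with 0.253 this should stay below 3.469.
print(F_sum(0.3, 0.4, 2000))
```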