Polymath15 test problem
We are initially focusing attention on the following
- Test problem: For [math]\displaystyle{ t=y=0.4 }[/math], can one prove that [math]\displaystyle{ H_t(x+iy) \neq 0 }[/math] for all [math]\displaystyle{ x \geq 0 }[/math]?
If we can show this, it is likely (with the additional use of the argument principle, and some further information on the behaviour of [math]\displaystyle{ H_t(x+iy) }[/math] at [math]\displaystyle{ y=0.4 }[/math]) that one can show that [math]\displaystyle{ H_t(x+iy) \neq 0 }[/math] for all [math]\displaystyle{ y \geq 0.4 }[/math] as well. This would give a new upper bound
- [math]\displaystyle{ \Lambda \leq 0.4 + \frac{1}{2} (0.4)^2 = 0.48 }[/math]
for the de Bruijn-Newman constant.
For very small values of [math]\displaystyle{ x }[/math] we expect to be able to establish this by direct calculation of [math]\displaystyle{ H_t(x+iy) }[/math]. For medium or large values, the strategy is to use a suitable approximation
- [math]\displaystyle{ H_t(x+iy) \approx A + B }[/math]
for some relatively easily computable quantities [math]\displaystyle{ A = A_t(x+iy), B = B_t(x+iy) }[/math] (it may possibly be necessary to use a refined approximation [math]\displaystyle{ A+B-C }[/math] instead). The quantity [math]\displaystyle{ B }[/math] contains a non-zero main term [math]\displaystyle{ B_0 }[/math] which is expected to roughly dominate. To show [math]\displaystyle{ H_t(x+iy) }[/math] is non-zero, it would suffice to show that
- [math]\displaystyle{ \frac{|H_t - A - B|}{|B_0|} \lt \frac{|A + B|}{|B_0|}. }[/math]
Thus one will seek upper bounds on the error [math]\displaystyle{ \frac{|H_t - A - B|}{|B_0|} }[/math] and lower bounds on [math]\displaystyle{ \frac{|A+B|}{|B_0|} }[/math] for various ranges of [math]\displaystyle{ x }[/math]. Numerically it seems that the RHS stays above 0.4 as soon as [math]\displaystyle{ x }[/math] is moderately large, while the LHS stays below 0.1, which looks promising for the rigorous arguments.
Choices of approximation
There are a number of slightly different approximations we have used in previous discussion. The first approximation (evaluated numerically in the sketch after the list) was
- [math]\displaystyle{ A := \frac{1}{8} \frac{s(s-1)}{2} \pi^{-s/2} \Gamma(s/2) \sum_{n=1}^N \frac{\exp(\frac{t}{16} \log^2 \frac{s+4}{2\pi n^2})}{n^s} }[/math]
- [math]\displaystyle{ B := \frac{1}{8} \frac{s(s-1)}{2} \pi^{-(1-s)/2} \Gamma((1-s)/2) \sum_{n=1}^N \frac{\exp(\frac{t}{16} \log^2 \frac{5-s}{2\pi n^2})}{n^{1-s}} }[/math]
- [math]\displaystyle{ B_0 := \frac{1}{8} \frac{s(s-1)}{2} \pi^{-(1-s)/2} \Gamma((1-s)/2) \exp( \frac{t}{16} \log^2 \frac{5-s}{2\pi} ) }[/math]
- [math]\displaystyle{ s := \frac{1-y+ix}{2} }[/math]
- [math]\displaystyle{ N := \lfloor \sqrt{\frac{\mathrm{Im} s}{2\pi}} \rfloor = \lfloor \sqrt{\frac{x}{4\pi}} \rfloor. }[/math]
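These main terms can be evaluated directly at high precision. The following is a minimal mpmath sketch (an illustration under the definitions above, not the scripts used in the project); the sample point [math]\displaystyle{ x = 10^3 }[/math], [math]\displaystyle{ y=t=0.4 }[/math] and the working precision are arbitrary choices.

```python
# Evaluate the first approximation A, B, B_0 at a given x, y, t (illustrative sketch).
from mpmath import mp, mpc, gamma, exp, log, power, sqrt, floor, pi

mp.dps = 60  # the main terms are astronomically small, so use plenty of digits

def first_approximation(x, y, t):
    s = mpc(1 - y, x) / 2
    N = int(floor(sqrt(x / (4 * pi))))
    prefactor = s * (s - 1) / 16          # equals (1/8) * s(s-1)/2
    A = prefactor * power(pi, -s / 2) * gamma(s / 2) * sum(
        exp(t / 16 * log((s + 4) / (2 * pi * n**2))**2) / power(n, s)
        for n in range(1, N + 1))
    B = prefactor * power(pi, -(1 - s) / 2) * gamma((1 - s) / 2) * sum(
        exp(t / 16 * log((5 - s) / (2 * pi * n**2))**2) / power(n, 1 - s)
        for n in range(1, N + 1))
    B0 = prefactor * power(pi, -(1 - s) / 2) * gamma((1 - s) / 2) * exp(
        t / 16 * log((5 - s) / (2 * pi))**2)
    return A, B, B0

A, B, B0 = first_approximation(1000, 0.4, 0.4)
print(B0)           # should be close to the B_0 entry for x = 10^3 in the table below
print((A + B) / B0)
```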
This first approximation was modified slightly to
- [math]\displaystyle{ A' := \frac{2}{8} \pi^{-s/2} \sqrt{2\pi} \exp( (\frac{s+4}{2}-\frac{1}{2}) \log \frac{s+4}{2} - \frac{s+4}{2}) \sum_{n=1}^N \frac{\exp(\frac{t}{16} \log^2 \frac{s+4}{2\pi n^2})}{n^s} }[/math]
- [math]\displaystyle{ B' := \frac{2}{8} \pi^{-(1-s)/2} \sqrt{2\pi} \exp( (\frac{5-s}{2}-\frac{1}{2}) \log \frac{5-s}{2} - \frac{5-s}{2}) \sum_{n=1}^N \frac{\exp(\frac{t}{16} \log^2 \frac{5-s}{2\pi n^2})}{n^{1-s}} }[/math]
- [math]\displaystyle{ B'_0 := \frac{2}{8} \pi^{-(1-s)/2} \sqrt{2\pi} \exp( (\frac{5-s}{2}-\frac{1}{2}) \log \frac{5-s}{2} - \frac{5-s}{2}) \exp( \frac{t}{16} \log^2 \frac{5-s}{2\pi} ) }[/math]
- [math]\displaystyle{ s := \frac{1-y+ix}{2} }[/math]
- [math]\displaystyle{ N := \lfloor \sqrt{\frac{\mathrm{Im} s}{2\pi}} \rfloor = \lfloor \sqrt{\frac{x}{4\pi}} \rfloor. }[/math]
In Effective bounds on H_t - second approach, a more refined approximation was introduced (a numerical sketch follows the definitions):
- [math]\displaystyle{ A^{eff} := \frac{1}{8} \exp( \frac{t}{4} \alpha_1(\frac{1-y+ix}{2})^2 ) H_{0,1}(\frac{1-y+ix}{2}) \sum_{n=1}^N \frac{1}{n^{\frac{1-y+ix}{2} + \frac{t \alpha_1(\frac{1-y+ix}{2})}{2} - \frac{t}{4} \log n}} }[/math]
- [math]\displaystyle{ B^{eff} := \frac{1}{8} \exp( \frac{t}{4} \overline{\alpha_1(\frac{1+y+ix}{2})}^2 ) \overline{H_{0,1}(\frac{1+y+ix}{2})} \sum_{n=1}^N \frac{1}{n^{\frac{1+y-ix}{2} + \frac{t \overline{\alpha_1(\frac{1+y+ix}{2})}}{2} - \frac{t}{4} \log n}} }[/math]
- [math]\displaystyle{ B^{eff}_0 := \frac{1}{8} \exp( \frac{t}{4} \overline{\alpha_1(\frac{1+y+ix}{2})}^2 ) \overline{H_{0,1}(\frac{1+y+ix}{2})} }[/math]
- [math]\displaystyle{ H_{0,1}(s) := \frac{s (s-1)}{2} \pi^{-s/2} \sqrt{2\pi} \exp( (\frac{s}{2} - \frac{1}{2}) \log \frac{s}{2} - \frac{s}{2} ) }[/math]
- [math]\displaystyle{ \alpha_1(s) := \frac{1}{2s} + \frac{1}{s-1} + \frac{1}{2} \log \frac{s}{2\pi} }[/math]
- [math]\displaystyle{ N := \lfloor \sqrt{ \frac{T'}{2\pi}} \rfloor }[/math]
- [math]\displaystyle{ T' := \frac{x}{2} + \frac{\pi t}{8}. }[/math]
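For orientation, here is a hedged mpmath sketch (not the project's own code) of the quantities just defined: it evaluates [math]\displaystyle{ H_{0,1} }[/math], [math]\displaystyle{ \alpha_1 }[/math], the main term [math]\displaystyle{ B^{eff}_0 }[/math], and the sum [math]\displaystyle{ B^{eff} }[/math] at a sample point.

```python
# Building blocks of the effective approximation: H_{0,1}, alpha_1, B^eff_0, B^eff.
from mpmath import mp, mpc, conj, exp, log, sqrt, power, floor, pi

mp.dps = 60

def H01(s):
    return s * (s - 1) / 2 * power(pi, -s / 2) * sqrt(2 * pi) * exp((s / 2 - 0.5) * log(s / 2) - s / 2)

def alpha1(s):
    return 1 / (2 * s) + 1 / (s - 1) + log(s / (2 * pi)) / 2

def effective_B(x, y, t):
    Tprime = x / mp.mpf(2) + pi * t / 8
    N = int(floor(sqrt(Tprime / (2 * pi))))
    sB = mpc(1 + y, x) / 2
    a = conj(alpha1(sB))
    B0 = exp(t / 4 * a**2) * conj(H01(sB)) / 8
    B = B0 * sum(power(n, -(mpc(1 + y, -x) / 2 + t * a / 2 - t / 4 * log(n)))
                 for n in range(1, N + 1))
    return B0, B

B0, B = effective_B(1000, 0.4, 0.4)
print(B0, B / B0)   # compare with the B^eff_0 and B^eff/B^eff_0 table entries below
```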
Finally, a simplified approximation (evaluated numerically in the sketch after the list) is
- [math]\displaystyle{ A^{toy} := B^{toy}_0 \exp(i ((\frac{x}{2} + \frac{\pi t}{8}) \log \frac{x}{4\pi} - \frac{x}{2} - \frac{\pi}{4} )) N^{-y} \sum_{n=1}^N \frac{1}{n^{\frac{1-y+ix}{2} + \frac{t}{4} \log \frac{N^2}{n} + \pi i t/8}} }[/math]
- [math]\displaystyle{ B^{toy} := B^{toy}_0 \sum_{n=1}^N \frac{1}{n^{\frac{1+y-ix}{2} + \frac{t}{4} \log \frac{N^2}{n} - \pi i t/8}} }[/math]
- [math]\displaystyle{ B^{toy}_0 := \frac{\sqrt{2}}{4} \pi^2 N^{\frac{7+y}{2}} \exp( i (-\frac{x}{4} \log \frac{x}{4\pi} + \frac{x}{4} + \frac{9-y}{8} \pi) + \frac{t}{16} (\log \frac{x}{4\pi} - \frac{\pi i}{2})^2 ) e^{-\pi x/8} }[/math]
- [math]\displaystyle{ N := \lfloor \sqrt{\frac{x}{4\pi}} \rfloor. }[/math]
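A quick mpmath sketch (illustrative only; the sample point is arbitrary) of the toy main term [math]\displaystyle{ B^{toy}_0 }[/math]:

```python
# Toy main term B^toy_0 as displayed above.
from mpmath import mp, mpc, exp, log, sqrt, power, floor, pi

mp.dps = 60
I = mpc(0, 1)

def Btoy0(x, y, t):
    N = int(floor(sqrt(x / (4 * pi))))
    L = log(x / (4 * pi))
    phase = I * (-x * L / 4 + x / mp.mpf(4) + (9 - y) * pi / 8)
    return sqrt(2) / 4 * pi**2 * power(N, (7 + y) / 2) * exp(phase + t / 16 * (L - pi * I / 2)**2) * exp(-pi * x / 8)

print(Btoy0(1000, 0.4, 0.4))   # compare with the B^toy_0 entry for x = 10^3 in the table below
```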
Here is a table comparing the size of the various main terms:
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ B_0 }[/math] | [math]\displaystyle{ B'_0 }[/math] | [math]\displaystyle{ B^{eff}_0 }[/math] | [math]\displaystyle{ B^{toy}_0 }[/math] |
---|---|---|---|---|
[math]\displaystyle{ 10^3 }[/math] | [math]\displaystyle{ (3.4405 + 3.5443 i) \times 10^{-167} }[/math] | [math]\displaystyle{ (3.4204 + 3.5383 i) \times 10^{-167} }[/math] | [math]\displaystyle{ (3.4426 + 3.5411 i) \times 10^{-167} }[/math] | [math]\displaystyle{ (2.3040 + 2.3606 i) \times 10^{-167} }[/math] |
[math]\displaystyle{ 10^4 }[/math] | [math]\displaystyle{ (-1.1843 - 7.7882 i) \times 10^{-1700} }[/math] | [math]\displaystyle{ (-1.1180 - 7.7888 i) \times 10^{-1700} }[/math] | [math]\displaystyle{ (-1.1185 - 7.7879 i) \times 10^{-1700} }[/math] | [math]\displaystyle{ (-1.1155 - 7.5753 i) \times 10^{-1700} }[/math] |
[math]\displaystyle{ 10^5 }[/math] | [math]\displaystyle{ (-7.6133 + 2.5065 i) \times 10^{-17047} }[/math] | [math]\displaystyle{ (-7.6134 + 2.5060 i) \times 10^{-17047} }[/math] | [math]\displaystyle{ (-7.6134 + 2.5059 i) \times 10^{-17047} }[/math] | [math]\displaystyle{ (-7.5483 + 2.4848 i) \times 10^{-17047} }[/math] |
[math]\displaystyle{ 10^6 }[/math] | [math]\displaystyle{ (-3.1615 - 7.7093 i) \times 10^{-170537} }[/math] | [math]\displaystyle{ (-3.1676 - 7.7063 i) \times 10^{-170537} }[/math] | [math]\displaystyle{ (-3.1646 - 7.7079 i) \times 10^{-170537} }[/math] | [math]\displaystyle{ (-3.1590 - 7.6898 i) \times 10^{-170537} }[/math] |
[math]\displaystyle{ 10^7 }[/math] | [math]\displaystyle{ (2.1676 - 9.6330 i) \times 10^{-1705458} }[/math] | [math]\displaystyle{ (2.1711 - 9.6236 i) \times 10^{-1705458} }[/math] | [math]\displaystyle{ (2.1571 - 9.6329 i) \times 10^{-1705458} }[/math] | [math]\displaystyle{ (2.2566 - 9.6000 i) \times 10^{-1705458} }[/math] |
Here are some typical values of [math]\displaystyle{ B/B_0 }[/math] (note that [math]\displaystyle{ B/B_0 }[/math] and [math]\displaystyle{ B'/B'_0 }[/math] are identical):
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ B/B_0 }[/math] | [math]\displaystyle{ B'/B'_0 }[/math] | [math]\displaystyle{ B^{eff}/B^{eff}_0 }[/math] | [math]\displaystyle{ B^{toy}/B^{toy}_0 }[/math] |
---|---|---|---|---|
[math]\displaystyle{ 10^3 }[/math] | [math]\displaystyle{ 0.7722 + 0.6102 i }[/math] | [math]\displaystyle{ 0.7722 + 0.6102 i }[/math] | [math]\displaystyle{ 0.7733 + 0.6101 i }[/math] | [math]\displaystyle{ 0.7626 + 0.6192 i }[/math] |
[math]\displaystyle{ 10^4 }[/math] | [math]\displaystyle{ 0.7434 - 0.0126 i }[/math] | [math]\displaystyle{ 0.7434 - 0.0126 i }[/math] | [math]\displaystyle{ 0.7434 - 0.0126 i }[/math] | [math]\displaystyle{ 0.7434 - 0.0124 i }[/math] |
[math]\displaystyle{ 10^5 }[/math] | [math]\displaystyle{ 1.1218 - 0.3211 i }[/math] | [math]\displaystyle{ 1.1218 - 0.3211 i }[/math] | [math]\displaystyle{ 1.1218 - 0.3211 i }[/math] | [math]\displaystyle{ 1.1219 - 0.3213 i }[/math] |
[math]\displaystyle{ 10^6 }[/math] | [math]\displaystyle{ 1.3956 - 0.5682 i }[/math] | [math]\displaystyle{ 1.3956 - 0.5682 i }[/math] | [math]\displaystyle{ 1.3955 - 0.5682 i }[/math] | [math]\displaystyle{ 1.3956 - 0.5683 i }[/math] |
[math]\displaystyle{ 10^7 }[/math] | [math]\displaystyle{ 1.6400 + 0.0198 i }[/math] | [math]\displaystyle{ 1.6400 + 0.0198 i }[/math] | [math]\displaystyle{ 1.6401 + 0.0198 i }[/math] | [math]\displaystyle{ 1.6400 - 0.0198 i }[/math] |
Here are some typical values of [math]\displaystyle{ A/B_0 }[/math], which seem to be about an order of magnitude smaller than [math]\displaystyle{ B/B_0 }[/math] in many cases:
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ A/B_0 }[/math] | [math]\displaystyle{ A'/B'_0 }[/math] | [math]\displaystyle{ A^{eff}/B^{eff}_0 }[/math] | [math]\displaystyle{ A^{toy}/B^{toy}_0 }[/math] |
---|---|---|---|---|
[math]\displaystyle{ 10^3 }[/math] | [math]\displaystyle{ -0.3856 - 0.0997 i }[/math] | [math]\displaystyle{ -0.3857 - 0.0953 i }[/math] | [math]\displaystyle{ -0.3854 - 0.1002 i }[/math] | [math]\displaystyle{ -0.4036 - 0.0968 i }[/math] |
[math]\displaystyle{ 10^4 }[/math] | [math]\displaystyle{ -0.2199 - 0.0034 i }[/math] | [math]\displaystyle{ -0.2199 - 0.0036 i }[/math] | [math]\displaystyle{ -0.2199 - 0.0033 i }[/math] | [math]\displaystyle{ -0.2208 - 0.0033 i }[/math] |
[math]\displaystyle{ 10^5 }[/math] | [math]\displaystyle{ 0.1543 + 0.1660 i }[/math] | [math]\displaystyle{ 0.1543 + 0.1660 i }[/math] | [math]\displaystyle{ 0.1543 + 0.1660 i }[/math] | [math]\displaystyle{ 0.1544 + 0.1663 i }[/math] |
[math]\displaystyle{ 10^6 }[/math] | [math]\displaystyle{ -0.1013 - 0.1887 i }[/math] | [math]\displaystyle{ -0.1010 - 0.1889 i }[/math] | [math]\displaystyle{ -0.1011 - 0.1890 i }[/math] | [math]\displaystyle{ -0.1012 - 0.1888 i }[/math] |
[math]\displaystyle{ 10^7 }[/math] | [math]\displaystyle{ -0.1018 + 0.1135 i }[/math] | [math]\displaystyle{ -0.1022 + 0.1133 i }[/math] | [math]\displaystyle{ -0.1025 + 0.1128 i }[/math] | [math]\displaystyle{ -0.0986 + 0.1163 i }[/math] |
Controlling |A+B|/|B_0|
Here is some numerical data on [math]\displaystyle{ |(A+B)/B_0| }[/math] source and also on [math]\displaystyle{ \mathrm{Re} \frac{A+B}{B_0} }[/math] source, using a step size of 1 in [math]\displaystyle{ x }[/math], suggesting that this ratio tends to oscillate roughly between 0.5 and 3 for medium values of [math]\displaystyle{ x }[/math]:
range of [math]\displaystyle{ x }[/math] | minimum value | max value | average value | standard deviation | min real part | max real part |
---|---|---|---|---|---|---|
0-1000 | 0.179 | 4.074 | 1.219 | 0.782 | -0.09 | 4.06 |
1000-2000 | 0.352 | 4.403 | 1.164 | 0.712 | 0.02 | 4.43 |
2000-3000 | 0.352 | 4.050 | 1.145 | 0.671 | 0.15 | 3.99 |
3000-4000 | 0.338 | 4.174 | 1.134 | 0.640 | 0.34 | 4.48 |
4000-5000 | 0.386 | 4.491 | 1.128 | 0.615 | 0.33 | 4.33 |
5000-6000 | 0.377 | 4.327 | 1.120 | 0.599 | 0.377 | 4.327 |
[math]\displaystyle{ 1-10^5 }[/math] | 0.179 | 4.491 | 1.077 | 0.455 | -0.09 | 4.48 |
[math]\displaystyle{ 10^5-2 \times 10^5 }[/math] | 0.488 | 3.339 | 1.053 | 0.361 | 0.48 | 3.32 |
[math]\displaystyle{ 2 \times 10^5-3 \times 10^5 }[/math] | 0.508 | 3.049 | 1.047 | 0.335 | 0.50 | 3.00 |
[math]\displaystyle{ 3 \times 10^5-4 \times 10^5 }[/math] | 0.517 | 2.989 | 1.043 | 0.321 | 0.52 | 2.97 |
[math]\displaystyle{ 4 \times 10^5-5 \times 10^5 }[/math] | 0.535 | 2.826 | 1.041 | 0.310 | 0.53 | 2.82 |
[math]\displaystyle{ 5 \times 10^5-6 \times 10^5 }[/math] | 0.529 | 2.757 | 1.039 | 0.303 | 0.53 | 2.75 |
[math]\displaystyle{ 6 \times 10^5-7 \times 10^5 }[/math] | 0.548 | 2.728 | 1.038 | 0.296 | 0.55 | 2.72 |
Here is a computation of the magnitude [math]\displaystyle{ |\frac{d}{dx}(B'/B'_0)| }[/math] of the derivative of [math]\displaystyle{ B'/B'_0 }[/math], sampled at steps of 1 in [math]\displaystyle{ x }[/math] source, together with a crude upper bound coming from the triangle inequality source, to give some indication of the oscillation:
range of [math]\displaystyle{ T=x/2 }[/math] | max value | average value | standard deviation | triangle inequality bound |
---|---|---|---|---|
0-1000 | 1.04 | 0.33 | 0.19 | |
1000-2000 | 1.25 | 0.39 | 0.24 | |
2000-3000 | 1.31 | 0.39 | 0.25 | |
3000-4000 | 1.39 | 0.38 | 0.27 | |
4000-5000 | 1.64 | 0.37 | 0.26 | |
5000-6000 | 1.60 | 0.36 | 0.27 | |
6000-7000 | 1.61 | 0.36 | 0.26 | |
7000-8000 | 1.55 | 0.36 | 0.27 | |
8000-9000 | 1.65 | 0.34 | 0.26 | |
9000-10000 | 1.47 | 0.34 | 0.26 | |
[math]\displaystyle{ 1-10^5 }[/math] | 1.78 | 0.28 | 0.23 | 2.341 |
[math]\displaystyle{ 10^5-2 \times 10^5 }[/math] | 1.66 | 0.22 | 0.18 | 2.299 |
[math]\displaystyle{ 2 \times 10^5-3 \times 10^5 }[/math] | 1.55 | 0.20 | 0.17 | 2.195 |
[math]\displaystyle{ 3 \times 10^5-4 \times 10^5 }[/math] | 1.53 | 0.19 | 0.16 | 2.109 |
[math]\displaystyle{ 4 \times 10^5-5 \times 10^5 }[/math] | 1.31 | 0.18 | 0.15 | 2.039 |
[math]\displaystyle{ 5 \times 10^5-6 \times 10^5 }[/math] | 1.34 | 0.18 | 0.14 | |
[math]\displaystyle{ 6 \times 10^5-7 \times 10^5 }[/math] | 1.33 | 0.17 | 0.14 |
In the toy case, we have
- [math]\displaystyle{ \frac{|A^{toy}+B^{toy}|}{|B^{toy}_0|} \geq |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| }[/math]
where [math]\displaystyle{ b_n := \exp( \frac{t}{4} \log^2 n) }[/math], [math]\displaystyle{ a_n := (n/N)^{y} b_n }[/math], and [math]\displaystyle{ s := \frac{1+y+ix}{2} + \frac{t}{2} \log N + \frac{\pi i t}{8} }[/math]. For the effective approximation one has
- [math]\displaystyle{ \frac{|A^{eff}+B^{eff}|}{|B^{eff}_0|} \geq |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \quad (2.1) }[/math]
where now [math]\displaystyle{ b_n := \exp( \frac{t}{4} \log^2 n) }[/math], [math]\displaystyle{ s := \frac{1+y+ix}{2} + \frac{t}{2} \alpha_1(\frac{1+y+ix}{2}) }[/math], and
- [math]\displaystyle{ a_n := |\frac{\exp( \frac{t}{4} \alpha_1(\frac{1-y+ix}{2})^2 ) H_{0,1}( \frac{1-y+ix}{2} )}{ \exp( \frac{t}{4} \alpha_1(\frac{1+y+ix}{2})^2 ) H_{0,1}( \frac{1+y+ix}{2} ) }| n^{y - \frac{t}{2} \alpha_1(\frac{1-y+ix}{2}) + \frac{t}{2} \alpha_1(\frac{1+y+ix}{2})} b_n. }[/math]
It is thus of interest to obtain lower bounds for expressions of the form
- [math]\displaystyle{ |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| }[/math]
in situations where [math]\displaystyle{ b_1=1 }[/math] is expected to be a dominant term.
From the triangle inequality one obtains the lower bound
- [math]\displaystyle{ |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \geq 1 - |a_1| - \sum_{n=2}^N \frac{|a_n|+|b_n|}{n^\sigma} }[/math]
where [math]\displaystyle{ \sigma := \frac{1+y}{2} + \frac{t}{2} \log N }[/math] is the real part of [math]\displaystyle{ s }[/math]. There is a refinement:
Lemma 1 If [math]\displaystyle{ a_n,b_n }[/math] are real coefficients with [math]\displaystyle{ b_1 = 1 }[/math] and [math]\displaystyle{ 0 \leq a_1 \lt 1 }[/math] we have
- [math]\displaystyle{ |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \geq 1 - a_1 - \sum_{n=2}^N \frac{\max( |b_n-a_n|, \frac{1-a_1}{1+a_1} |b_n+a_n|)}{n^\sigma}. }[/math]
Proof By a continuity argument we may assume without loss of generality that the left-hand side is positive; we may then write it as
- [math]\displaystyle{ |\sum_{n=1}^N \frac{b_n - e^{i\theta} a_n}{n^s}| }[/math]
for some phase [math]\displaystyle{ \theta }[/math]. By the triangle inequality, this is at least
- [math]\displaystyle{ |1 - e^{i\theta} a_1| - \sum_{n=2}^N \frac{|b_n - e^{i\theta} a_n|}{n^\sigma}. }[/math]
We factor out [math]\displaystyle{ |1 - e^{i\theta} a_1| }[/math], which is at least [math]\displaystyle{ 1-a_1 }[/math], to obtain the lower bound
- [math]\displaystyle{ (1-a_1) (1 - \sum_{n=2}^N \frac{|b_n - e^{i\theta} a_n| / |1 - e^{i\theta} a_1|}{n^\sigma}). }[/math]
By the cosine rule, we have
- [math]\displaystyle{ (|b_n - e^{i\theta} a_n| / |1 - e^{i\theta} a_1|)^2 = \frac{b_n^2 + a_n^2 - 2 a_n b_n \cos \theta}{1 + a_1^2 -2 a_1 \cos \theta}. }[/math]
This is a fractional linear function of [math]\displaystyle{ \cos \theta }[/math] with no poles in the range [math]\displaystyle{ [-1,1] }[/math] of [math]\displaystyle{ \cos \theta }[/math]. Thus this function is monotone on this range and attains its maximum at either [math]\displaystyle{ \cos \theta=+1 }[/math] or [math]\displaystyle{ \cos \theta = -1 }[/math]. We conclude that
- [math]\displaystyle{ \frac{|b_n - e^{i\theta} a_n|}{|1 - e^{i\theta} a_1|} \leq \max( \frac{|b_n-a_n|}{1-a_1}, \frac{|b_n+a_n|}{1+a_1} ) }[/math]
and the claim follows.
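As a concrete illustration (a sketch using the toy normalization [math]\displaystyle{ b_n = \exp(\frac{t}{4} \log^2 n) }[/math], [math]\displaystyle{ a_n = (n/N)^y b_n }[/math], [math]\displaystyle{ \sigma = \frac{1+y}{2} + \frac{t}{2} \log N }[/math]; not the scripts used in the project), here is how the plain triangle-inequality bound and the Lemma 1 bound compare:

```python
# Plain triangle-inequality bound vs the Lemma 1 bound for the toy coefficients.
from mpmath import mp, exp, log

mp.dps = 30

def bounds(N, y=0.4, t=0.4):
    sigma = (1 + y) / 2 + t / 2 * log(N)
    b = [exp(t / 4 * log(n)**2) for n in range(1, N + 1)]
    a = [(mp.mpf(n) / N)**y * bn for n, bn in enumerate(b, start=1)]
    a1 = a[0]
    triangle = 1 - a1 - sum((a[n - 1] + b[n - 1]) / n**sigma for n in range(2, N + 1))
    lemma1 = 1 - a1 - sum(max(abs(b[n - 1] - a[n - 1]),
                              (1 - a1) / (1 + a1) * abs(b[n - 1] + a[n - 1])) / n**sigma
                          for n in range(2, N + 1))
    return triangle, lemma1

# Both are lower bounds for |sum b_n/n^s| - |sum a_n/n^s|; positivity is the goal.
# Per the table in the next section, the triangle bound should turn positive
# around N = 1391 and the Lemma 1 bound around N = 1080 (P = 1, no mollifier).
for N in (1000, 1100, 1400):
    print(N, bounds(N))
```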
We can also mollify the [math]\displaystyle{ a_n,b_n }[/math]:
Lemma 2 If [math]\displaystyle{ \lambda_1,\dots,\lambda_D }[/math] are complex numbers, then
- [math]\displaystyle{ |\sum_{d=1}^D \frac{\lambda_d}{d^s}| (|\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}|) = ( |\sum_{n=1}^{DN} \frac{\tilde b_n}{n^s}| - |\sum_{n=1}^{DN} \frac{\tilde a_n}{n^s}| ) }[/math]
where
- [math]\displaystyle{ \tilde a_n := \sum_{d=1}^D 1_{n \leq dN} 1_{d|n} \lambda_d a_{n/d} }[/math]
- [math]\displaystyle{ \tilde b_n := \sum_{d=1}^D 1_{n \leq dN} 1_{d|n} \lambda_d b_{n/d} }[/math]
Proof This is immediate from the Dirichlet convolution identities
- [math]\displaystyle{ (\sum_{d=1}^D \frac{\lambda_d}{d^s}) \sum_{n=1}^N \frac{a_n}{n^s} = \sum_{n=1}^{DN} \frac{\tilde a_n}{n^s} }[/math]
and
- [math]\displaystyle{ (\sum_{d=1}^D \frac{\lambda_d}{d^s}) \sum_{n=1}^N \frac{b_n}{n^s} = \sum_{n=1}^{DN} \frac{\tilde b_n}{n^s}. }[/math]
[math]\displaystyle{ \Box }[/math]
Combining the two lemmas, we see for instance that we can show [math]\displaystyle{ |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \gt 0 }[/math] whenever one can find [math]\displaystyle{ \lambda_1,\dots,\lambda_D }[/math] with [math]\displaystyle{ \lambda_1=1 }[/math] and
- [math]\displaystyle{ \sum_{n=2}^{DN} \frac{\max( \frac{|\tilde b_n-\tilde a_n|}{1-a_1}, \frac{|\tilde b_n+ \tilde a_n|}{1+a_1})}{n^\sigma} \lt 1. }[/math]
A usable choice of mollifier seems to be the Euler products
- [math]\displaystyle{ \sum_{d=1}^D \frac{\lambda_d}{d^s} := \prod_{p \leq P} (1 - \frac{b_p}{p^s}) }[/math]
which are designed to kill off the first few [math]\displaystyle{ \tilde b_n }[/math] coefficients.
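Here is a hedged sketch (illustrative only) that builds the Euler-product mollifier as Dirichlet coefficients [math]\displaystyle{ \lambda_d }[/math], forms [math]\displaystyle{ \tilde a_n, \tilde b_n }[/math] as in Lemma 2, and evaluates the criterion displayed above, again for the toy coefficients:

```python
# Euler-product mollifier prod_{p<=P}(1 - b_p p^{-s}) expanded into lambda_d,
# Dirichlet convolution as in Lemma 2, then the mollified Lemma 1 criterion.
from mpmath import mp, exp, log
from itertools import combinations

mp.dps = 30

def mollified_criterion(N, P=3, y=0.4, t=0.4):
    sigma = (1 + y) / 2 + t / 2 * log(N)
    b = {n: exp(t / 4 * log(n)**2) for n in range(1, N + 1)}
    a = {n: (mp.mpf(n) / N)**y * b[n] for n in range(1, N + 1)}
    primes = [p for p in (2, 3, 5, 7, 11) if p <= P]   # covers the tabulated P values
    lam = {}                                           # lambda_d = prod_{p|d}(-b_p), d squarefree
    for r in range(len(primes) + 1):
        for subset in combinations(primes, r):
            d, coef = 1, mp.mpf(1)
            for p in subset:
                d *= p
                coef *= -b[p]
            lam[d] = coef
    D = max(lam)
    at = {n: sum(lam[d] * a[n // d] for d in lam if n % d == 0 and n <= d * N)
          for n in range(1, D * N + 1)}
    bt = {n: sum(lam[d] * b[n // d] for d in lam if n % d == 0 and n <= d * N)
          for n in range(1, D * N + 1)}
    a1 = a[1]
    return sum(max(abs(bt[n] - at[n]) / (1 - a1), abs(bt[n] + at[n]) / (1 + a1)) / n**sigma
               for n in range(2, D * N + 1))

# The criterion asks for a value below 1; this should happen roughly at the
# P = 3, Lemma 1 threshold N = 220 reported in the table below.
print(mollified_criterion(220, P=3))
```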
Analysing the toy model
With regards to the toy problem of showing [math]\displaystyle{ A^{toy}+B^{toy} }[/math] does not vanish, here are the least values of [math]\displaystyle{ N }[/math] for which this method works source source source source:
[math]\displaystyle{ P }[/math] in Euler product | [math]\displaystyle{ N }[/math] using triangle inequality | [math]\displaystyle{ N }[/math] using Lemma 1 |
---|---|---|
1 | 1391 | 1080 |
2 | 478 | 341 |
3 | 322 | 220 |
5 | 282 | 192 |
7 | 180 | |
11 | 176 |
Dropping the [math]\displaystyle{ \lambda_6 }[/math] term from the [math]\displaystyle{ P=3 }[/math] Euler factor improves the 220 threshold slightly to 213 source.
Analysing the effective model
The differences between the toy model and the effective model are:
- The real part [math]\displaystyle{ \sigma }[/math] of [math]\displaystyle{ s }[/math] is now [math]\displaystyle{ \frac{1+y}{2} + \frac{t}{2} \mathrm{Re} \alpha_1(\frac{1+y+ix}{2}) }[/math] rather than [math]\displaystyle{ \frac{1+y}{2} + \frac{t}{2} \log N }[/math]. (The imaginary part of [math]\displaystyle{ s }[/math] also changes somewhat.)
- The coefficient [math]\displaystyle{ a_n }[/math] is now given by
- [math]\displaystyle{ a_n = \lambda n^{y + \frac{t}{2} (\alpha_1(\frac{1+y+ix}{2}) - \alpha_1(\frac{1-y+ix}{2}))} b_n }[/math]
rather than [math]\displaystyle{ a_n = N^{-y} n^y b_n }[/math], where
- [math]\displaystyle{ \lambda := |\frac{\exp( \frac{t}{4} \alpha_1(\frac{1-y+ix}{2})^2 ) H_{0,1}( \frac{1-y+ix}{2})}{\exp( \frac{t}{4} \alpha_1(\frac{1+y+ix}{2})^2 ) H_{0,1}( \frac{1+y+ix}{2})}|. }[/math]
Two complications arise here compared with the toy model: firstly, [math]\displaystyle{ \sigma,a_n }[/math] now depend on [math]\displaystyle{ x }[/math] and not just on [math]\displaystyle{ N }[/math], and secondly the [math]\displaystyle{ a_n }[/math] are not quite real-valued, making it more difficult to apply Lemma 1.
However we have good estimates for [math]\displaystyle{ \sigma,a_n }[/math] that depend only on [math]\displaystyle{ N }[/math]. Note that
- [math]\displaystyle{ 2\pi N^2 \leq T' \lt 2\pi (N+1)^2 }[/math]
and hence
- [math]\displaystyle{ x_N \leq x \lt x_{N+1} }[/math]
where
- [math]\displaystyle{ x_N := 4\pi N^2 - \frac{\pi t}{4}. }[/math]
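A quick numerical sanity check (illustrative) that [math]\displaystyle{ N = \lfloor \sqrt{T'/2\pi} \rfloor }[/math] stays constant on the window [math]\displaystyle{ x_N \leq x \lt x_{N+1} }[/math]:

```python
# N = floor(sqrt(T'/(2*pi))) equals N across the whole window [x_N, x_{N+1}).
from mpmath import mp, sqrt, floor, pi

mp.dps = 30
t = 0.4

def x_of(N):
    return 4 * pi * N**2 - pi * t / 4

for N in (5, 100, 1000):
    lo, hi = x_of(N), x_of(N + 1)
    for x in (lo, (lo + hi) / 2, hi - mp.mpf('0.001')):
        Tprime = x / 2 + pi * t / 8
        assert int(floor(sqrt(Tprime / (2 * pi)))) == N
print("window check passed")
```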
To control [math]\displaystyle{ \sigma }[/math], it suffices to obtain lower bounds because our criteria (both the triangle inequality and Lemma 1) become harder to satisfy when [math]\displaystyle{ \sigma }[/math] decreases. We compute
- [math]\displaystyle{ \sigma = \frac{1+y}{2} + \frac{t}{2} \mathrm{Re}(\frac{1}{1+y+ix} + \frac{2}{-1+y+ix} + \frac{1}{2} \log \frac{1+y+ix}{4\pi}) }[/math]
- [math]\displaystyle{ = \frac{1+y}{2} + \frac{t}{2} (\frac{1+y}{(1+y)^2+x^2} + \frac{-2+2y}{(-1+y)^2+x^2} + \frac{1}{2} \log \frac{|1+y+ix|}{4\pi}) }[/math]
- [math]\displaystyle{ \geq \frac{1+y}{2} + \frac{t}{2} (\frac{1+y}{(-1+y)^2+x^2} + \frac{-2+2y}{(-1+y)^2+x^2} + \frac{1}{2} \log \frac{x}{4\pi}) }[/math]
- [math]\displaystyle{ \geq \frac{1+y}{2} + \frac{t}{2} (\frac{3y-1}{(-1+y)^2+x^2} + \log N) }[/math]
- [math]\displaystyle{ \geq \frac{1+y}{2} + \frac{t}{2} \log N }[/math]
assuming that [math]\displaystyle{ y \geq 1/3 }[/math]. Hence we can actually just use the same value of [math]\displaystyle{ \sigma }[/math] as in the toy case.
Next we control [math]\displaystyle{ \lambda }[/math]. Note that we can increase [math]\displaystyle{ \lambda }[/math] (thus multiplying [math]\displaystyle{ \sum_{n=1}^N \frac{a_n}{n^s} }[/math] by a quantity greater than 1) without affecting (2.1), so we just need upper bounds on [math]\displaystyle{ \lambda }[/math]. We may factor
- [math]\displaystyle{ \lambda = \exp( \frac{t}{4} \mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) + \mathrm{Re}( f(\frac{1-y+ix}{2}) - f(\frac{1+y+ix}{2} ) ) }[/math]
where
- [math]\displaystyle{ f(s) := -\frac{s}{2} \log \pi + (\frac{s}{2} - \frac{1}{2}) \log \frac{s}{2} - \frac{s}{2}. }[/math]
By the mean value theorem, we have
- [math]\displaystyle{ \mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) = 2 y \alpha_1(s') \alpha'_1(s') }[/math]
for some [math]\displaystyle{ s' }[/math] between [math]\displaystyle{ \frac{1-y+ix}{2} }[/math] and [math]\displaystyle{ \frac{1+y+ix}{2} }[/math]. We have
- [math]\displaystyle{ \alpha_1(s') = \frac{1}{2s'} + \frac{1}{s'-1} + \frac{1}{2} \log \frac{s'}{2\pi} }[/math]
- [math]\displaystyle{ = O_{\leq}(\frac{1}{x}) + O_{\leq}(\frac{1}{x/2}) + \frac{1}{2} \log \frac{|s'|}{2\pi} + O_{\leq}(\frac{\pi}{4}) }[/math]
- [math]\displaystyle{ = O_{\leq}( \frac{\pi}{4} + \frac{3}{x_N}) + \frac{1}{2} O_{\leq}^{\mathbf{R}}( \log \frac{|1+y+ix_{N+1}|}{4\pi} ) }[/math]
and
- [math]\displaystyle{ \alpha'_1(s') = -\frac{1}{2(s')^2} - \frac{1}{(s'-1)^2} + \frac{1}{2s'} }[/math]
- [math]\displaystyle{ = O_{\leq}(\frac{1}{x^2/2}) + O_{\leq}(\frac{1}{x^2/4}) + \frac{1}{2s'} }[/math]
- [math]\displaystyle{ = O_{\leq}(\frac{6}{x_N^2}) + \frac{1}{2s'} }[/math]
- [math]\displaystyle{ = O_{\leq}(\frac{6}{x_N^2}) + O_{\leq}( \frac{1}{x_N} ). }[/math]
Thus one has
- [math]\displaystyle{ \mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) = 2y O_{\leq}( (\frac{\pi}{4} + \frac{3}{x_N}) (\frac{1}{x_N} + \frac{6}{x_N^2}) ) }[/math]
- [math]\displaystyle{ + 2y O_{\leq}( \log \frac{|1+y+ix_{N+1}|}{4\pi} (\frac{6}{x_N^2} + |\mathrm{Re} \frac{1}{2s'}|) ) }[/math]
Now we have
- [math]\displaystyle{ \mathrm{Re} \frac{1}{2s'} = \frac{\mathrm{Re}(s')}{2|s'|^2} }[/math]
- [math]\displaystyle{ \leq \frac{1+y}{x^2} }[/math]
- [math]\displaystyle{ \leq \frac{1+y}{x_N^2}; }[/math]
also
- [math]\displaystyle{ (\frac{\pi}{4} + \frac{3}{x_N}) (\frac{1}{x_N} + \frac{6}{x_N^2}) \leq \frac{\pi}{4} (1 + \frac{12/\pi}{x_N}) \frac{1}{x_N-6} }[/math]
- [math]\displaystyle{ \leq \frac{\pi}{4} ( \frac{1}{x_N-6} + \frac{12/\pi}{(x_N-6)^2} ) }[/math]
- [math]\displaystyle{ \leq \frac{\pi}{4} \frac{1}{x_N - 6 - 12/\pi}. }[/math]
We conclude that
- [math]\displaystyle{ \mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) = O_{\leq}(\frac{\pi y}{2 (x_N - 6 - 12/\pi)} + \frac{2y(7+y)}{x_N^2} \log \frac{|1+y+ix_{N+1}|}{4\pi}). }[/math]
In a similar vein, from the mean value theorem we have
- [math]\displaystyle{ \mathrm{Re}( f(\frac{1-y+ix}{2}) - f(\frac{1+y+ix}{2}) ) = -y \mathrm{Re} f'(s'') }[/math]
for some [math]\displaystyle{ s'' }[/math] between [math]\displaystyle{ \frac{1-y+ix}{2} }[/math] and [math]\displaystyle{ \frac{1+y+ix}{2} }[/math]. We have
- [math]\displaystyle{ \mathrm{Re} f'(s'') = -\frac{1}{2} \log \pi + \frac{1}{2} \log \frac{|s''|}{2} - \mathrm{Re} \frac{1}{2s''} }[/math]
- [math]\displaystyle{ = \frac{1}{2} \log \frac{|s''|}{2\pi} + O_{\leq}(\frac{\mathrm{Re}(s'')}{2|s''|^2}) }[/math]
- [math]\displaystyle{ \geq \log N + O_{\leq}(\frac{1+y}{x^2}) }[/math]
- [math]\displaystyle{ \geq \log N + O_{\leq}(\frac{1+y}{x_N^2}) }[/math]
and thus
- [math]\displaystyle{ \lambda \leq N^{-y} \exp( \frac{\pi y}{2 (x_N - 6 - 12/\pi)} + \frac{2y(7+y)}{x_N^2} \log \frac{|1+y+ix_{N+1}|}{4\pi} + \frac{y(1+y)}{x_N^2} ) }[/math]
- [math]\displaystyle{ \leq e^\delta N^{-y} }[/math]
where
- [math]\displaystyle{ \delta := \frac{\pi y}{2 (x_N - 6 - \frac{14+2y}{\pi})} + \frac{2y(7+y)}{x_N^2} \log \frac{|1+y+ix_{N+1}|}{4\pi}. }[/math]
Asymptotically we have
- [math]\displaystyle{ \delta = \frac{\pi y}{2 x_N} + O( \frac{\log x_N}{x_N^2} ) = O( \frac{1}{x_N} ). }[/math]
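A hedged numerical check (illustrative; [math]\displaystyle{ \lambda }[/math] is computed from its exact definition earlier in this section) that [math]\displaystyle{ \lambda \leq e^\delta N^{-y} }[/math] at a sample point:

```python
# Compare the exact lambda with the bound e^delta N^{-y}.
from mpmath import mp, mpc, exp, log, sqrt, power, pi

mp.dps = 50
y, t = 0.4, 0.4

def H01(s):
    return s * (s - 1) / 2 * power(pi, -s / 2) * sqrt(2 * pi) * exp((s / 2 - 0.5) * log(s / 2) - s / 2)

def alpha1(s):
    return 1 / (2 * s) + 1 / (s - 1) + log(s / (2 * pi)) / 2

def exact_lambda(x):
    s1, s2 = mpc(1 - y, x) / 2, mpc(1 + y, x) / 2
    return abs(exp(t / 4 * alpha1(s1)**2) * H01(s1) / (exp(t / 4 * alpha1(s2)**2) * H01(s2)))

N = 100
xN = 4 * pi * N**2 - pi * t / 4
xN1 = 4 * pi * (N + 1)**2 - pi * t / 4
delta = pi * y / (2 * (xN - 6 - (14 + 2 * y) / pi)) \
        + 2 * y * (7 + y) / xN**2 * log(abs(mpc(1 + y, xN1)) / (4 * pi))
x = (xN + xN1) / 2
print(exact_lambda(x), exp(delta) * power(N, -y))   # lambda should not exceed the bound
```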
Now we control [math]\displaystyle{ \alpha_1(\frac{1+y+ix}{2}) - \alpha_1(\frac{1-y+ix}{2}) }[/math]. By the mean-value theorem we have
- [math]\displaystyle{ \alpha_1(\frac{1+y+ix}{2}) - \alpha_1(\frac{1-y+ix}{2}) = O_{\leq}( y |\alpha'_1(s''')|) }[/math]
for some [math]\displaystyle{ s''' }[/math] between [math]\displaystyle{ \frac{1+y+ix}{2} }[/math] and [math]\displaystyle{ \frac{1-y+ix}{2} }[/math]. As before we have
- [math]\displaystyle{ \alpha'_1(s''') = -\frac{1}{2(s''')^2} - \frac{1}{(s'''-1)^2} + \frac{1}{2s'''} }[/math]
- [math]\displaystyle{ = O_{\leq}( \frac{1}{x^2/2} + \frac{1}{x^2/4} + \frac{1}{x} ) }[/math]
- [math]\displaystyle{ = O_{\leq}( \frac{1}{x_N} + \frac{6}{x_N^2} ) }[/math]
- [math]\displaystyle{ = O_{\leq}( \frac{1}{x_N-6} ). }[/math]
We conclude that (after replacing [math]\displaystyle{ \lambda }[/math] with [math]\displaystyle{ e^\delta N^{-y} }[/math])
- [math]\displaystyle{ a_n = (n/N)^y \exp( \delta + O_{\leq}( \frac{t y \log n}{2(x_N-6)} ) ) b_n. }[/math]
The triangle inequality argument will thus give [math]\displaystyle{ A^{eff}+B^{eff} }[/math] non-zero as long as
- [math]\displaystyle{ \sum_{n=1}^N \frac{(1 + (n/N)^y \exp( \delta + \frac{t y \log n}{2(x_N-6)} ) ) b_n}{n^\sigma} \lt 2. }[/math]
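A sketch (illustrative only) evaluating this criterion, with [math]\displaystyle{ \sigma = \frac{1+y}{2} + \frac{t}{2} \log N }[/math] as justified above:

```python
# Effective-model triangle-inequality criterion; the goal is a value below 2.
from mpmath import mp, mpc, exp, log, pi

mp.dps = 30
y, t = 0.4, 0.4

def criterion(N):
    xN = 4 * pi * N**2 - pi * t / 4
    xN1 = 4 * pi * (N + 1)**2 - pi * t / 4
    delta = pi * y / (2 * (xN - 6 - (14 + 2 * y) / pi)) \
            + 2 * y * (7 + y) / xN**2 * log(abs(mpc(1 + y, xN1)) / (4 * pi))
    sigma = (1 + y) / 2 + t / 2 * log(N)
    total = mp.mpf(0)
    for n in range(1, N + 1):
        bn = exp(t / 4 * log(n)**2)
        an = (mp.mpf(n) / N)**y * exp(delta + t * y * log(n) / (2 * (xN - 6))) * bn
        total += (bn + an) / n**sigma
    return total

# The threshold should be close to the toy triangle-inequality threshold,
# since delta and the extra exponential factor are tiny for x of this size.
for N in (1000, 1500, 2000):
    print(N, criterion(N))
```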
The situation with using Lemma 1 is a bit more complicated because [math]\displaystyle{ a_n }[/math] is not quite real. We can write [math]\displaystyle{ a_n = e^\delta a_n^{toy} + O_{\leq}( e_n ) }[/math] where
- [math]\displaystyle{ a_n^{toy} := (n/N)^y b_n }[/math]
and
- [math]\displaystyle{ e_n := e^\delta (n/N)^y (\exp( \frac{t y \log n}{2(x_N-6)} ) - 1) b_n }[/math]
and then by Lemma 1 and the triangle inequality we can make [math]\displaystyle{ A^{eff}+B^{eff} }[/math] non-zero as long as
- [math]\displaystyle{ a_1^{toy} + \sum_{n=2}^N \frac{\max( |b_n-a_n^{toy}|, \frac{1+a_1^{toy}}{1-a_1^{toy}} |b_n + a_n^{toy}|)}{n^\sigma} + \sum_{n=1}^N \frac{e_n}{n^\sigma} \lt 1. }[/math]
Controlling |H_t-A-B|/|B_0|
As computed in [Effective bounds on H_t - second attempt], there is an effective bound (the term [math]\displaystyle{ E_3 }[/math] is sketched numerically after the definitions below)
- [math]\displaystyle{ |H_t - A^{eff} - B^{eff}| \leq E_1 + E_2 + E_3 }[/math]
where
- [math]\displaystyle{ H_{0,1}(s) := \frac{s (s-1)}{2} \pi^{-s/2} \sqrt{2\pi} \exp( (\frac{s}{2} - \frac{1}{2}) \log \frac{s}{2} - \frac{s}{2} ) }[/math]
- [math]\displaystyle{ E_1 := \frac{1}{8 (T - 3.33)} \exp( \frac{t}{4} \mathrm{Re} \alpha_1(\frac{1-y+ix}{2})^2 ) |H_{0,1}(\frac{1-y+ix}{2})| \epsilon'(\frac{1-y+ix}{2}) }[/math]
- [math]\displaystyle{ E_2 := \frac{1}{8 (T - 3.33)} \exp( \frac{t}{4} \mathrm{Re} \alpha_1(\frac{1+y+ix}{2})^2 ) |H_{0,1}(\frac{1+y+ix}{2})| \epsilon'(\frac{1+y+ix}{2}) }[/math]
- [math]\displaystyle{ E_3 := \frac{1}{8} \sqrt{\pi} \exp( -\frac{t \pi^2}{64} ) (T')^{3/2} e^{-\pi T/4} \int_{-\infty}^\infty v(\sigma) w(\sigma) f(\sigma)\ d\sigma }[/math]
- [math]\displaystyle{ \epsilon'(s) := \frac{1}{2} \sum_{n=1}^N \frac{1}{n^{\mathrm{Re}(s) + \frac{t \mathrm{Re} \alpha_1(s)}{2} - \frac{t}{4} \log n}} \exp(\frac{1}{2(T-3.33)} (\frac{t^2}{4} |\alpha_1(s) - \log n|^2 + \frac{1}{3} + t)) (\frac{t^2}{4} |\alpha_1(s) - \log n|^2 + \frac{1}{3} + t ) }[/math]
- [math]\displaystyle{ f(\sigma) := \frac{1}{2\sqrt{\pi t}} (e^{-(\sigma-(1-y)/2)^2/t} + e^{-(\sigma-(1+y)/2)^2/t}) \quad (4.1) }[/math]
- [math]\displaystyle{ w(\sigma) := (1 + \frac{\sigma^2}{(T'_0)^2})^{1/2} (1 + \frac{(1-\sigma)^2}{(T'_0)^2})^{1/2} \exp( \frac{(\sigma-1)_+}{4} \log (1 + \frac{\sigma^2}{(T'_0)^2}) + (\frac{T'_0}{2} \arctan \frac{\sigma}{T'_0} - \frac{\sigma}{2}) 1_{\sigma \lt 0} + \frac{1}{12(T'_0 - 0.33)}) }[/math]
- [math]\displaystyle{ v(\sigma) := 1 + (0.400 \frac{9^\sigma}{a_0} + 0.346 \frac{2^{3\sigma/2}}{a_0^2}) 1_{\sigma \geq 0} + (9/10)^{\lceil -\sigma \rceil} \sum_{1 \leq k \leq 4-\sigma} (1.1)^k \frac{\Gamma(k/2)}{a_0^k} 1_{\sigma \lt 0} }[/math]
- [math]\displaystyle{ a_0 := \sqrt{\frac{T'_0}{2\pi}} }[/math]
- [math]\displaystyle{ \alpha_1(s) := \frac{1}{2s} + \frac{1}{s-1} + \frac{1}{2} \log \frac{s}{2\pi} }[/math]
- [math]\displaystyle{ N := \lfloor \sqrt{ \frac{T'}{2\pi}} \rfloor }[/math]
- [math]\displaystyle{ T' := \frac{x}{2} + \frac{\pi t}{8} }[/math]
- [math]\displaystyle{ T'_0 := T_0 + \frac{\pi t}{8} }[/math]
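The quantities [math]\displaystyle{ T }[/math] and [math]\displaystyle{ T_0 }[/math] are not defined in this excerpt; the hedged sketch below assumes [math]\displaystyle{ T = x/2 }[/math] (so that [math]\displaystyle{ T' = T + \frac{\pi t}{8} }[/math]) and takes [math]\displaystyle{ T'_0 = T' }[/math] for a single-point evaluation, and it truncates the integral to a window where the Gaussian factor [math]\displaystyle{ f }[/math] is non-negligible. It illustrates the [math]\displaystyle{ E_3 }[/math] formula only and is not the project's verification code.

```python
# Hedged sketch of E_3 alone.  Assumptions (not stated in the excerpt above):
# T = x/2 and T'_0 = T' at the sample point; the integral is truncated to [-10, 12]
# because the Gaussian factor f makes the tails negligible there.
from mpmath import mp, exp, log, sqrt, atan, gamma, ceil, quad, pi

mp.dps = 30
y, t = 0.4, 0.4

def E3(x):
    T = x / mp.mpf(2)                 # assumption: T = x/2
    Tp = T + pi * t / 8               # T'
    Tp0 = Tp                          # assumption: T'_0 = T' at this single x
    a0 = sqrt(Tp0 / (2 * pi))

    def f(s):
        return (exp(-(s - (1 - y) / 2)**2 / t) + exp(-(s - (1 + y) / 2)**2 / t)) / (2 * sqrt(pi * t))

    def w(s):
        e = max(s - 1, 0) / 4 * log(1 + s**2 / Tp0**2) + mp.mpf(1) / (12 * (Tp0 - 0.33))
        if s < 0:
            e += Tp0 / 2 * atan(s / Tp0) - s / 2
        return sqrt(1 + s**2 / Tp0**2) * sqrt(1 + (1 - s)**2 / Tp0**2) * exp(e)

    def v(s):
        if s >= 0:
            return 1 + 0.400 * 9**s / a0 + 0.346 * 2**(1.5 * s) / a0**2
        tail = sum(mp.mpf('1.1')**k * gamma(mp.mpf(k) / 2) / a0**k for k in range(1, int(4 - s) + 1))
        return 1 + (mp.mpf(9) / 10)**int(ceil(-s)) * tail

    integral = quad(lambda s: v(s) * w(s) * f(s), [-10, 0, 1, 12])
    return sqrt(pi) / 8 * exp(-t * pi**2 / 64) * Tp**mp.mpf('1.5') * exp(-pi * T / 4) * integral

print(E3(10000))   # a very small quantity; meaningful only relative to |B^eff_0| at the same x
```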
Comparison between [math]\displaystyle{ H^{eff} = A^{eff}+B^{eff} }[/math], [math]\displaystyle{ A'+B' }[/math], and the effective error bound [math]\displaystyle{ E_1+E_2+E_3 }[/math] on [math]\displaystyle{ H_t - H^{eff} }[/math] at some values of [math]\displaystyle{ x }[/math] source:
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ |H^{eff}/B'_0| }[/math] | [math]\displaystyle{ |(A'+B')/B'_0| }[/math] | [math]\displaystyle{ |(H^{eff}-(A'+B'))/B'_0| }[/math] | [math]\displaystyle{ |(H^{eff}-(A'+B'))/B'_0| + |(E_1+E_2+E_3)/B'_0| }[/math] |
---|---|---|---|---|
10000 | 0.52 | 0.52 | 0.0006 | 0.039 |
12131 | 1.28 | 1.28 | 0.0004 | 0.033 |
15256 | 0.97 | 0.97 | 0.0003 | 0.027 |
18432 | 0.68 | 0.68 | 0.0003 | 0.023 |
20567 | 0.98 | 0.98 | 0.0004 | 0.022 |
30654 | 1.93 | 1.93 | 0.0004 | 0.016 |
The [math]\displaystyle{ E_3 }[/math] error dominates the other two source:
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ \frac{E_3}{E_1+E_2} }[/math] |
---|---|
10000 | 9.11 |
15000 | 14.97 |
20000 | 19.26 |
50000 | 32.39 |
100000 | 42.99 |
[math]\displaystyle{ 10^7 }[/math] | 87.23 |
[math]\displaystyle{ A+B-C }[/math] is a good approximation to [math]\displaystyle{ H_t }[/math] source
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ \frac{|H_t-(A+B-C)|}{|B_0|} }[/math] |
---|---|
160 | 0.06993270565802375041 |
320 | 0.006716674125965016299 |
480 | 0.005332893070605698501 |
640 | 0.003363431256036816251 |
800 | 0.1548144749150572349 |
960 | 0.03009229958121352990 |
1120 | 0.004507664238680722472 |
1280 | 0.002283591962997851167 |
1440 | 0.01553727684468691873 |
1600 | 0.001778051951547709718 |
1760 | 0.02763769444052338578 |
1920 | 0.002108779890256530964 |
2080 | 0.02746770886040058927 |
2240 | 0.001567020041379128455 |
2400 | 0.01801417530687959747 |
2560 | 0.001359561117436848149 |
2720 | 0.008503327577240081269 |
2880 | 0.001089253262122934826 |
3040 | 0.003004181560093288747 |
3200 | 0.02931455383125538672 |
A closer look at the "spike" in error near [math]\displaystyle{ x = 256 \pi \approx 804 }[/math]:
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ \frac{|H_t-(A+B-C)|}{|B_0|} }[/math] |
---|---|
622.035345 | 0.003667321 |
631.460123 | 0.004268055 |
640.884901 | 0.003284407 |
650.309679 | 0.004453589 |
659.734457 | 0.003872174 |
669.159235 | 0.005048162 |
678.584013 | 0.005009254 |
688.008791 | 0.007418686 |
697.433569 | 0.007464541 |
706.858347 | 0.010692337 |
716.283125 | 0.012938629 |
725.707903 | 0.017830524 |
735.132681 | 0.022428596 |
744.557459 | 0.030907876 |
753.982237 | 0.040060298 |
763.407015 | 0.053652069 |
772.831793 | 0.071092824 |
782.256571 | 0.094081856 |
791.681349 | 0.123108726 |
801.106127 | 0.159299234 |
810.530905 | 0.002870724 |
In practice [math]\displaystyle{ E_1/B^{eff}_0 }[/math] is smaller than [math]\displaystyle{ E_2/B^{eff}_0 }[/math], which is mostly dominated by the first term in the sum, which is close to [math]\displaystyle{ \frac{t^2}{16 x} \log^2 \frac{x}{4\pi} }[/math]:
[math]\displaystyle{ x }[/math] | [math]\displaystyle{ E_1 / B^{eff}_0 }[/math] | [math]\displaystyle{ E_2 / B^{eff}_0 }[/math] | [math]\displaystyle{ \frac{t^2}{16x} \log^2 \frac{x}{4\pi} }[/math] |
---|---|---|---|
10^3 | [math]\displaystyle{ 1.389 \times 10^{-3} }[/math] | [math]\displaystyle{ 2.341 \times 10^{-3} }[/math] | [math]\displaystyle{ 1.915 \times 10^{-4} }[/math] |
10^4 | [math]\displaystyle{ 1.438 \times 10^{-4} }[/math] | [math]\displaystyle{ 3.156 \times 10^{-4} }[/math] | [math]\displaystyle{ 4.461 \times 10^{-5} }[/math] |
10^5 | [math]\displaystyle{ 1.118 \times 10^{-5} }[/math] | [math]\displaystyle{ 3.574 \times 10^{-5} }[/math] | [math]\displaystyle{ 8.067 \times 10^{-6} }[/math] |
10^6 | [math]\displaystyle{ 7.328 \times 10^{-7} }[/math] | [math]\displaystyle{ 3.850 \times 10^{-6} }[/math] | [math]\displaystyle{ 1.273 \times 10^{-6} }[/math] |
10^7 | [math]\displaystyle{ 4.414 \times 10^{-8} }[/math] | [math]\displaystyle{ 4.197 \times 10^{-7} }[/math] | [math]\displaystyle{ 1.846 \times 10^{-7} }[/math] |
...