# An efficient algorithm exists if Cramer's conjecture holds

Cramer's conjecture asserts that for all sufficiently large n there is a prime between n and $n+C(\log n)^2$. This conjecture is plausible if you believe that when you pick random integers m near n, the events "m is prime" are roughly independent. In that case, the number of primes between n and $n+C(\log n)^2$ should have a Poisson distribution with mean $C\log n$ (by the prime number theorem), which means that the probability of getting no prime will be around $\exp(-C\log n)$. If we take C to be greater than 2, then the sum $\sum_{n=N}^\infty n\exp(-C\log n)=\sum_{n=N}^\infty n^{1-C}$ converges to around $N^{2-C}$, so we expect there to be no values of n greater than N for which Cramer's conjecture fails.

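
This heuristic is easy to probe numerically. The sketch below (not from the original text) counts primes in intervals of length $C(\log n)^2$ for a few values of n, with C = 3 as an arbitrary illustrative choice; the conjecture only asserts that *some* constant works.

```python
import math

def is_prime(m: int) -> bool:
    """Trial-division primality test; adequate for the moderate m used here."""
    if m < 2:
        return False
    if m % 2 == 0:
        return m == 2
    for d in range(3, math.isqrt(m) + 1, 2):
        if m % d == 0:
            return False
    return True

C = 3  # hypothetical constant; the conjecture only asserts SOME C suffices

for n in (10**4, 10**5, 10**6):
    width = int(C * math.log(n) ** 2)
    count = sum(1 for m in range(n, n + width + 1) if is_prime(m))
    # Heuristic prediction: count is roughly Poisson with mean C * log(n)
    print(f"n={n}: {count} primes in an interval of length {width} "
          f"(heuristic mean {C * math.log(n):.1f})")
```

In every case tried the interval contains a prime, consistent with the heuristic that an empty interval has probability only about $n^{-C}$.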
If Cramer's conjecture is true, then from it and the fact that primality testing is in P it follows instantly that there is a polynomial-time algorithm for finding a k-digit prime. You just search through the integers from $10^{k-1}$ to $10^{k-1}+C'k^2$, testing each one for primality in time p(k) for some polynomial p; here $C'=C(\log 10)^2$ suffices, since $\log 10^{k-1}=(k-1)\log 10$ is proportional to k. This interval is guaranteed to contain a prime, so the algorithm terminates after at most $C'k^2p(k)$ steps.

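
As a concrete illustration (not part of the original text), here is a minimal Python sketch of this search. The interval width $6k^2$ is a hypothetical choice of constant absorbing the $(\log 10)^2$ factor, and a deterministic Miller-Rabin test stands in for the polynomial-time primality test.

```python
def is_prime(m: int) -> bool:
    """Miller-Rabin with a fixed base set; deterministic for m < 3.3 * 10**24."""
    small_primes = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    if m < 2:
        return False
    for p in small_primes:
        if m % p == 0:
            return m == p
    d, r = m - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for a in small_primes:
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(r - 1):
            x = x * x % m
            if x == m - 1:
                break
        else:
            return False
    return True

def find_k_digit_prime(k: int, C: int = 6) -> int:
    """Scan [10^(k-1), 10^(k-1) + C*k^2] and return the first prime found."""
    start = 10 ** (k - 1)
    for m in range(start, start + C * k * k + 1):
        if is_prime(m):
            return m
    # Under Cramer's conjecture this should not happen for large k
    raise RuntimeError(f"no prime in the conjectured interval for k={k}")

print(find_k_digit_prime(12))
```

Each primality test takes polynomial time in k, and only $O(k^2)$ candidates are examined, so the whole search runs in polynomial time, conditional on the conjecture.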
Needless to say, we could make the above approach safer by using the weaker conjecture that there is a prime between n and $n+(\log n)^{1000}$. Unexpected phenomena do sometimes occur in the distribution of primes [reference needed to Maier's work here], so using a higher power makes the conjecture more robust. But even this weaker conjecture is still far beyond what anyone knows how to prove.