# Kolmogorov complexity

Thus, for instance, any $k$-bit integer has Kolmogorov complexity at most $k+O(1)$ (the $O(1)$ overhead accounting for the trivial program "print the following $k$ bits", to which the binary expansion of the integer is appended). On the other hand, since there are fewer than $2^k$ programs of length less than $k$, there are fewer than $2^k$ integers of Kolmogorov complexity less than $k$ (we'll refer to this as the counting argument). As a consequence, most $k$-bit integers have Kolmogorov complexity close to $k$.
In contrast, a deterministic algorithm which takes $k$ as input and runs in time polynomial in $k$ can only produce integers of Kolmogorov complexity $O(\log k)$: any number it outputs can be described by a program of length $O(1) + O(\log k)$, where the $O(1)$ bits encode the algorithm itself and the $O(\log k)$ bits encode the input $k$ (which also determines how long the program is to run).
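One can tally this description length directly; a rough sketch under stated assumptions (the constant `ALGO_BITS`, standing in for the $O(1)$ cost of encoding the fixed algorithm, is hypothetical):

```python
import math

ALGO_BITS = 1000  # hypothetical O(1) cost of encoding the fixed algorithm

def description_bits(k: int) -> int:
    """Bits needed to describe the algorithm's output on input k:
    the fixed algorithm plus a binary encoding of k itself, which
    takes about log2(k) bits -- so O(log k) in total."""
    return ALGO_BITS + math.ceil(math.log2(k + 1))

# The description length grows only logarithmically in k, while a
# typical k-bit integer needs close to k bits to describe.
for k in (2 ** 10, 2 ** 20, 2 ** 30):
    print(k, description_bits(k))
```

For large $k$ this description is far shorter than $k$ bits, which is why such an algorithm cannot output a typical (incompressible) $k$-bit integer.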