Quantum computing, one step at a time

For anyone interested in when we’ll have large-scale quantum computers, a very striking paper was released today by Manny Knill.

To explain Knill’s paper, I need to give a little background.

Just a little over ten years ago, quantum computing suddenly took off when Peter Shor announced his fast algorithm for quantum factoring.

In 1994, large-scale quantum computing looked like a pipe dream. Several people said publicly (and far more said it privately): “you’ll never ever be able to build a device like that”.

The most common criticisms ran something like this: “to quantum compute, you need to do Y and Z. At the moment in the lab, you can’t even do A and B. Therefore, you’ll never be able to quantum compute.”

The tempting response is to say “well, we’re just about to do C and D, and E and F look pretty likely as well in a few years’ time. So maybe Y and Z aren’t so unrealistic.”

That’s a tempting response, and it’s how I used to respond to this sort of criticism. But it’s turned out that neither the criticism nor the response is accurate.

What’s wrong with the argument is not (or not only) that it doesn’t take account of technological progress from A to B to C and so on.

No, the main thing wrong with the argument has turned out to be that it doesn’t take enough account of theoretical progress.

Sure, over the years experimentalists have moved from A to B to C to D, and they’re getting on to E and F.

But in the meantime, theorists have shown that, actually, you only need to do J and K to quantum compute.

(Alphabet not to scale. At least, not as far as I know.)

In short, lots of discussions of the future of quantum computing are framed as though it’s a fixed target, like building a teraflop laptop.

But it’s not. It’s a fluid target, and pure theory can move us a lot closer to the target, without technology improving a whit.

What’s this all got to do with Knill’s paper?

To go back to 1994 again, one of the first responses to Shor’s paper was a flurry of claims, some of them public, that quantum computing would never be possible because of the effects of noise.

Roughly speaking, the argument was that quantum states are analogue beasts, and it’s very difficult or impossible to protect analogue information against the effects of noise. Therefore, quantum computers will inevitably be overwhelmed by the effects of noise.

In late 1995 and early 1996, Peter Shor and Andrew Steane independently showed that quantum information, despite appearances to the contrary, actually behaves much more like digital information. In particular, it turns out that the analogue continuum of errors apparently afflicting quantum states can effectively be digitized, and this enables error-correction techniques to be applied to protect against the effects of noise.

I’d like to emphasize, by the by, that this ability to digitize is a deep result, not at all obvious, and depends critically on certain special properties of quantum mechanics. It’s difficult to give a short pat explanation, even to experts on quantum mechanics, and I won’t try to explain it here.
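Here is a toy illustration of that digitization, using the three-qubit bit-flip repetition code simulated in NumPy. This is emphatically not Shor’s or Steane’s actual construction, just a minimal sketch: a continuous rotation error hits one qubit, but measuring the parity-check (syndrome) operators collapses it to a discrete outcome (either “no flip” or “a flip on a known qubit”), which a single Pauli correction then undoes exactly. The amplitudes and the rotation angle are arbitrary choices for the demonstration.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1, -1]).astype(complex)

def op_on(single, k, n=3):
    """Operator acting as `single` on qubit k and as identity elsewhere."""
    out = np.array([[1]], dtype=complex)
    for i in range(n):
        out = np.kron(out, single if i == k else I2)
    return out

# Encode a|0> + b|1>  ->  a|000> + b|111>  (bit-flip repetition code).
a, b = 0.6, 0.8
logical = np.zeros(8, dtype=complex)
logical[0b000], logical[0b111] = a, b

# A *continuous* error: a small rotation about X on the middle qubit.
theta = 0.3
Rx = np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * X
state = op_on(Rx, 1) @ logical

# Syndrome measurement: project onto an eigenspace of a parity check.
def measure(state, stabilizer, outcome):
    P = (np.eye(8) + outcome * stabilizer) / 2
    new = P @ state
    prob = np.linalg.norm(new) ** 2
    return new / np.sqrt(prob), prob

# Suppose both checks Z0Z1 and Z1Z2 report -1, flagging qubit 1.
# The measurement itself digitizes the error: the state collapses to
# "qubit 1 was flipped", even though the error was a partial rotation.
state, p_flip = measure(state, op_on(Z, 0) @ op_on(Z, 1), -1)
state, _ = measure(state, op_on(Z, 1) @ op_on(Z, 2), -1)
state = op_on(X, 1) @ state  # the discrete correction

# Recovered exactly, up to an irrelevant global phase.
overlap = abs(np.vdot(state, logical))
print(overlap, p_flip)
```

The probability `p_flip` of seeing this syndrome is sin²(θ/2); with probability cos²(θ/2) the syndrome is trivial and the measurement projects the error away entirely. Either way, the continuum of possible errors collapses to a discrete, correctable set.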

The error-correction techniques were quickly extended to an entire theory of fault-tolerant quantum computation. I won’t try to name names here, as a very large number of people were involved. Indeed, the development of fault-tolerance meant that 1996 was perhaps the one year in the last ten in which progress toward quantum computing really did seem rapid!

Roughly speaking, the takeaway message from all this work went something like this, circa the end of 1996:

“Suppose I can build individual quantum computer elements so they work with an accuracy of 99.9999% (the so-called threshold value). Without error-correction, if I put 10 million together, there will probably be a few total failures that screw everything up. But I can use error-correction to reduce my error rate as low as I like, provided all my elements work better than that threshold. What’s more, this is true no matter how large the computation.”
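The arithmetic behind that statement can be sketched in a few lines. The concatenation scaling below is the standard textbook rough estimate, not anything specific to Knill’s paper, and the value p = 1e-7 for “better than threshold” elements is an arbitrary illustrative choice.

```python
# Back-of-envelope arithmetic for the quoted threshold statement.
# Numbers from the quote: per-element error 1e-6 (accuracy 99.9999%),
# 10 million elements.
p_th = 1e-6                      # the quoted threshold error rate
n_elements = 10_000_000
expected_failures = p_th * n_elements
print(expected_failures)         # roughly ten failures, uncorrected

# The standard rough concatenation scaling: k levels of encoding give
#   p_logical ~ p_th * (p / p_th) ** (2 ** k),   valid for p < p_th.
def logical_error(p, k, p_th=p_th):
    return p_th * (p / p_th) ** (2 ** k)

p = 1e-7  # elements somewhat *better* than threshold (illustrative)
for k in range(4):
    print(k, logical_error(p, k))
# Each extra level squares the suppression factor p / p_th, so the
# logical error rate falls doubly exponentially in k.
```

This doubly-exponential suppression is why working below threshold is so powerful: a handful of concatenation levels buys an arbitrarily low logical error rate, at the cost of the resource overhead mentioned below.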

This idea of a threshold for quantum computing is an incredibly important one, and there’s been quite a bit of work on trying to improve that number (99.9999%).

Knill’s paper is the latest in that line of work, improving the threshold.

What’s Knill’s value?

His threshold value is 97%. Or 99%, with more modest resource requirements in terms of number of gates, memory steps, etcetera.

To state the obvious, having to do things to an accuracy of 97% (or 99%) is a far more encouraging state of affairs than 99.9999%.

There are a lot of caveats. The 97 / 99% number is for a rather artificial error model; real physical systems don’t behave like Knill’s model. Indeed, it’s not really clear what that number translates into in terms of actual physical parameters: heating rates, dephasing times, and so on. And Knill uses a model satisfying some assumptions that may not be such a good approximation in real physical systems; indeed, they may even fail completely. (This is true of all the threshold results, although there’s been extremely encouraging recent progress on this front too.) Finally, Knill’s approach has a rather high resource overhead.

All the same, no matter how you look at it, moving from 99.9999% to 97 / 99% is pretty darn encouraging, despite the caveats. And I do wonder just how high the threshold can go.

Published
Categorized as General

Quantum games

Two interesting posts by Daniel Davies at Crooked Timber on a topic that is likely of some interest to many readers of this blog: quantum game theory.

In particular, Davies discussed whether protocols like those allowed in this paper by Cleve, Hoyer, Toner and Watrous can be said to involve communication or not. I’d comment at more length, but have to run.


Atiyah and Singer

Via Peter Woit, an excellent interview with Atiyah and Singer on the occasion of their receiving the Abel Prize. I’d post excerpts, but I’d probably end up posting the entire thing.

One thing I find fascinating is their belief in the fundamental importance of the connection between physics and mathematics. Most famously, in recent years, this has resulted in what is apparently an extremely fruitful cross-fertilization between the high-energy physics community (especially the string theorists) and mathematicians.

I’ve occasionally wondered to what extent similar connections may appear as a result of the ongoing work in quantum information science.

In that vein, let me mention two exciting recent papers, by Daftuar and Hayden, and by Klyachko.

These papers are ostensibly about a problem of great physical interest, namely, characterizing the possible relationships between the quantum state of a many-particle system, and the quantum state of the individual particles. This is certainly a problem of interest in quantum information, and the solution probably has implications in other areas of physics, like condensed matter physics.

Remarkably, these papers relate this physical problem to some quite sophisticated (and, occasionally, very recent) developments in mathematics. It seems pretty likely to me that the many problems of physical interest still open in this general area may in the future help stimulate the development of interesting new mathematics.


Frivolity

Work has been very intense the past few weeks, resulting in an unforeseen blog hiatus. Perhaps unfortunately, depending on your point of view, I’m all stocked up on frivolous links:

  • Extreme ironing
  • My favourite idea for an extreme sport – extreme dating – appears to have been made into a rather poor movie.
  • The Australian Bureau of Statistics’ response to the 2001 campaign to get people to write in “Jedi” as their religion on the census. My favourite bit is this:

    ABS recognises that people have a wide range of belief systems

    If your belief system is “Jedi” then answer as such on the census form. But if you would normally answer Anglican or Jewish or Buddhist or something else to the question “what is your religion?” and for the census you answer “Jedi” then this may impact on social services provision if enough people do the same.
