Climbing Mt. Scalable

The survey of abused words in quantum computing shows the word "exponential" with an, um, exponential lead over its competitors. My own personal choice for most abused word was "scalable," a word that is, in my opinion, the least debated but most important concept in quantum computing today. It's a word which everyone uses but whose definition is strangely missing from almost all papers that use it. Here are some thoughts on this word, what it means to particular groups, and what I, in my own pomposity, think the word really should mean.

Note the title of this post is ripped off from the title of quant-ph/0204157, a classic paper on "scalability" in quantum computing.

At the bottom of the totem pole of those who use the word scalable are probably those working in quantum computing who are of a theoretical computer science-ish bent (looks in mirror, realizes he needs to shave). For a computer scientist, scalable most often simply means that the protocol being considered does not require resources which scale exponentially in the fundamental size of the problem. Thus, for example, fault-tolerant quantum computing provides a scalable method for building a quantum computer which can maintain its coherent information for long periods of time. By encoding across more qubits one can shrink the error in a computation exponentially in the number of qubits used, and thus only an overhead polynomial in the logarithm of the computation's size (and target error) is incurred as the cost of building the fault-tolerant circuit. This, then, is the scalability of big-O notation: useful for separating the theoretically unworkable from the crowd, but not necessarily connected directly to the practical. At least not until we actually build a quantum computer. In other words, this is the scalable that is important for both the beginning (the foundation) and for the long run (when we have a quantum computer). But for today's challenge of building a quantum computer? Well, besides the obvious fact that you should not be using exponential resources in your computer, this simply isn't that important for scaling up a quantum computer. And it's certainly not the obstacle to building a large quantum computer: it's yesterday's news that (barring strange error models, new physics, etc.) quantum computing is fundamentally a valid model of computing, with no theoretical exponentials blocking its way.
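To make that overhead claim concrete, here is a rough back-of-the-envelope version of the usual threshold-theorem accounting for concatenated codes (the constants are purely illustrative, not tied to any particular code). With a physical error rate $p$ below the threshold $p_{\mathrm{th}}$, $k$ levels of concatenation give a logical error rate of roughly

$$p_k \approx p_{\mathrm{th}} \left(\frac{p}{p_{\mathrm{th}}}\right)^{2^k},$$

while the overhead per logical gate grows only like $d^k$ for some constant gadget size $d$. To run a circuit of $N$ gates with total error $\epsilon$ we need $p_k \lesssim \epsilon/N$, which takes only $k = O(\log\log(N/\epsilon))$ levels of concatenation, so the total overhead is

$$d^k = \mathrm{poly}\!\left(\log \frac{N}{\epsilon}\right),$$

i.e. polylogarithmic in the size of the computation.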

Next up in the line of people who use the word "scalable" are another set of theorists and, sometimes, experimentalists: the proposers of different physical implementations of quantum computers. These are the people who write the papers describing why a particular physical system, or a particular experimental setup, is appropriate for quantum computing. For them, the ultimate goal is often DiVincenzo's five criteria:

1. A scalable physical system with well characterized qubits.
2. The ability to initialize the state of the qubits to a simple fiducial state.
3. Long relevant decoherence times, much longer than gate operation times.
4. A "universal" set of quantum gates.
5. A qubit-specific measurement capability.

Notably, in criterion 1, where David DiVincenzo mentions a "scalable" physical system, no attempt is made (in his original paper) to define what scalable means. DiVincenzo's requirements are a door into the club of those who want to build a quantum computer. They are important requirements that a basic quantum computing system should obey, and if you're missing some of these, then you may have some problems (I say may, because there is some fudge room in the interpretation of these criteria). But satisfying DiVincenzo's criteria does not, in my humble opinion, a quantum computer make.

Okay, so let me try to pin down the misuse of the word scalable in these proposal papers. There are now a few quantum systems where you can argue that criteria 1, 2, 3, 4, and 5 have all been achieved, but which don't, in my opinion, scale up. The problem, in my opinion, is a view of scalable that says "build one, build two, build infinity." The idea here is that if one builds a quantum computer by achieving the five DiVincenzo criteria as best as possible in one and two qubits, then, because you could make one and two, you can now make more and more. While I don't disagree that this is a possible way to build a large computer, I don't believe that any such kludge of one, two, three will really result in a large-scale quantum computer. The analogy in my head here is with the technologies that preceded the integrated circuit. Certainly vacuum tubes and discrete transistors were technologies that could be used to build large computers. So, technically, they were scalable, but below I'll argue exactly why this approach is probably doomed to fail.

A third group of people who tend to use the word scalable are experimental physicists who have gotten their hands dirty in a particular physical implementation. Here there are a few different reasons given for why a particular experimental system is "scalable" while others are not. One reason often given is a connection with "prior art." For example, many of those who work with solid-state systems argue that these systems are inherently more likely to be scalable than, say, atomic systems, because the former can leverage the vast industry built up to construct modern classical computers. While I certainly wouldn't argue that any connection with prior art is bad, the practitioners of this form of scalability often make the "this thing scales, we use it, therefore we scale" argument. The world is littered with technologies that used the same processes as the best practices of the computing industry, but which failed. Thus, while I find myself sympathetic to this argument, it feels to me a lot like offloading the problem of what it really means to be scalable onto a technology which is already scalable, while avoiding any discussion of why a particular implementation would explicitly benefit from the connection.

Okay, now that I've gone and dissed many uses of the word scalable in quantum computing, I'd better step up and give my own definition before someone comes and bites my head off. I believe that, fundamentally, a quantum computing technology is scalable if it gives increasing marginal returns. Economics, Dave? Really? Yes.

Almost everyone knows Moore's law: the number of transistors on a chip doubles roughly every two years. To me, Moore's law is an example of the outcome of a scalable technology: it's what we are shooting for. So first of all, I will define the outcome of a scalable quantum technology as one able to achieve exponential growth in the number of qubits produced per year, under the requirement that these qubits can perform a computation whose length is comparable to the number of qubits. Using raw qubit numbers which can perform such a base computation of course leaves out all sorts of other variations in growth rates; for example, it's possible that one could jump-start at some large number of qubits and then grow not exponentially but only, say, linearly in time. This is one possible way we could build a quantum computer, of course, but I would argue that any time you are not working with a technology which is growing as fast as an exponential, you are setting yourself up for doom.
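To see why, here is a toy calculation (every starting value and rate below is made up purely for illustration) of how a steady exponential eventually buries even a generous head start:

```python
# Toy comparison: exponential qubit growth vs. a linear "jump start".
# All starting values and rates below are made up for illustration only.

exp_qubits = 4.0        # hypothetical: start with 4 qubits, double every 2 years
lin_qubits = 1000.0     # hypothetical: jump-start at 1000 qubits
lin_rate = 100.0        # hypothetical: then add 100 qubits per year

for year in range(1, 51):
    exp_qubits *= 2 ** 0.5      # doubling every two years
    lin_qubits += lin_rate      # steady linear growth
    if exp_qubits >= lin_qubits:
        print(f"The exponential overtakes the jump start in year {year}")
        break
else:
    print("No crossover within 50 years")
```

With these made-up numbers the slow-and-steady exponential passes the jump-started linear technology in about two decades, and after that the gap itself grows exponentially.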

At this point, people will probably object: even if the number of qubits grows only polynomially in time, because quantum computers can exponentially outperform classical computers at some tasks, they can still provide exponential benefit for those tasks, and thus you can have a "scalable" technology that grows the number of qubits only polynomially in time. I believe this argument is flawed. The first reason is that the places where we know of exponential benefit are currently special-purpose tasks (example: factoring). This, then, is a pathway toward the quantum computer as a special-purpose (even if technically universal) device, and while exciting, it's not clear to me how one can build an industry out of this. A more reasonable argument along these lines is probably related to the simulation of quantum systems: if it turns out that quantum effects are important for chemistry and biology, and if we can build quantum algorithms which extract the relevant data, then there is most probably an industry here. But this has not, to my satisfaction, been shown.

Finally, it seems to me that it is important not to undersell the polynomial algorithmic speedups known in quantum computing. While often scoffed at, these speedups are much more ubiquitous than the exponential ones, and therefore applicable to a large portion of the world of computing. They can be found all over the place, and if you could give a programmer access to these algorithms today, with their currently sized classical systems replaced by equivalent quantum systems, you would revolutionize high-performance computing. And for this important class of algorithms, getting on the exponential bandwagon is a necessity (indeed, the requirement is even more severe, in the sense that we want really, really fast growth just to catch up with modern computers).
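A crude break-even estimate shows why the requirement is so severe for, say, a quadratic (Grover-like) speedup; the clock rates below are hypothetical placeholders, not measurements of any real machine:

```python
# Crude break-even estimate for a quadratic (Grover-like) speedup.
# The clock rates below are hypothetical placeholders, not real numbers.

classical_ops_per_sec = 1e9    # assumed classical step rate
quantum_gates_per_sec = 1e4    # assumed (much slower) quantum gate rate

# Classical cost ~ N steps; quantum cost ~ sqrt(N) steps.
# Break-even when N / classical_rate == sqrt(N) / quantum_rate.
break_even_N = (classical_ops_per_sec / quantum_gates_per_sec) ** 2

print(f"Quadratic speedup pays off only for problem sizes N > {break_even_N:.0e},")
print(f"a problem the classical machine already finishes in "
      f"{break_even_N / classical_ops_per_sec:.0f} seconds")
```

With these placeholder numbers the crossover sits at N around 10^10, and that ignores any error-correction overhead on the quantum side, so polynomial speedups only become compelling once the quantum hardware has grown a very long way.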

Okay, so I'm going to define the outcome of a scalable quantum technology as one which can achieve exponential growth in the number of qubits executing a linear-sized algorithm over a short time period, but what I'm going to say next is actually somewhat independent of that. And this is really an economic argument about what it means to be a scalable quantum technology. Along with Moore's law comes an even more amazing fact about the computer industry: the price per transistor has been shrinking exponentially. Actually, if you examine the manufacturing cost per unit area of integrated circuits, you will find that this has only increased as a function of time, but the price per transistor has gone down exponentially. Most importantly, if one examines the computational power per unit cost, this quantity has doubled roughly once every two years. Colloquially, one gets exponentially more "bang for the buck" as a function of time.
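Here is a tiny sketch of how those two trends fit together (all of the rates below are made-up placeholders, chosen only to show the direction of each trend):

```python
# Illustrative only: cost per transistor can fall exponentially even while
# the cost per unit area of silicon rises. All rates below are made up.

cost_per_cm2 = 1.0          # assumed manufacturing cost per cm^2 (arbitrary units)
transistors_per_cm2 = 1.0   # assumed transistor density (arbitrary units)

for year in range(0, 11, 2):
    cost_per_transistor = cost_per_cm2 / transistors_per_cm2
    print(f"year {year:2d}: cost/cm^2 = {cost_per_cm2:5.2f}, "
          f"cost/transistor = {cost_per_transistor:7.4f}")
    cost_per_cm2 *= 1.2           # area cost creeps up ~20% per two years (assumed)
    transistors_per_cm2 *= 2.0    # density doubles every two years (assumed)
```

The denser the chips get, the cheaper each transistor becomes, even as each square centimeter of silicon gets more expensive to make; that compounding is the "bang for the buck" I mean.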

And this, to me, has an important consequence. In particular, I would argue that increasing marginal returns are what build an industry. The reasons for this are manifold, but most importantly, when you are working in a situation where the returns grow exponentially, you have the economic incentives necessary to take the risks required to continue the exponential growth of the industry. Suppose, for example, that the bang for the buck leveled out in modern computers: transistor size could still be shrunk, but the costs rise such that the bang for the buck goes down. You would then be in a race where, by pushing the technology to more computational power, you become more expensive than your competitors, but with a technology which is only, say, a constant factor better. Doubling many times is fantastic, but doubling once at a cost increase?

Thus it seems to me that what we need to be looking for, when we think about what it means for a quantum computing technology to be scalable, is whether the technology will allow for increasing marginal returns accompanied by suitable growth in quantum computing power. This, to me, is the heart of the search for "scalable" implementations of quantum computers. Climbing Mount Scalable, then, is more about a hike that gets easier as you get higher than about the overall shape of the hill (the theorists' scalability), the ability to put one foot in front of the other (the proposers of quantum computing technologies), or the ability to use a ski lift already built (leveraging the technology which led to the computer revolution in the first place). At least on my climb, I hope the climbing gets easier. If not, we're in for some severe back pain.


Dave, having just come back from the 50th ENC, I can testify that most magnetic resonance research groups now use quantum spin simulations routinely both at the beginning of their experiments (for planning) and at the end (for data interpretation).

The present scaling of these widely-used programs is exponential. E.g., in Veshtort and Griffin's "SPINEVOLUTION: A powerful tool for the simulation of solid and liquid state NMR experiments" we read "the equation of motion is integrated with a sparse matrix based algorithm that behaves asymptotically as O(N^2 4^N)."

Needless to say, there is a lot of investment in improving the scaling of existing quantum simulation algorithms -- and a lot of borrowing of ideas from QIS/QIT. And it is becoming clear to many people that the next generation of algorithms will scale polynomially rather than exponentially.

These radical improvements in quantum simulation capability are a valuable breakthrough for QIS/QIT, for which we need *not* wait decades ... because they are happening now.

Dave, I like your connection to economics. I'm glad to hear a clear and cogent analysis of the scalability issue, even though I disagree with your definition on some levels.

Anyway, I'm more concerned that within the space of one single day, http://arxiv.org/abs/quant-ph/0204157 has been dethroned from the top of Google's hitlist for "Climbing Mount Scalable." A dastardly deed, sir! And you have the gall to speak of "prior art".

Bah. Humbug. :)

By Robin Blume-Kohout on 06 Apr 2009

I just assumed everyone would know the reference. I've added a link to the paper so that it will hopefully googlefu to the top.

Don't apologize for economics. Moore's law originally was "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year". Note the reference to minimum component _costs_.

Sir, you are a gentleman and a scholar, of the highest magnitude! Mine honor is thoroughly sassified!

(Er... was the facetiousness of my first comment's second paragraph sufficiently obvious? Because, if it wasn't, and I sounded like I was seriously complaining, then I owe an apology...)

By Robin Blume-Kohout on 07 Apr 2009