The Quantum Pontiff

Quantum Algorithms Zoo

Stephen Jordan, now a postdoc at Caltech, has produced a useful little guide to quantum algorithms: a zoo of quantum algorithms. Help squash the myth that all there is to quantum algorithms are the algorithms of Shor and Grover!


  1. #1 Ian Durham
    October 23, 2008

    I’ve been pushing for a few years to have a GQI session at the APS March Meeting dedicated to quantum algorithms. A few years back we had a mish-moshed session that included algorithms, but nothing dedicated solely to algorithms. Several people agreed with me, but we haven’t seen it yet. Perhaps this year.

  2. #2 Jonathan Vos Post
    October 23, 2008

    Great web page! Thanks for the shout-out.

    Since I’m writing a story for a semipopular magazine about QC, this will help me catch up from weird stuff I’ve coauthored on the arXiv, and what I wrote in such papers as the ASEE (American Society for Engineering Education) 2005 Proceedings with my wife:

    Preparing the Teachers: Quantum Computing
    Jonathan Vos Post (1), Christine Carmichael, Ph.D. (2)

    (1) CEO, Computer Futures, Inc.
    [street address]

    (2) Physics Department
    Woodbury University
    Burbank, CA 91510

    Introduction: What is a Quantum Computer?

    “Where a calculator on the Eniac is equipped with 18000 vacuum tubes and weighs 30 tons, computers in the future may have only 1000 tubes and weigh only 1 1/2 tons.”
    [Popular Mechanics, March 1949]

    I once asked John Mauchly, who shared the U.S. patent on the stored-program electronic computer, what personal computer he had at home.

    “A TRS-80,” he said.

    “A trash-80? When there are much more powerful PCs on the market? Why?” I asked.

    “Because, in terms of the number of registers in its processor, it comes closest of anything on the market to the UNIVAC that I invented.” [1]

    The Personal Computer with which I wrote this paper evolved from many generations of Engineering from early ideas (1862) of Charles Babbage (1791-1871). [2,3,4,5,6]

    “… I was sitting in the rooms of the Analytical Society, at Cambridge, my head leaning forward on the table in a kind of dreamy mood, with a table of logarithms lying open before me. Another member, coming into the room, and seeing me half asleep, called out, ‘Well, Babbage, what are you dreaming about?’ to which I replied “I am thinking that all these tables” (pointing to the logarithms) “might be calculated by machinery.” [2]

    The PC sitting in front of me is fundamentally no different from its 30 ton ancestors, some of which were equipped with some 18,000 vacuum tubes and 500 miles of wiring. Computers have become roughly 1,000,000 times more compact, and at least that much faster in performing their task, yet the task remains the same: manipulating and interpreting an encoding of binary bits into a useful computational result. The first programmable computer was an engineering breakthrough by German engineer Konrad Zuse (22 June 1910-18 Dec 1995) in 1941.

    I started in 1934, working independently and without knowledge of other developments going on around me. In fact, I hadn’t even heard of Charles Babbage when I embarked on my work. I approached the problem from various angles:

    Firstly, from a logical and mathematical point of view. This involved:

    1. program control,
    2. the binary system of numbers,
    3. and floating point arithmetic.

    Today, these concepts are taken for granted, but at the time this was new ground for the computing industry. Secondly, from the design angle:
    1. allowing fully automatic arithmetical calculation,
    2. a high-capacity memory,
    3. and modules or relays operating on the yes/no principle.

    My research was initially aimed at pure number calculation, but soon led on (1935/36) to new ideas about “computing” in general. Personally, I believe that was the birth of modern computer science. I recognized that computing could be seen as a general means of dealing with data and that all data could be represented through bit patterns. That led to my basic hypothesis that: “data processing starts with the bit.”

    I defined “computing” as “the formation of new data from input according to a given set of rules.” This basic theory meant that all computing operations could be carried out by relays operating according to the dual status principle just mentioned. The most suitable devices available at the time were telephone relays. Now a link with mathematical logic had been forged. As an engineer I had no idea of the existence of such a discipline. I developed a system of “conditional propositions” for relays – something that corresponded approximately to what is known as Boolean algebra today. My former mathematics teacher showed me that this sort of calculation was identical with the propositional calculus of mathematical logic.

    From the engineering point of view, the gap between this and pure mathematical logic was bridged in order to simplify the design and programming of computing machines. At roughly the same time in England, the mathematician and logician Alan Turing was in the process of solving this problem from a different angle. He used a very simple computer as a model in order to place theoretical logic on a more formal basis. Turing’s work was of major importance for the theory of computer science. However, his ideas had little influence on the practical development of computing machines.

    [Zuse, quoted by Lee, 1994]

    Before digital computers, there were analog computers, of which slide rules were the most common. The point of an analog computer, for purposes of this paper, is that it uses a physical process, and doesn’t really compute anything in the sense of Zuse or Von Neumann, yet produces approximate results quite usefully. Quantum computers embody a digital schema in a different way from classical computers: they deliver only one result at a time (like an analog computer) but can do arbitrary digital operations (like a classical computer). Digital computers are explained in terms of the “bit.”

    A bit [Glossary] (the term was coined by Claude Shannon [8,9]) is a fundamental unit of information, classically represented as a 0 or 1 in your digital computer. Claude Elwood Shannon (30 Apr 1916-24 Feb 2001) is known as “the father of Information Theory” and “the father of Communication Theory,” as well as the founder of practical digital circuit design. Each classical bit is physically realized through a macroscopic physical system, such as the magnetization on a hard disk, a microscopic pit in a plastic DVD, or the charge on a capacitor. A document of N characters stored on the hard drive of a typical computer is accordingly described by a string of 8N zeros and ones. However, while a classical computer obeys well-understood laws of classical physics, a quantum computer is a device that harnesses physical phenomena unique to quantum mechanics (especially Quantum Interference) to perform information processing in an inherently different way.

    In a quantum computer, the fundamental unit of information (called a quantum bit or qubit) is not binary in nature. Mathematically, a qubit is a ket (state) in a two-dimensional Hilbert space. [10] Since the superposition of the two base states has complex numbers as coefficients, the qubit composed of a 0 bit |0> and a 1 bit |1> may be written (superscripts are exponents):

    A|0> + B|1>, where |A|^2 + |B|^2 = 1.

    This qubit property comes directly from laws of quantum mechanics which differ from laws of classical physics. A qubit can exist not only in a state corresponding to the logical state 0 or 1, as in a classical bit, but also in states corresponding to a blend or superposition [Glossary] of these classical states. That is, at the atomic level a qubit can exist as a zero, a one, or simultaneously as BOTH 0 and 1, with a complex coefficient for each state whose squared magnitude gives that state’s probability. This makes current research in Quantum Computing not a continuation of today’s idea of a computer, but, rather, a completely new style of thought that has led to a new discipline of Engineering. Because quantum computers harness characteristics of qubits, they have the potential to be incredibly powerful computational devices, in a greater evolutionary leap than that which adapted the 30-ton mainframe monsters to iPods weighing less than an ounce.
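
    The normalization rule above can be checked with a minimal Python sketch. The particular coefficients A = 3/5 and B = 4i/5 are illustrative choices, not taken from the paper:

```python
# A qubit A|0> + B|1> as a normalized pair of complex amplitudes.
# The coefficients below are illustrative, not from the paper.
A = 3 / 5
B = 4j / 5

# Normalization: |A|^2 + |B|^2 must equal 1.
norm = abs(A) ** 2 + abs(B) ** 2
print(round(norm, 12))  # 1.0

# The squared magnitudes are the probabilities of measuring 0 or 1.
p0, p1 = abs(A) ** 2, abs(B) ** 2
print(round(p0, 12), round(p1, 12))  # 0.36 0.64
```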

    Information, although it can be treated mathematically, is not an abstract concept. Information has to be recorded and stored on media that are fundamentally quantum mechanical. That is why we extend our definition of information as merely a string of bits, 1s and 0s, to examine the consequences of the quantum nature of media for information. The implications of the qubits of quantum information theory are still being investigated and will probably surprise the entire body of experts several times before 2020. Yet to introduce Quantum Computing now, to those who will be the teachers in 2020, we need just a handful of quantum concepts and principles. As we shall see in the Curriculum Development section of this paper, we are forced to introduce some ideas of Ed Fredkin and Charles Bennett about reversible computing and reversible logic gates. But let’s get acquainted with qubits, first. This introduction is expanded and adapted from West’s presentation. [10]

    What is the Matrix? Potential and Power of Quantum Computing

    In classical computing, information is encoded in a sequence of bits, which are manipulated via Boolean [Glossary] logic gates arranged in succession to produce an end result which is output. Similarly, a quantum computer manipulates qubits [Glossary] by executing a series of quantum gates, each a Unitary Transformation [Glossary] acting on a single qubit or pair of qubits. By applying these gates in succession, a quantum computer performs a complicated unitary transformation on a set of qubits in some initial state. The qubits can then be measured, with this measurement serving as the final computational result. This parallel between classical and quantum computing suggests that a classical computer can, in principle, simulate a quantum computer and so do anything a quantum computer can.
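
    The gate-as-unitary-matrix picture is easy to sketch in Python with NumPy. The Hadamard gate used here is the standard textbook definition; the choice of gate is ours, not the paper’s:

```python
import numpy as np

# Single-qubit gates as 2x2 unitary matrices (standard definitions).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # NOT (bit flip)

# Start in |0>, apply H: an equal superposition of |0> and |1>.
ket0 = np.array([1.0, 0.0])
state = H @ ket0
print(state)                                   # [0.70710678 0.70710678]

# A unitary U satisfies U†U = I, so it preserves the state's norm.
print(np.allclose(H.conj().T @ H, np.eye(2)))  # True

# Measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
print(probs)                                   # [0.5 0.5]
```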

    So why bother with quantum computers? Because, although a classical computer can theoretically simulate a quantum computer, it is incredibly inefficient, so much so that a classical computer is effectively incapable of performing many tasks that a quantum computer could perform with ease. The simulation of a quantum computer on a classical one is a computationally hard problem. This is because correlations among quantum bits are qualitatively different from correlations among classical bits, as first explained by John Bell [28 June 1928-1 Oct 1990]:


  3. #3 Dave Bacon
    October 23, 2008

    I have fond memories of my own Trash 80. It took, however, a very long time to draw the Mandelbrot set using that computer.

  4. #4 Geordie
    October 23, 2008

    The trash 80 had the world’s greatest game also (Dungeons of Daggorath).

  5. #6 Dave Bacon
    October 24, 2008

    Hey rrtucci long time no see! Got bored flaming my blog?

    Anyway your link is wrong and doesn’t work… it should be

    Thanks for stopping by and insulting us!!!

  6. #7 Jonathan Vos Post
    October 24, 2008

    Since nobody complained about the opening chunk of my own paper, and it stimulated memory of when dinosaurs walked the PC planet, I’ll give another snippet.

    John Bell [28 June 1928-1 Oct 1990]:

    “The discomfort that I feel is associated with the fact that the observed perfect quantum correlations seem to demand something like the ‘genetic’ hypothesis. For me, it is so reasonable to assume that the photons in those experiments carry with them programs, which have been correlated in advance, telling them how to behave. This is so rational that I think that when Einstein saw that, and the others refused to see it, he was the rational man. The other people, although history has justified them, were burying their heads in the sand. I feel that Einstein’s intellectual superiority over Bohr, in this instance, was enormous; a vast gulf between the man who saw clearly what was needed, and the obscurantist. So for me, it is a pity that Einstein’s idea doesn’t work. The reasonable thing just doesn’t work.” [11,12,13,14]

    A system of only a few hundred qubits exists in a Hilbert space [Glossary] of dimension ~10^90. Simulating it would require a classical computer to work with exponentially large matrices (performing calculations on each individual state, which is also represented as a matrix), so the simulation would take exponentially longer than even a primitive quantum computer: longer than the history of the universe so far.

    Richard Feynman, my mentor [see my Nanotechnology paper in this Proceedings], was among the first [15] to recognize (1982) the potential in quantum superposition for solving such problems tremendously faster. For example, a system of 500 qubits, which is impossible to simulate classically, represents a quantum superposition of as many as 2^500 states. Each state would be classically equivalent to a single list of 500 1s and 0s. Any quantum operation on that system (a particular pulse of radio waves, for instance, whose action might be to execute a Controlled-NOT operation on the 100th and 101st qubits) would simultaneously operate on all 2^500 states. Hence in just one tick of the computer clock, a quantum operation could compute not just on one machine state, as serial computers do, but on 2^500 machine states simultaneously! Eventually, however, observing the system would cause it to collapse into a single quantum state corresponding to a single answer, a single list of 500 1s and 0s, as dictated by the measurement axiom of Quantum Mechanics. The reason this is an exciting result is that this answer, derived from the massive Quantum Parallelism [Glossary] achieved through superposition [Glossary], is the equivalent of performing the same operation on a classical supercomputer with ~10^150 separate processors (which is impossible in our physical universe).
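
    The exponential blow-up is easy to verify: a brute-force state-vector simulation stores 2^n complex amplitudes, so its memory doubles with every added qubit. A quick sketch, assuming 16 bytes per amplitude (two 64-bit floats):

```python
# Memory for a classical state-vector simulation: 2^n complex
# amplitudes at 16 bytes each (two 64-bit floats, our assumption).
for n in (10, 30, 50):
    print(n, "qubits:", 2 ** n, "amplitudes,", 2 ** n * 16, "bytes")

# 500 qubits: ~3.3e150 amplitudes -- vastly more numbers than there
# are atoms in the observable universe (~10^80).
print(float(2 ** 500))
```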

    Scientists and Engineers searched for something interesting for a quantum computer to do. Peter Shor, a research computer scientist at AT&T’s Bell Laboratories in New Jersey, provided such an application by devising the first quantum algorithm of dramatic practical importance. [16] Shor’s algorithm harnesses the power of quantum superposition to rapidly factor very large numbers (on the order of 10^200 and greater, i.e., hundreds of digits long) in a matter of seconds. Applying quantum computers to implement this algorithm challenges the entire field of encryption, where one standard encryption code, known as RSA, [17,18] [Glossary] relies heavily on the difficulty of factoring very large composite numbers into their primes. A computer which can do this easily is naturally of great interest to intelligence agencies throughout the world, to numerous corporations and government agencies that use RSA (previously considered to be “uncrackable”), and to anyone interested in electronic and financial privacy. Quantum Cryptography is a viable industry right now, as an alternative to RSA and other systems that operate over classical communication infrastructure, with “uncrackable” communication lines and devices being sold to government agencies and major corporations by a number of manufacturers.
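
    The quantum speedup in Shor’s algorithm lives entirely in the period-finding step; the surrounding number theory is classical. The sketch below is our own illustration, with a brute-force order finder standing in for the quantum subroutine, to show how a period yields a factor:

```python
from math import gcd
from random import randrange

def order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N). This brute-force loop
    stands in for the quantum period-finding that gives Shor's speedup."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_wrapper(N):
    """Classical reduction: an even order r with a^(r/2) != -1 (mod N)
    yields a nontrivial factor of N via a gcd computation."""
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g                  # lucky guess already shares a factor
        r = order(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            f = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < f < N:
                return f

p = shor_classical_wrapper(15)
print(p, 15 // p)   # a nontrivial factorization of 15
```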

    Encryption is merely one application of quantum computing. Shor devised a set of mathematical operations that can only be performed on a quantum computer, and used many of these in his factorization algorithm. Feynman had earlier asserted that a quantum computer could function as a kind of simulator for quantum physics, perhaps enabling important discoveries in the field. Currently the immense power of quantum computers is a combination of theoretical speculation, and preliminary laboratory development of prototypes; the advent of the first fully functional quantum computer will undoubtedly bring many new and exciting applications.

    [next time, unless David Bacon objects: "A Brief History of Quantum Computing"]

  7. #8 Jonathan Vos Post
    October 25, 2008

    A Brief History of Quantum Computing

    The concept of computing by quantum mechanics was explored in the 1970s and early 1980s by physicists and computer scientists such as Charles H. Bennett [19,20,21,22] of IBM, Paul A. Benioff [23-28] of Argonne National Laboratory, David Deutsch [29,30] of Oxford University, and the late Richard Feynman of Caltech. These scientists pondered the fundamental limits of computation. If Computer Engineering continued to follow Moore’s Law [31] (40 years old the month of this conference), then the continually shrinking size of circuitry packed onto silicon chips would eventually reach a point where individual elements would be no larger than a few atoms. Problems would then arise because, at the atomic scale, the physical laws that govern the behavior and properties of circuits are inherently quantum mechanical in nature, not classical. Could a new kind of computer be devised based on the principles of quantum physics? Feynman was among the first to answer this question, with an abstract model [15] that showed how a quantum system could be used to do computations. He also gave a motivational argument for how such a machine would be able to act as a simulator for quantum physics. That is, a physicist could carry out experiments in quantum physics inside a quantum mechanical computer.

    Within three years it became clear that Feynman’s assertion could eventually lead to a general purpose quantum computer. A crucial 1985 theoretical paper [29] proved that ANY physical process could (in principle) be modeled perfectly by a quantum computer. Quantum computers would be far beyond any classical computer, which (for various reasons) cannot model any physical process with arbitrary precision. After Deutsch published this breakthrough paper, the race was on to find interesting applications for such a hypothetical machine. At first, all that could be found were some rather contrived mathematical problems, until Shor circulated a preprint of a paper [16] describing a method for using quantum computers to crack an important problem in number theory, namely factorization of integers into prime numbers. Shor showed how an ensemble of mathematical operations, designed specifically for a quantum computer, could be organized to enable such a machine to factor huge numbers extremely rapidly, astronomically faster than is possible on any classical computer. Quantum Computing turned, overnight, from academic curiosity to a global concern.

    Current Obstacles and Future Research:

    Quantum information processing has advanced since its conception: engineers have built two- and three-qubit quantum computers capable of some simple arithmetic and data sorting. Major obstacles remain that prevent us from building a quantum computer that would make today’s digital computers obsolete. These difficulties include: (1) error correction, (2) decoherence, and (3) hardware architecture. Error correction [Glossary] sounds self-explanatory at the superficial level, and has evolved into a sophisticated field in classical computing and mathematics. But what errors need correction? In the quantum domain, we worry about those errors that arise directly from decoherence [Glossary]. Such interactions between environment and qubits are unavoidable. They cause breakdown of the information stored in the quantum computer, and thus induce errors in computation. Before any quantum computer will be capable of solving hard problems, research must demonstrate a way to keep decoherence and other potential sources of error at an acceptable level. Thanks to the theory (and now reality) of quantum error correction, first proposed in 1995 [32] and continually developed since, [33,34,35] small-scale quantum computers have been built and the prospects of large quantum computers are improving rapidly.

    Probably the most important idea in this field for the application of error correction is Phase Coherence [Glossary] as a means to extract information and reduce error in a quantum system without actually measuring that system. In 1998, Los Alamos and MIT researchers led by Raymond Laflamme [36-39] managed to spread a single bit of quantum information (qubit) across three nuclear spins in each molecule of a liquid solution of alanine or trichloroethylene molecules. They used nuclear magnetic resonance (NMR). This matters because spreading out the information made it harder to corrupt, and therefore hinted at more reliable quantum computing in the future. Quantum mechanics tells us that directly measuring the state of a qubit invariably destroys the superposition of states in which it exists, “collapsing the wave function” and forcing it to become either a 0 or 1. The technique of spreading out the information allows researchers to use entanglement [Glossary] to study the interactions between states as an indirect method for analyzing the quantum information. Rather than a direct measurement, the group compared the spins to see if any new differences arose between them without learning the information itself. This technique gave them the ability to detect and fix errors in a qubit’s phase coherence and thus maintain a higher level of coherence in the quantum system. This breakthrough has silenced some skeptics, and given great hope for true believers.
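
    The trick of comparing spins without reading the data has a simple classical analogue: the three-bit repetition code, where pairwise parities (the “syndrome”) locate an error without revealing the encoded bit. A minimal sketch of ours, not from the Laflamme experiment (the quantum version must also protect phase information):

```python
def syndrome(bits):
    """Compare pairs of bits (parities) without reading the encoded
    value -- a classical analogue of the three-spin comparison above."""
    s1 = bits[0] ^ bits[1]
    s2 = bits[1] ^ bits[2]
    return (s1, s2)

def correct(bits):
    """Repair a single bit flip guided by the syndrome alone."""
    flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(bits)]
    fixed = list(bits)
    if flip is not None:
        fixed[flip] ^= 1
    return fixed

# Encode one logical bit across three physical bits, flip one, recover.
corrupted = [1, 0, 1]              # error on bit 1
print(syndrome(corrupted))         # (1, 1): pinpoints bit 1 ...
print(correct(corrupted))          # [1, 1, 1] ... without reading the data
```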

    Currently, research in quantum error correction continues with Preskill’s group at Caltech [40,41] and Professor Jeff Kimble’s quantum optics group at Caltech. [42-44] Related research is proceeding at a rapid rate by Gottesman at Microsoft, at Los Alamos, and elsewhere. Research is underway to battle the destructiveness of decoherence [Glossary], to develop better hardware architectures, and to find new quantum algorithms. These pursuits deeply connect to quantum error correction codes and quantum algorithms, which have become hot areas for research. Designs have involved ion traps, cavity quantum electrodynamics (QED), and NMR. Such devices have provided fascinating experiments, but each of these technologies has serious limitations. Ion trap computers are limited in speed by the vibration frequency of the modes in the trap. NMR devices have an exponential attenuation of signal to noise as the number of qubits in a system increases. Cavity QED is slightly more promising; however, it still has only been demonstrated with a few qubits.

    Seth Lloyd of MIT is currently a prominent researcher in quantum hardware. The future of quantum computer hardware architecture is probably quite different from what we understand today. Current research has helped to provide insight as to what obstacles the future will hold for these devices. Engineering Education must keep pace with current research, trends in research, and the theories that limit any quantum computers.

  8. #9 Jonathan Vos Post
    October 26, 2008

    Future Outlook and Curriculum Development:

    In 2005 quantum computers and quantum information technology are in a pioneering stage. Obstacles are being surmounted that will provide the knowledge needed to thrust quantum computers up to their rightful position as the fastest computational machines in existence well before the year 2020. Error correction [Glossary] is nearing a point now where we may have the tools required to build a computer robust enough to adequately withstand the effects of decoherence. Quantum hardware experiments suggest that it will only be a few years before we have devices large enough to test Shor’s and other quantum algorithms. Quantum computers will then emerge as the superior computational devices and perhaps make today’s computers obsolete. Quantum computation began with theoretical physics, but its future lies in the profound effect it will have on the lives of all human beings. Engineering Educators must ponder the implications to individuals and societies from computers in 2020 that are as far beyond the computers of 2005 as today’s internet is to a person a few centuries ago manipulating an abacus.

    At the higher levels, the references of this paper offer a range of primary and secondary readings that can be the basis of tertiary textbooks and/or lectures. The challenge to curriculum development is that too few people capable of classroom presentations are familiar with even a small fraction of these references. Thus there is a need to do the painful work now of educating in the period 2005-2010 the students who will become the teachers in 2020, when Quantum Computing will already have transformed the world.

    Given the Introduction and History above, Curriculum development clearly needs to focus on computational consequences of the Larger Hilbert Space, Shor’s algorithm, Grover’s algorithm, Quantum lithography, Quantum Key Distribution, Quantum Cryptography, Squeezed Light, and Quantum Feedback. Since this seems forbiddingly technical, the following explanation combining arguments by Professor Tony Hey [45] and by George Hockney [see acknowledgment] puts it in simpler terms. The references to Pauli and Minsky hint at the need for curriculum development which combines Quantum Mechanics texts and modern computer texts.

    At its heart, the basis of quantum computation is Landauer’s observation that all information is ultimately physical. [46,47] Information, encoded as the 1s and 0s of classical computers, must inevitably be recorded by some physical system, whether on paper or silicon or magnetic disks or pits in CD-ROMs and DVDs. This brings us to the key point of Physics in the curriculum development for Quantum Computing. So far as we know today, all ordinary matter is composed of atoms (nuclei orbited by electrons). Hence the interactions and time evolution of atoms are governed by the laws of Quantum Mechanics. Although the weirdness of the quantum world may not seem obvious at first glance, careful teaching reveals to the student that applications of quantum mechanics are all around us. [45] As emphasized by Marvin Minsky [48] at MIT, the very existence of atoms comes not from the chaotic uncertainties of classical mechanics, but rather from the certainties of quantum mechanics with the Pauli exclusion principle, necessitating well-defined and stable atomic energy levels. Without the quantum understanding of the solid state and the band theory of metals, insulators and semiconductors (first developed by scientists, and then embodied by engineers in useful devices), the entire semiconductor industry with its transistors and integrated circuits, and therefore the entire electronic computer industry, could never have been born. In the same way, quantum optics (including lasers), which spawned gigantic industries from fiber-optical communications to music and video CDs and DVDs, has its essence in these intrinsically quantum technologies. So, isn’t every computer today a Quantum Computer? No, that’s not what is meant by this term. Why?

    True, at the bottom (in the sense of Feynman’s visionary 1959 lecture ‘Plenty of room at the bottom’), everything is quantum mechanical. So we can certainly look past several generations of Moore’s Law [Glossary] towards a future of storing bits of information on single atoms or electrons. The problem is, these microscopic objects do not obey Newton’s Laws of classical mechanics. Rather, electrons and atoms evolve and interact according to the Schrodinger equation, which the student must be shown is the analog of Newton’s Law for Quantum Mechanics. Furthermore, students should be cautioned that Schrodinger’s equation is itself merely an approximation for terrestrial speeds and energies. At relativistically high speeds and energies, physicists rely on the Dirac equation and Einstein’s relativity, which predict relativistic mass increase and particle-antiparticle pair creation. Most of the time, on Earth, we can safely ignore these complications and use the non-relativistic version of Quantum Mechanics, exactly as described by Schrodinger’s equation. But to explain the basic concepts of Quantum Computing, we must introduce some ideas of Ed Fredkin [49,50] and Charles Bennett about reversible computing and reversible logic gates. {The detailed explanation of this, of Quantum Logic Gates, and of Grover’s search algorithm has been excised from this short version of the paper at the referees’ demand, and will appear later in the full-length version.} Rolf Landauer has shown that it is the act of discarding information that incurs an unavoidable energy loss. The most efficient possible conventional computer inevitably uses up energy each time it does an irreversible operation and throws away information. It is not learning that takes work, according to Landauer. It is forgetting that takes work, and expends energy! There is no minimum amount of energy needed to compute an answer.
Bennett’s analysis means that we can, in principle, build a computer for any classical computation, to calculate reversibly, very slowly, with an energy as small as we desire, as close to zero as we want.
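
    Landauer’s observation can be put in numbers: erasing one bit at temperature T dissipates at least kT ln 2. A short sketch, assuming room temperature:

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # room temperature, K (our assumption)

# Landauer's bound: minimum energy to erase (forget) one bit.
E_bit = k_B * T * log(2)
print(E_bit)         # ~2.9e-21 J per erased bit

# At ~100,000 discarded bits per floating point operation (the
# Pentium figure quoted in the paper), the Landauer floor per FLOP
# is still tiny -- real chips dissipate many orders of magnitude more.
print(E_bit * 100_000)
```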

    Feynman discusses a reversible computer that calculates for a few steps, then drifts back a bit, ‘uncalculating’ as it goes backwards, before it drifts forward again to eventually complete the calculation with essentially zero loss of energy. It is possible that use of such gates may one day be needed to reduce power consumption of microprocessors implemented in today’s CMOS silicon technology. Right now, an Intel Pentium discards (forgets) roughly 100,000 bits per floating point operation (FLOP). Each discarded bit costs at least the minimum Landauer energy loss. We care because the laws of quantum physics are reversible in time. Thus probability is conserved as a state evolves with time. Technically speaking, the Schrodinger time evolution operator is unitary [Glossary] and preserves the norm of quantum mechanical states. To build a quantum computer with quantum states evolving according to the Schrodinger equation therefore necessarily requires us to use realizations of reversible logic gates such as CN, Toffoli, or Fredkin gates.
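
    Reversibility is easy to check for the Toffoli gate just mentioned: it permutes the eight three-bit inputs, so no information is discarded and no Landauer cost is incurred. A small sketch of ours:

```python
from itertools import product

def toffoli(a, b, c):
    """Toffoli (controlled-controlled-NOT): flip c iff a and b are both 1."""
    return (a, b, c ^ (a & b))

# Reversibility: the gate is a bijection on the 8 three-bit inputs,
# so every output determines its input uniquely.
outputs = [toffoli(*bits) for bits in product([0, 1], repeat=3)]
print(len(set(outputs)))  # 8 -- all outputs distinct

# The Toffoli gate is its own inverse: applying it twice is the identity.
print(all(toffoli(*toffoli(*bits)) == bits
          for bits in product([0, 1], repeat=3)))  # True
```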

    One important application of Quantum Computing is for Cryptography, including Quantum Key Distribution (QKD), which uses Quantum Mechanics to make a communication which cannot, even in principle, be intercepted undetectably. {explanation excised for length in this short version}. There is no telling where the escalation of codemakers and codebreakers will take us, through Quantum Computing. We must be prepared today to have a body of teachers who can help the students of 2020 understand what today cannot be predicted in any detail.
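
    Though the QKD explanation was excised, the flavor of BB84 (the 1984 Bennett-Brassard protocol) fits in a few lines: when Alice and Bob happen to choose the same measurement basis they share a bit, and publicly comparing bases (never the bits themselves) sifts out a shared key. A toy simulation of ours, with no eavesdropper modeled:

```python
import random

# Toy BB84 sketch. Bases: 0 = rectilinear, 1 = diagonal.
n = 1000
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
bob_bases   = [random.randint(0, 1) for _ in range(n)]

# Bob's measurement: correct bit when bases match, random otherwise.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases, keep only positions that agree.
key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

print(len(key_a))       # about n/2 positions survive sifting
print(key_a == key_b)   # True: with no eavesdropper the keys agree
```

    An eavesdropper measuring in random bases would disturb roughly a quarter of the sifted bits, which is what makes interception detectable in principle.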

    Who Will Teach the Teachers?

    Who will teach the teachers? Government? Industry? Engineering Societies? No one of these legs of the tripod can do the entire job. The discussions above show a unique mixture of concerns of Government, such as the National Security and Homeland Security issues of the USA using Quantum Computers for decrypting covert communications of terrorists who are armed only with conventional computers. But the gap between the urgent needs of government agencies and the manifest problems of government-funded public schools is almost impossible to bridge.

    Our discussions keep cycling back to concerns of Industry, in terms of fabrication of faster and smaller devices according to Moore’s Law. But Industry cannot, however huge its Research & Development budget, also educate generations of students. IBM Research Laboratories, and Microsoft Research, and Google, and Lucent, cannot themselves mobilize the hundreds of thousands of teachers necessary to prepare for 2020.

    Our discussions keep invoking deep results of professional societies for Electrical Engineering, for Computer Science, for Physics, and for other disciplines. The lines between those disciplines are blurred by the new concerns of Quantum Computing. Hence the borders between disciplines in High Schools, Colleges, and Universities are just as paralyzed by the revolutionary nature of Quantum Computing as are the borders between the professional societies that have evolved for mature researchers in those disciplines. It is my belief that a new system that rests with stability on all three branches of stakeholders (Government, Industry, and Engineering Societies) will be needed to teach the teachers. The American Society for Engineering Education can play a crucial policy-making role in this new consolidation of resources.


    Over the last twenty years, improved methods of controlling and understanding quantum coherence have not only enabled experimental tests of fundamental Physics but also created new practical applications. We have tried to discuss this at a level appropriate to Engineering Education, with an introduction, some technical explanation, and a summary of the history of the field. Again and again we were forced to combine old ideas with new interpretations. We needed to reach deep into the inventory of concepts from Algorithms, Boolean Logic, Complexity, Newtonian Physics, and Quantum Physics.

    More advanced applications such as noise reduction using “squeezed light,” quantum lithography, and quantum computing may become possible as our ability to control quantum coherence improves. Yet the notions of ‘entanglement’ and ‘coherence’ are difficult for today’s experts to convey even to experts from other fields, let alone to those students who will become the next generation of engineering teachers. Therefore, it will be important for instructors in physics and micro-scale engineering to become conversant with the basic ideas of these new techniques, and to operate within a new framework of interdisciplinary Engineering Education that draws on resources from Government, Industry, and Professional Societies.

    The PC sitting in front of me is fundamentally no different from its 30-ton ancestors, some of which were equipped with some 18,000 vacuum tubes and 500 miles of wiring. Yet Quantum Computing is not a continuation of today’s idea of a computer but, rather, a completely new style of thought that has led to a new discipline of Engineering, and may lead to incredibly powerful computational devices, in a greater evolutionary leap than the one that shrank the 30-ton mainframe monsters into iPods weighing less than an ounce.

    We must learn an obscure language of qubits, Toffoli gates, quantum entanglement, and superposition. Then we must discover how to make that language clear to students from all walks of life, all backgrounds, and all cultures. Engineering Educators must ponder the implications for individuals and societies of computers in 2020 that are as far beyond the computers of 2005 as today’s Internet is beyond an abacus of a few centuries ago.
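    Two items of that vocabulary can be made concrete with a few lines of classical simulation (my own illustrative sketch, not from the paper): a Hadamard gate turning a definite bit into an equal superposition, and a Toffoli (CCNOT) gate, which flips a target bit only when both control bits are 1 and is the standard universal gate of reversible logic.

    ```python
    import math

    # A qubit as a pair of amplitudes over |0> and |1>.
    def hadamard(amp0, amp1):
        s = 1 / math.sqrt(2)
        return (s * (amp0 + amp1), s * (amp0 - amp1))

    # |0> -> (|0> + |1>)/sqrt(2): an equal superposition of both values.
    plus = hadamard(1.0, 0.0)

    def toffoli(state):
        """Apply CCNOT to a 3-qubit state given as {basis string: amplitude}."""
        out = {}
        for basis, amp in state.items():
            c1, c2, t = basis
            if c1 == '1' and c2 == '1':          # both controls set:
                t = '0' if t == '1' else '1'     # flip the target bit
            out[c1 + c2 + t] = out.get(c1 + c2 + t, 0.0) + amp
        return out

    print(plus)                    # (0.7071..., 0.7071...)
    print(toffoli({'110': 1.0}))   # {'111': 1.0} -- a reversible AND
    print(toffoli({'010': 1.0}))   # {'010': 1.0} -- controls not both set
    ```

    The point for the classroom is that the Toffoli gate computes AND without erasing information, which connects directly to the Bennett and Landauer results on reversible, dissipation-free computation cited in the references.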

  9. #10 Jonathan Vos Post
    October 26, 2008

    References (this section reduced 90% by demand of referee; see complete version elsewhere):

    1. John Mauchley, personal communication, at lunch with Theodor Nelson (father of hypertext and hypermedia), King-of-Prussia, Pennsylvania, 1974.

    2. Charles Babbage, Passages from the life of a philosopher, London, 1864.

    3. H. P. Babbage, Babbage’s calculating engines, London, 1889.

    4. [Babbage, 1998]

    5. Hyman, Charles Babbage : pioneer of the computer, Oxford, 1982.

    6. P. Morrison and E. Morrison, Charles Babbage and his calculating engines, New York, 1961.

    7. J. A. N. Lee, “Konrad Zuse”, September 1994

    I would like to take this opportunity to thank Professor J. A. N. Lee for superbly educating me on the true history of computing, at the University of Massachusetts, Amherst, in 1973-1975.

    8. Claude E. Shannon, A mathematical theory of communication. Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October, 1948.

    9. Claude E. Shannon and Warren Weaver: The Mathematical Theory of Communication. The University of Illinois Press, Urbana, Illinois, 1949

    10. Jacob West, “The Quantum Computer: An Introduction”, 28 April 2000

    11. Jeremy Bernstein, Quantum Profiles, Princeton University Press, 1991, p. 84, quotes John Stewart Bell (1928-1990), author of “Bell’s Theorem” (or “Bell’s Inequality”).

    12. Bell, John S., The Speakable and Unspeakable in Quantum Mechanics, Cambridge University Press, 1987.

    13. Pearle, P, Hidden-Variable Example Based upon Data Rejection, Physical Review D, 2, 1418-25 (1970)

    14. Aczel, Amir D, Entanglement: The Greatest Mystery in Physics, New York: Four Walls Eight Windows, 2001

    15. R. P. Feynman, “Simulating physics with computers”, Int. J. Theor. Phys. 21, 467 (1982).

    reprinted in A.J.G. Hey (Ed.):’Feynman and Computation’, see also R. P. Feynman, in A.J.G. Hey and R.W. Allen, (Eds.), The Feynman lectures on computation, Longman, Reading, MA: Addison Wesley, 1996.

    16. Shor, P. W., Algorithms for quantum computation: Discrete logarithms and factoring, in Proceedings of the 35th Annual Symposium on Foundations of Computer Science, IEEE Computer Society Press (1994), reprinted in S. Goldwasser (Ed.): Proc. 35th A. Symp. on the Foundations of Computer Science, p.124, (IEEE, Los Alamitos, CA, 1994).

    17. Rivest, R.; Shamir, A.; and Adleman, L. “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems.” MIT Memo MIT/LCS/TM-82, 1977.

    18. Rivest, R.; Shamir, A.; and Adleman, L. “A Method for Obtaining Digital Signatures and Public Key Cryptosystems.” Comm. ACM 21, 120-126, 1978. see also “The Mathematical Guts of RSA Encryption”

    19. Charles H. Bennett, “Logical Reversibility of Computation” IBM J. Res. Develop. 17, 525 (1973).

    20. Charles H. Bennett “The Thermodynamics of Computation– a Review” Internat. J. Theoret. Phys. 21, pp. 905-940 (1982).

    21. Charles H. Bennett and R. Landauer “Fundamental Physical Limits of Computation”, Scientific American 253:1 48-56 (July 1985).

    22. Charles H. Bennett “Demons, Engines, and the Second Law” Scientific American 257:5, 108-116 (November 1987). {12 more Bennett citations excised from this short version of paper}

    23. Paul Benioff, “Operator Valued Measures in Quantum Mechanics: Finite and Infinite Processes”
    J. Math. Phys. 13, 231-242 (1972)

    24. Paul Benioff , “Decision Procedures in Quantum Mechanics”, J. Math. Phys. 13, 908-915 (1972)

    25. Paul Benioff, “Procedures in Quantum Mechanics without Von Neumann’s Projection Axiom”
    J. Math. Phys. 13, 1347-1355 (1972)

    26. Paul Benioff, “Finite and infinite measurement sequences in quantum mechanics and randomness: The Everett interpretation”, J. Math. Phys. 18, 2289-2295 (1977)

    27. Paul Benioff, “The Computer as a Physical System: A Microscopic Quantum Mechanical Hamiltonian Model of Computers as Represented by Turing Machines”, J. Stat. Phys. 22, 563 (1980)

    28. Paul Benioff, “Quantum Mechanical Models of Turing Machines That Dissipate No Energy”,
    Phys. Rev. Lett. 48, 1581-1585 (1982) {8 more Benioff citations excised from this short version of paper}

    29. D. Deutsch, Proc. Roy. Soc. London, Ser. A 400, 97 (1985).

    30. D. Deutsch, A. Ekert, “Quantum Computation,” Physics World, March (1998).

    31. Gordon E. Moore , “Cramming More Components Onto Integrated Circuits”, Electronics, April 19, 1965.

    32. P. W. Shor, Phys. Rev. A 52, R2493 (1995).

    33. A. M. Steane, Phys. Rev. Lett. 77, 793 (1996).

    34. A. M. Steane, Proc. Roy. Soc. A. 452, 2551 (1996).

    35. A. R. Calderbank and P. W. Shor, Phys. Rev. A. 54, 1098 (1996).

    36. Raymond Laflamme, paper on decoherence and Shor’s algorithm.

    37. Raymond Laflamme, paper on decoherence and a simple quantum computer.

    38. Raymond Laflamme, paper on error detection in quantum cryptography and quantum computer memories.

    39. Raymond Laflamme, paper on a perfect quantum code.

    40. J. Preskill, “Quantum Computing: Pro and Con,” quant-ph/9705032 v3, 26 Aug 1997.

    41. J. Preskill, “Battling Decoherence: The Fault-Tolerant Quantum Computer,” Physics Today, June (1999).

    42. Jeff Kimble’s group at Caltech; see the group’s bibliography with hotlinks.

    43. T. C. Zhang, K. W. Goh, C. W. Chou, P. Lodahl, and H. J. Kimble, “Quantum teleportation of light beams”, Phys. Rev. A. 67, 033802 (2003); available as quant-ph/0207076.

    44. J. McKeever, A. Boca, A. D. Boozer, R. Miller, J. R. Buck, A. Kuzmich and H. J. Kimble, “Deterministic Generation of Single Photons from One Atom Trapped in a Cavity,” Science 303, 1992 (2004), published online February 26, 2004; 10.1126/science.1095232 (Science Express Reports) (this link provides access to the pdf file without an AAAS membership).

    45. A. J. G. Hey and P. Walters, The Quantum Universe, Cambridge University Press, 1987.

    46. R. Landauer, “Irreversibility and heat generation in the computing process”, IBM J. Res. Dev., 5 p.183 (1961); reprinted in A.J.G. Hey (Ed.):Feynman and computation, Longman, Reading, MA: Addison Wesley, 1998.

    47. J.A.Wheeler: ‘Information, physics, quantum: the search for links’, reprinted in A.J.G. Hey (Ed.): ‘Feynman and computation’, originally published in Proceedings of 3rd Int. Symp. Foundations of Quantum Mechanics, Tokyo, p.354 (1989).

    48. Marvin Minsky, “Richard Feynman and cellular vacuum”, in A.J.G. Hey (Ed.): ‘Feynman and computation’.

    49. E. Fredkin and T. Toffoli, Int. J. Theor. Phys., 21 (1982), p.219

    50. E. Fredkin, unpublished lecture given at Southampton, September 1997.

    51. Christine Carmichael and Steven Smith, “Demonstration of Beats with a Double-Driven String” Phys. Teach. 42, 462 (2004)

    52. Christine Carmichael, “Intrinsic Magnetic Aftereffect”, Wagga-Wagga, New South Wales, 1978

    53. Christine Carmichael, “Magnetic Aftereffect in Dy(Co,Ni)2 Alloys”, University of Warwick, England, 1979

    54. Christine Carmichael, “Investigation of the Intrinsic Magnetic Aftereffect in Dy(Co,Ni)2 Alloys”, Wagga-Wagga, Australia, 1979

    55. Christine Carmichael, “Origin of the Recrystallization Texture in Rolled Low Zinc Brass”, International Conference on the Strength of Metals and Alloys, Melbourne, Australia, 1982

    56. Christine Carmichael, “Mechanism of Intrinsic Magnetic Aftereffect”, International Conference on Magnetism, Kyoto, Japan, 1982

    57. Christine Carmichael, “Gallium Arsenide and Superchips”, International Computer Fair, University of Washington, Seattle, Washington, 1986

    58. Christine Carmichael, “Superchips”, Information Systems Forum, Olympia, Washington, 1986

    59. Christine Carmichael, “Deformation Processes in Hot Worked Copper and Alpha Brass”, Acta Metallurgica, Vol. 34, No. 11, pp. 2247-2257, 1986

    60. Christine Carmichael, “Materials of the Future”, Information Systems Forum, Olympia, Washington, 1989

    61. Christine Carmichael, “Stacked p-FET Dosimeter for the STRV-2 MWIR Detector: A Joint US-UK Project”, IEEE Transactions in Nuclear Science , 1996

    62. Post et al., “Imaginary Mass, Momentum, and Acceleration: Physical or Nonphysical?” [Proceedings of the Fifth International Conference on Complexity Science, 17-21 May 2004] Jonathan Vos Post, Andrew Carmichael Post, and Professor Christine Carmichael, Woodbury University

    63. Post, et al., “The Implications of Peter Lynds ‘Time and Classical and Quantum Mechanics: Indeterminacy vs Discontinuity’ for Mathematical Modeling” [Proceedings, North American Association for Computation in the Social and Organizational Sciences, 2004], Professor Philip V. Fellman, Southern New Hampshire University;

    Maurice Passman; Professor Jonathan Vos Post; Professor Christine Carmichael, Woodbury University; Andrew Carmichael Post, California State University Los Angeles

    DR. CHRISTINE CARMICHAEL (refs. 51-61): biography in her solo paper at this conference.

    PROF. JONATHAN VOS POST (refs. 62-63): biography in his solo paper at this conference.

    Acknowledgment: Thanks to George Hockney, Ph.D., formerly of FermiLab and the NASA/JPL Quantum Computing Group for assistance throughout the research and writing of this paper. More of his explanations to appear in the long version of this paper.

  10. #11 Jonathan Vos Post
    October 28, 2008

    I added the References section of the paper a day or two ago, but it probably vanished onto the ScienceBlogs queue of Things with More Than One URL in them. Let me know, please, if I should try again, with URLs suppressed, and all citations only to Dead Tree Archives.

    And, speaking of the great-grandfather of Quantum Computing, Richard Feynman, there’s a nice little (technical) paper on the arXiv today, which is prefaced by these two quotations:

    “As it was, as soon as I heard Feynman describe his path integral approach to quantum mechanics during a lecture at Cornell everything became clear at once; and there remained only matters of rigor.” [1]

    “Hardly a month passes without someone discovering yet another application [of the Feynman-Kac formula]. Remarkable and, of course, highly gratifying.” [2]
    Mark Kac (1914 – 1984)

    I’ve yet to have a paper requested for rewrite by a referee or editor with a phrase as elegant as “there remained only matters of rigor.”

    arXiv:0810.2495 (replaced) [ps, pdf, other]
    Title: Three useful bounds in quantum mechanics – easily obtained by Wiener integration
    Authors: Hajo Leschke, Rainer Ruder
    Comments: To be published (in shortened and different form) by World Scientific in the proceedings of a conference on path integrals in Dresden 09/2007. Eds.: W. Janke and A. Pelster
    Subjects: Quantum Physics (quant-ph); Mathematical Physics (math-ph)