As a buzzword, quantum computing probably ranks only below AI in terms of hype. Large tech companies such as Alphabet, Amazon, and Microsoft now have substantial research and development efforts in quantum computing. A host of startups have sprung up as well, some boasting staggering valuations. IonQ, for example, was valued at $2 billion when it went public in October through a special-purpose acquisition company. Much of this commercial activity has happened with baffling speed over the past three years.

I am as pro-quantum-computing as one can be: I've published more than 100 technical papers on the subject, and many of my PhD students and postdoctoral fellows are now well-known quantum computing practitioners all over the world. But I'm disturbed by some of the quantum computing hype I see these days, particularly when it comes to claims about how it will be commercialized.

Established applications for quantum computers do exist. The best known is Peter Shor's 1994 theoretical demonstration that a quantum computer can solve the hard problem of finding the prime factors of large numbers exponentially faster than all classical schemes. Prime factorization is at the heart of breaking the universally used RSA-based cryptography, so Shor's factorization scheme immediately attracted the attention of national governments everywhere, leading to considerable quantum-computing research funding.

The only problem? Actually making a quantum computer that could do it. That depends on implementing an idea pioneered by Shor and others called quantum error correction, a process to compensate for the fact that quantum states disappear quickly because of environmental noise (a phenomenon called "decoherence"). In 1994, scientists thought that such error correction would be easy because physics allows it. But in practice, it is extremely difficult.

The most advanced quantum computers today have dozens of decohering (or "noisy") physical qubits. Building a quantum computer that could crack RSA codes out of such components would require many millions, if not billions, of qubits. Only tens of thousands of these would be used for computation (so-called logical qubits); the rest would be needed for error correction, compensating for decoherence.

The qubit systems we have today are a tremendous scientific achievement, but they take us no closer to having a quantum computer that can solve a problem that anybody cares about. It is akin to trying to make today's best smartphones using vacuum tubes from the early 1900s. You can put 100 tubes together and establish the principle that if you could somehow get 10 billion of them to work together in a coherent, seamless manner, you could achieve all kinds of miracles. What, however, is missing is the breakthrough of integrated circuits and CPUs leading to smartphones; it took 60 years of very difficult engineering to go from the invention of transistors to the smartphone, with no new physics involved in the process.

There are in fact ideas, and I played some role in developing the theories for these ideas, for bypassing quantum error correction by using far-more-stable qubits, in an approach called topological quantum computing. But it turns out that developing topological quantum-computing hardware is also a huge challenge. It is unclear whether extensive quantum error correction or topological quantum computing (or something else, like a hybrid between the two) will be the eventual winner.

Physicists are smart, as we all know (disclosure: I am a physicist), and some physicists are also very good at coming up with substantive-sounding acronyms that stick. The great difficulty in getting rid of decoherence has led to the impressive acronym NISQ, for "noisy intermediate-scale quantum" computer: the idea that small collections of noisy physical qubits could do something useful and better than a classical computer can.
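To make the factoring claim concrete, here is a minimal classical baseline: trial division, the brute-force approach whose cost grows with the square root of the number being factored, and which Shor's quantum algorithm beats exponentially. This sketch and its toy numbers are purely illustrative; real attacks on RSA use far more sophisticated classical algorithms, and none of this code comes from the original article.

```python
# Illustrative sketch only: naive classical factoring by trial division.
# Shor's algorithm finds prime factors exponentially faster than any
# known classical scheme; this shows what "classical" means at its simplest.

def trial_division(n: int) -> list[int]:
    """Return the prime factors of n (with multiplicity), smallest first."""
    factors = []
    d = 2
    while d * d <= n:            # only need divisors up to sqrt(n)
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                    # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(105))       # -> [3, 5, 7]
```

The divisor loop runs on the order of sqrt(n) times; an RSA modulus has hundreds of decimal digits, which is why this kind of search is hopeless classically and why Shor's result drew so much attention.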