Beyond Quantum Supremacy: The Hunt for Useful Quantum Computers


Occasionally Alán Aspuru-Guzik has a movie-star moment, when fans half his age will stop him in the street. “They say, ‘Hey, we know who you are,’” he laughs. “Then they tell me that they also have a quantum start-up and would love to talk to me about it.” He doesn’t mind a bit. “I don’t usually have time to talk, but I’m always happy to give them some tips.” That affable approach is not uncommon in the quantum-computing community, says Aspuru-Guzik, who is a computer scientist at the University of Toronto and co-founder of Zapata Computing in Cambridge, Mass. Although grand claims have been made about a looming revolution in computing, and private investment has been flowing into quantum technology, it is still early days, and no one is sure whether it is even possible to build a useful quantum computer.

Today’s quantum machines have at best a few dozen quantum bits, or qubits, and they are often beset by computation-destroying noise. Researchers are still decades—and many thousands of qubits—away from general-purpose quantum computers, ones that could do long-heralded calculations such as factoring large numbers. A team at Google has reportedly demonstrated a quantum computer that can outperform conventional machines, but such “quantum supremacy” is expected to be extremely limited. For general applications, 30 years is “not an unrealistic timescale,” says physicist John Preskill of the California Institute of Technology. Some researchers have raised the possibility that, if quantum computers fail to deliver anything of use soon, a quantum winter will descend: enthusiasm will wane and funding will dry up before researchers get anywhere close to building full-scale machines. “Quantum winter is a real concern,” Preskill says. Yet he remains upbeat because the slow progress has forced researchers to adjust their focus and see whether the devices they have already built might be able to do something interesting in the near future.

Judging from a flurry of papers published during the past few years, it’s a definite possibility. This is the era of the small, error-prone, or “noisy intermediate-scale quantum” (NISQ), machine, as Preskill has put it. And so far it has turned out to be a much more interesting time than anyone had anticipated. Although the results are still quite preliminary, algorithm designers are finding work for NISQ machines that could have an immediate impact in chemistry, machine learning, materials science and cryptography—offering insights into the creation of chemical catalysts, for example. These innovations are also provoking unexpected progress in conventional computing. All this activity is running alongside efforts to build bigger, more robust quantum systems. Aspuru-Guzik advises people to expect the unexpected. “We’re here for the long run,” he says. “But there might be some surprises tomorrow.”

Fresh Prospects

Quantum computing might feel like a 21st-century idea, but it came to life the same year that IBM released its first personal computer. In a 1981 lecture, physicist Richard Feynman pointed out that the best way to simulate real-world phenomena that have a quantum-mechanical basis, such as chemical reactions or the properties of semiconductors, is with a machine that follows quantum-mechanical rules. Such a computer would make use of entanglement, a phenomenon unique to quantum systems. With entanglement, a particle’s properties are affected by what happens to other particles with which it shares intimate quantum connections. These links give chemistry and many branches of materials science a complexity that defies simulation on classical computers. Algorithms designed to run on quantum computers aim to make a virtue of these correlations, performing computational tasks that are impossible on conventional machines.

Yet the same property that gives quantum computers such promise also makes them difficult to operate. Noise in the environment, whether from temperature fluctuations, mechanical vibrations or stray electromagnetic fields, weakens the correlations among qubits, the computational units that encode and process information in the computer. That degrades the reliability of the machines, limits their size and compromises the kinds of computation that they can perform. One potential way to address the issue is to run error-correction routines. Such algorithms, however, require their own qubits—the theoretical minimum is five error-correcting qubits for every qubit devoted to computation—adding a lot of overhead costs and further limiting the size of quantum systems.

Some researchers are focusing on hardware. Microsoft Quantum’s multinational team is attempting to use exotic “topological particles” in extremely thin semiconductors to construct qubits that are much more robust than today’s quantum systems. But these workarounds are longer-term projects, and many researchers are focusing on what can be done with the noisy small-scale machines that are available now—or will be in the next five to 10 years. Instead of aiming for a universal, error-corrected quantum computer, for example, physicist Jian-Wei Pan and his team at the University of Science and Technology of China in Hefei are pursuing short- and mid-term targets. That includes quantum supremacy and developing quantum-based simulators that can solve meaningful problems in areas such as materials science. “I usually refer to it as ‘laying eggs along the way,’” he says.

Researchers at Zapata Computing, including co-founder Alán Aspuru-Guzik (fourth from left), are building quantum algorithms for today’s systems. Credit: Doug Levy

Bert de Jong of Lawrence Berkeley National Laboratory has his eye on applications in chemistry, such as finding alternatives to the Haber process for the manufacture of ammonia. At the moment, researchers must make approximations to run their simulations on classical machines, but that approach has its limits. “To enable large scientific advances in battery research or any scientific area relying on strong electron correlation,” he says, “we cannot use the approximate methods.” NISQ systems won’t be able to perform full-scale chemistry simulations. But when combined with conventional computers, they might demonstrate an advantage over existing classical simulations. “The classically hard part of the simulation is solved on a quantum processor, while the rest of the work is done on a classical computer,” de Jong says.

This kind of hybrid approach is where Aspuru-Guzik earned his fame. In 2014 he and his colleagues devised an algorithm called the variational quantum eigensolver (VQE), which uses conventional machines to optimize guesses. Those guesses might be about the shortest path for a traveling salesperson, the best shape for an aircraft wing or the arrangement of atoms that constitutes the lowest energy state of a particular molecule. Once that best guess has been identified, the quantum machine searches through the nearby options. Its results are fed back to the classical machine, and the process continues until the optimum solution is found. As one of the first ways to use NISQ machines, VQE had an immediate impact, and teams have used it on several quantum computers to find molecular ground states and explore the magnetic properties of materials.
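To make that division of labor concrete, here is a minimal sketch of a VQE-style loop, simulated entirely in NumPy. The single-qubit "Hamiltonian," the one-parameter trial state and the parameter sweep are all illustrative stand-ins: on real hardware, the energy estimate would come from measurements on a quantum processor, and a proper classical optimizer would refine the parameters.

```python
import numpy as np

# Hypothetical single-qubit "Hamiltonian" standing in for a molecular problem.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def ansatz(theta):
    """Parameterized trial state |psi(theta)> = cos(theta/2)|0> + sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """The step a quantum processor would perform: estimate <psi|H|psi>."""
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# Classical outer loop: a simple parameter sweep plays the role of the
# optimizer that refines the guess fed back to the quantum device.
thetas = np.linspace(0.0, 2.0 * np.pi, 400)
best_theta = min(thetas, key=energy)
print(f"best theta = {best_theta:.3f}, estimated ground energy = {energy(best_theta):.4f}")
print(f"exact ground energy for comparison = {np.linalg.eigvalsh(H)[0]:.4f}")
```

For this toy problem the sweep recovers the exact ground-state energy, because the one-parameter ansatz happens to reach the true ground state; the point is the back-and-forth structure, not the chemistry.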

That same year Edward Farhi, then at the Massachusetts Institute of Technology, proposed another heuristic, or best-guess, approach called the quantum approximate optimization algorithm (QAOA). The QAOA, another quantum-classical hybrid, performs what is effectively a game of quantum educated guessing (a small simulation sketch follows below). The only application so far has been fairly obscure—optimizing a process for dividing up graphs—but the approach has already generated some promising spin-offs, says Eric Anschuetz, a graduate student at M.I.T., who has worked at Zapata. One of those, devised by Anschuetz and his colleagues, is an algorithm called variational quantum factoring (VQF), which aims to bring the encryption-breaking, large-number-factoring capabilities of quantum processing to NISQ-era machines.
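To give a flavor of that educated guessing, the sketch below simulates a depth-one QAOA run in NumPy on a toy graph-cutting (MaxCut) instance, a triangle of three nodes. The graph, the two angles and the coarse grid search are illustrative choices made here, not part of any published experiment; real runs would execute the circuit on quantum hardware rather than simulate the state vector.

```python
import numpy as np
from itertools import product

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

n = 3
edges = [(0, 1), (1, 2), (0, 2)]   # a triangle; its maximum cut has size 2

def kron_ops(ops):
    """Tensor a list of single-qubit operators into one 2^n x 2^n matrix."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Cost operator: number of cut edges, C = sum over edges of (I - Z_i Z_j) / 2.
zz_sum = sum(kron_ops([Z if q in e else I2 for q in range(n)]) for e in edges)
cost_diag = np.real(np.diag((len(edges) * np.eye(2 ** n) - zz_sum) / 2))

def qaoa_expectation(gamma, beta):
    """Expected cut size after one cost layer and one mixer layer."""
    psi = np.ones(2 ** n, dtype=complex) / np.sqrt(2 ** n)   # uniform superposition
    psi = np.exp(-1j * gamma * cost_diag) * psi               # cost layer (diagonal)
    rot = np.cos(beta) * I2 - 1j * np.sin(beta) * X           # e^{-i beta X}
    for q in range(n):                                        # mixer on every qubit
        psi = kron_ops([rot if k == q else I2 for k in range(n)]) @ psi
    return float(np.real(psi.conj() @ (cost_diag * psi)))

# Classical outer loop: a coarse grid search over the two angles (gamma, beta).
grid = np.linspace(0.0, np.pi, 40)
best = max(product(grid, grid), key=lambda gb: qaoa_expectation(*gb))
print(f"best expected cut size = {qaoa_expectation(*best):.3f} (maximum cut = 2)")
```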

Until VQF, the only known quantum algorithm for such work was one called Shor’s algorithm. That approach offers a fast route to factoring large numbers but is likely to require hundreds of thousands of qubits to go beyond what is possible on classical machines. In a paper published in 2019, Zapata researchers suggest that VQF might be able to outperform Shor’s algorithm on smaller systems within a decade. Even so, no one expects VQF to beat a classical machine in that time frame. Others are looking for more general ways to make the most of NISQ hardware. Instead of diverting qubits to correct noise-induced errors, for example, some scientists have devised a way to work with the noise. With “error mitigation,” the same routine is run on a noisy processor multiple times. By comparing the results of runs of different lengths, researchers can learn the systematic effect of noise on the computation and estimate what the result would be without noise.
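The extrapolation idea can be sketched in a few lines. In the toy model below, an assumed exponential noise bias and an invented target value stand in for real hardware behavior; the circuit is "run" at deliberately amplified noise levels, and a simple linear fit estimates the zero-noise result.

```python
import numpy as np

rng = np.random.default_rng(0)
EXACT = -1.137   # hypothetical noiseless value (invented for illustration)
DECAY = 0.15     # assumed systematic bias per unit of noise amplification

def noisy_measurement(scale, shots=2000):
    """Stand-in for running the same circuit with its noise stretched by `scale`."""
    biased = EXACT * np.exp(-DECAY * scale)                 # systematic noise effect
    return biased + rng.normal(0, 0.005 / np.sqrt(shots))   # statistical shot noise

scales = np.array([1.0, 1.5, 2.0, 3.0])                     # noise-amplification factors
values = np.array([noisy_measurement(s) for s in scales])

# Fit the trend and read off the intercept at zero noise. The linear fit only
# approximately cancels the exponential bias, but the extrapolated value lands
# much closer to the exact answer than the raw (scale = 1) measurement does.
slope, intercept = np.polyfit(scales, values, 1)
print(f"raw: {values[0]:.4f}   extrapolated: {intercept:.4f}   exact: {EXACT}")
```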

The approach looks particularly promising for chemistry. In March 2019 a team led by physicist Jay Gambetta of IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y., showed that error mitigation can improve chemistry computations performed on a four-qubit computer. The team used the approach to calculate basic properties of the molecules hydrogen and lithium hydride, such as how their energy states vary with interatomic distance. Although single, noisy runs did not map onto the known solution, the error-mitigated result matched it almost exactly.

Errors might not even be a problem for some applications. Vedran Dunjko, a computer scientist and physicist at the University of Leiden in the Netherlands, notes that the kinds of tasks performed in machine learning, such as labeling images, can cope with noise and approximations. “If you’re classifying an image to say whether it is a human face, or a cat, or a dog, there is no clean mathematical description of what these things look like—and nor do we look for one,” he says.

Fuzzy Future

Gambetta’s team at IBM has also been pursuing quantum machine learning for NISQ systems. In early 2019, while working with researchers at the University of Oxford and at M.I.T., the group reported two quantum machine-learning algorithms that are designed to pick out features in large data sets. It is thought that as quantum systems get bigger, their data-handling capabilities should grow exponentially, ultimately allowing them to handle many more data points than classical systems can. The algorithms provide “a possible path to quantum advantage,” the team wrote.

But as with other examples in the machine-learning field, no one has yet managed to demonstrate a quantum advantage. In the era of NISQ computing, there is always a “but.” Zapata’s factoring algorithm, for instance, might never factor numbers faster than classical machines. No experiments have been done on real hardware yet, and there is no way to definitively, mathematically prove superiority.

Other doubts are arising. Gian Giacomo Guerreschi and Anne Matsuura of Intel Labs in Santa Clara, Calif., performed simulations of Farhi’s QAOA algorithms and found that real-world problems with realistically modeled noise do not fare well on machines the size of today’s NISQ systems. “Our work adds a word of caution,” Guerreschi says. “If order-of-magnitude improvements to the QAOA protocols are not introduced, it will take many hundreds of qubits to outperform what can be done on classical machines.” One general problem for NISQ computing, Dunjko points out, comes down to time.

False-color image of a seven-qubit system that has been used for quantum chemistry computations. Credit: "Error Mitigation Extends the Computational Reach of a Noisy Quantum Processor," by Abhinav Kandala et al., in Nature, Vol. 567; March 28, 2019

Conventional computers can effectively operate indefinitely. A quantum system can lose its correlations, and thus its computing power, in fractions of a second. As a result, a classical computer does not have to run for very long before it can outstrip the capabilities of today’s quantum machines. NISQ research has also created a challenge for itself by focusing attention on the shortcomings of classical algorithms. It turns out that many of those, when investigated, can be improved to the point at which quantum algorithms cannot compete.

In 2016, for instance, researchers developed a quantum algorithm that could draw inferences from large data sets. It is known as a type of recommendation algorithm because of its similarity to the “you might also like” algorithms used online. Theoretical analysis suggested that this scheme was exponentially faster than any known classical algorithm. But in July 2018 computer scientist Ewin Tang, then an undergraduate student at the University of Texas at Austin, formulated a classical algorithm that worked even faster. Tang has since generalized her tactic, taking processes that make quantum algorithms fast and reconfiguring them so that they work on classical computers. This has allowed her to strip the advantage from a few other quantum algorithms, too.

Despite the thrust and parry, researchers say it is a friendly field and one that is improving both classical computing and quantum approaches. “My results have been met with a lot of enthusiasm,” says Tang, who is now a Ph.D. student at the University of Washington. For now, however, researchers must contend with the fact that there is still no proof that today’s quantum machines will yield anything of use. NISQ could simply turn out to be the name for the broad, possibly featureless landscape researchers must traverse before they can build quantum computers capable of outclassing conventional ones in helpful ways.

“Although there were a lot of ideas about what we could do with these near-term devices,” Preskill says, “nobody really knows what they are going to be good for.” De Jong, for one, is okay with the uncertainty. He sees the short-term quantum processor as more of a lab bench—a controlled experimental environment. The noise component of NISQ might even be seen as a benefit because real-world systems, such as potential molecules for use in solar cells, are also affected by their surroundings. “Exploring how a quantum system responds to its environment is crucial to obtain the understanding needed to drive new scientific discovery,” he says.

For his part, Aspuru-Guzik is confident that something significant will happen soon. As a teenager in Mexico, he used to hack phone systems to get free international calls. He says he sees the same adventurous spirit in some of the young quantum researchers he meets—especially now that they can effectively “dial in” and try things out on the small-scale quantum computers and simulators made available by companies such as Google and IBM. This ease of access, he thinks, will be key to working out the practicalities.

“You have to hack the quantum computer,” Aspuru-Guzik says. “There is a role for formalism, but there is also a role for imagination, intuition and adventure. Maybe it’s not about how many qubits we have; maybe it’s about how many hackers we have.”
