Google’s Quantum Computer Hits Key Milestone by Reducing Errors

2023-02-26 02:20:43

Physicists at Google have reached what they describe as their second milestone along the path to a useful quantum computer. At a laboratory in Santa Barbara, California, they have demonstrated that they can lower the error rate of calculations by making their quantum code bigger.

The feat, reported in Nature on 22 February, follows up on a celebrated 2019 experiment in which a Google quantum computer achieved ‘quantum advantage’ — by performing a calculation that would have taken thousands of years on an ordinary computer.

Error correction is an inescapable requirement if quantum computers are to fulfil their promise of solving problems that are beyond the reach of classical machines — such as factoring large whole numbers into primes, or understanding the detailed behaviour of chemical catalysts.

“The Google achievement is impressive, since it is very hard to get better performance with large code size,” says Barbara Terhal, a theoretical physicist who specializes in quantum error correction at the Delft University of Technology in the Netherlands. The improvement is still small, the Google researchers admit, and the error rate needs to drop much more. “It came down by a little; we need it to come down a lot,” said Hartmut Neven — who oversees the quantum-computing division at Google’s headquarters in Mountain View, California — during a press briefing.

Correcting mistakes

All computers are subject to errors. An ordinary computer chip stores information in bits (which can represent 0 or 1) and copies some of the information into redundant ‘error correction’ bits. When an error occurs — as a result of stray electrons crossing an imperfectly insulating barrier, say, or a cosmic-ray particle disturbing the circuit — the chip can automatically spot the problem and fix it.
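The classical redundancy idea can be sketched in a few lines. The repetition code below is a deliberately simple illustration of the principle the article describes — copy the bit, then take a majority vote — not the actual error-correcting code used in real memory chips, which rely on more efficient schemes such as Hamming codes:

```python
def encode(bit, n=3):
    """Copy the bit into n redundant positions (a simple repetition code)."""
    return [bit] * n

def correct(bits):
    """Majority vote: recovers the original bit as long as fewer
    than half of the copies have been flipped by errors."""
    return 1 if sum(bits) > len(bits) // 2 else 0

# A single stray flip — a cosmic-ray strike, say — is fixed automatically:
word = encode(1)
word[0] ^= 1            # one copy gets corrupted
assert correct(word) == 1
```

Because the stored value exists in several copies at once, the chip can compare them and repair any single disagreement — exactly the step that, as the next paragraph explains, quantum information forbids.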

“In quantum information we can’t do that,” said Julian Kelly, Google’s director of quantum hardware, at the press briefing. Quantum computers are based on quantum states called qubits, which can exist in a mixture of ‘0’ and ‘1’ states. A qubit cannot be read out without its full quantum state being irretrievably lost, which means that its information cannot be simply copied onto redundant qubits.

But theoreticians have developed elaborate ‘quantum error correction’ schemes to address this problem. These typically rely on encoding a qubit of information — called a logical qubit — in a collection of physical qubits rather than a single one. The machine can then use some of the physical qubits to check on the health of the logical qubit and correct any errors. The more physical qubits there are, the better they can suppress an error. “The advantage of using multiple qubits for quantum error correction is that it scales,” says Terhal.
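The trick that makes this possible is to measure *relationships between* physical qubits (parity checks) rather than any single qubit's value. The toy model below illustrates that idea for the textbook three-qubit bit-flip code, tracking only where flips occur rather than simulating quantum amplitudes; Google's experiment uses the considerably more elaborate surface code, so this is a sketch of the principle, not of their scheme:

```python
def syndrome(errors):
    """Parity checks on qubit pairs (1,2) and (2,3): these reveal where
    a flip happened without reading out any single qubit's value,
    so the encoded quantum information is left undisturbed."""
    return (errors[0] ^ errors[1], errors[1] ^ errors[2])

# Each syndrome pattern points to the single-qubit flip that explains it.
DECODE = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(errors):
    """Apply the flip that the syndrome implicates, undoing the error."""
    location = DECODE[syndrome(errors)]
    if location is not None:
        errors[location] ^= 1
    return errors

# Any single bit-flip on the three physical qubits is found and undone:
for q in range(3):
    errs = [0, 0, 0]
    errs[q] = 1
    assert correct(errs) == [0, 0, 0]
```

The key design point is that the two parity checks distinguish all three single-flip cases (plus the no-error case) while never asking "is this qubit 0 or 1?" — which is how the scheme sidesteps the no-copying restriction described above.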

But adding more physical qubits also increases the chances that two of them will be affected by an error simultaneously. To address this issue, the Google researchers performed two versions of a quantum error-correction procedure. One, using 17 qubits, was able to recover from one error at a time. A larger version used 49 qubits and could recover from two simultaneous errors, achieving slightly better performance than the smaller version. “The improvement currently is very small, and it is no guarantee yet that using even larger codes will give even better performance,” says Terhal.
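Why enlarging the code can win despite more qubits being exposed to noise can be seen in a quick Monte Carlo estimate. The sketch below uses a distance-d repetition code as a stand-in (Google's surface code behaves analogously but is far more involved): when the physical error rate is below the code's threshold, the bigger code fails less often, because it takes more simultaneous flips to fool the majority vote:

```python
import random

def logical_error_rate(d, p, trials=20000):
    """Monte Carlo estimate for a distance-d repetition code:
    a logical error occurs when a majority of the d copies
    flip independently, each with probability p."""
    fails = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(d))
        fails += flips > d // 2
    return fails / trials

random.seed(0)
p = 0.05                          # physical error rate, well below threshold
r3 = logical_error_rate(3, p)     # roughly ~3 * p**2
r5 = logical_error_rate(5, p)     # roughly ~10 * p**3, i.e. much smaller
assert r5 < r3                    # the bigger code suppresses errors more
```

The same estimate also shows the flip side Terhal warns about: if p sits above the threshold, the inequality reverses and bigger codes perform *worse* — which is why demonstrating improvement with a larger code is the hard-won milestone here.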

Joe Fitzsimons, a physicist at Horizon Quantum in Singapore, says that various laboratories have made big steps towards effective error correction, and that Google’s latest result has many of the required features. But qubits also need to store information for sufficient time for the computer to carry out calculations, and Google’s team has yet to achieve that feat. “For a convincing demonstration of scalable error correction, we would want to see improvement in lifetimes”, as the system scales up, says Fitzsimons.

Google has set a quantum-computing roadmap for itself with six key milestones. Quantum advantage was the first, and the latest result was the second. Milestone six is a machine made of one million physical qubits, encoding 1,000 logical qubits. “At that stage, we can confidently promise commercial value,” says Neven.

Superconducting qubits are only one of several approaches to building a quantum computer, and Google still thinks this approach has the best chance of succeeding, says Neven. “We would pivot in a heartbeat if it becomes very clear that another approach will get us to a useful quantum computer quicker.”

This article is reproduced with permission and was first published on February 22, 2023.
