The quest to build a practical quantum computer, a challenge that has tantalized physicists and computer scientists since the 1980s, is gaining unprecedented momentum. Recent breakthroughs have prompted technology firms to unveil blueprints that could see quantum machines grow from experimental constructs into viable, large-scale systems capable of solving problems beyond the reach of today's supercomputers.

Quantum computing, long seen as a distant goal, is moving closer to reality as leading tech firms outline fresh plans for building scalable machines. In June, IBM released an updated roadmap that it says resolves several of the field's most challenging technical obstacles, setting the stage for what could be the industry's most decisive period.

"It doesn't feel like a dream anymore. I really do feel like we've cracked the code and we'll be able to build this machine by the end of the decade," Jay Gambetta, IBM's leader for quantum initiatives, told the Financial Times.

Despite recent progress, the road ahead remains lined with formidable hurdles. Even once the fundamental physics problems are solved, manufacturers will still face daunting engineering challenges. Oskar Painter, who leads quantum hardware development at Amazon Web Services, cautioned that building a practical quantum computer will require a massive engineering effort and could take another 15 to 30 years.

Current quantum prototypes typically use fewer than 200 qubits – the quantum equivalent of bits in conventional computers – but achieving machines with real industrial utility will require systems with a million or more. The challenge is immense, largely because qubits remain in their quantum states for only fractions of a second. As engineers pack in more qubits, interference, or "noise," multiplies, making reliable computation exponentially harder.
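To see why, consider a toy model in which every operation fails independently with some small probability: the chance of a fully error-free run then shrinks exponentially with circuit size. The sketch below is purely illustrative – the 0.1 percent error rate is an assumption for the example, not a figure from any vendor.

```python
# Toy model: if each operation fails independently with probability p,
# an n-operation computation finishes error-free with probability (1-p)**n.
# p = 0.001 is an illustrative assumption, not a vendor specification.

def success_probability(n_operations: int, error_rate: float) -> float:
    """Probability that all n_operations succeed, assuming independent errors."""
    return (1.0 - error_rate) ** n_operations

for n in (100, 1_000, 10_000, 100_000):
    p_ok = success_probability(n, error_rate=0.001)
    print(f"{n:>7,} operations -> {p_ok:.2e} chance of an error-free run")
```

Even at that optimistic per-operation error rate, a 100,000-operation run succeeds essentially never, which is why error correction, rather than raw qubit counts alone, dominates the engineering agenda.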

This challenge became clear with IBM's Condor chip, which packs 1,121 qubits. At that scale, the chip suffered from "crosstalk" – interference between components that undermined performance. "Stacking larger numbers of qubits together like this creates a bizarre effect we can't control anymore," said Rigetti Computing CEO Subodh Kulkarni. "That's a nasty physics problem to solve." IBM responded by redesigning its couplers – the links connecting qubits – to make the system less vulnerable to interference.

Earlier test systems also relied on laborious individual "tuning" of qubits, an approach that cannot scale to larger devices. Manufacturers are now working on qubits that are more reliable and cheaper to produce, which demands ongoing advances in materials science and fabrication. Google, for instance, aims to cut component costs tenfold, setting a $1 billion target for a full-scale quantum computer.

A cornerstone of scalability is quantum error correction, a method that lets systems compute reliably with imperfect qubits. By spreading each piece of quantum information across multiple physical qubits, error correction introduces redundancy that guards against individual failures. Julian Kelly, head of hardware at Google Quantum AI, cautioned that scaling systems too early could waste resources, produce noisy outputs, and consume significant engineering effort without delivering practical results.
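The redundancy idea has a simple classical analogue. The sketch below uses a three-bit repetition code with majority voting – only an analogy, since quantum states cannot simply be copied (the no-cloning theorem), and real quantum codes spread information across entangled qubits instead.

```python
import random

# Classical three-bit repetition code: a loose analogy for the redundancy
# behind quantum error correction, not how quantum codes actually work.

def encode(bit: int) -> list[int]:
    return [bit] * 3                       # store the same bit three times

def noisy_channel(bits: list[int], flip_prob: float) -> list[int]:
    return [b ^ (random.random() < flip_prob) for b in bits]  # random flips

def decode(bits: list[int]) -> int:
    return int(sum(bits) >= 2)             # majority vote

random.seed(0)
trials, flip_prob = 100_000, 0.05
failures = sum(decode(noisy_channel(encode(0), flip_prob)) != 0
               for _ in range(trials))
print(f"raw error rate: {flip_prob:.3f}, encoded: {failures / trials:.4f}")
```

For independent flips the encoded error rate falls to roughly 3p² – about 0.007 here versus 0.05 raw – the kind of suppression quantum codes pursue with far more elaborate machinery.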

Google has demonstrated a quantum chip whose error correction improves as the system grows. It uses a technique called the surface code, in which each qubit connects to its neighbors in a two-dimensional grid. Achieving meaningful computation with this setup would require more than a million qubits. IBM, by contrast, is pursuing low-density parity-check codes that could cut qubit requirements by roughly 90 percent but depend on intricate long-distance connections. Kelly warned that these connections add new layers of complexity to systems that are already extremely difficult to control. Nevertheless, IBM reported a recent breakthrough in developing such connectors.

The most notable advances so far have come from qubits built with superconducting circuits, the approach used in IBM's and Google's machines. These systems must operate near absolute zero and are notoriously hard to control. Other approaches use trapped ions, neutral atoms, or photons as qubits. While these offer greater inherent stability, scaling up and integrating large numbers of them remains a formidable practical challenge.
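For a sense of the overheads separating the two error-correction approaches above, a back-of-envelope sketch helps. It uses the textbook surface-code estimate of roughly 2d² − 1 physical qubits per logical qubit at code distance d; the 1,000 logical qubits and distance of 25 are illustrative assumptions, not any company's published targets.

```python
# Back-of-envelope qubit overheads. The 2*d**2 - 1 formula is the standard
# surface-code estimate; the workload and distance below are assumptions.

def surface_code_physical(logical_qubits: int, distance: int) -> int:
    """Physical qubits needed at roughly 2*d**2 - 1 per logical qubit."""
    return logical_qubits * (2 * distance**2 - 1)

logical, distance = 1_000, 25
surface = surface_code_physical(logical, distance)
ldpc = surface // 10   # LDPC codes are said to cut qubit needs by ~90%
print(f"surface code:      {surface:,} physical qubits")
print(f"LDPC (~90% fewer): {ldpc:,} physical qubits")
```

At those assumptions the surface code lands just above a million physical qubits – consistent with the scale quoted above – while the LDPC route would need closer to 125,000.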

"The costs and technical challenges of trying to scale will probably show which are more practical," said Sebastian Weidt, chief executive at Universal Quantum, a startup developing trapped ions.

Weidt emphasized that government support in the coming years could play a decisive role in determining which quantum technologies prove viable, and that the field is likely to narrow to a handful of companies capable of bringing a system to full scale.

Quantum computing is attracting growing attention from investors and government agencies alike. Last year, the Pentagon's advanced research arm, DARPA, launched a broad review to identify the quantum technologies most likely to achieve rapid industrial adoption, selecting a group of leading companies to test which approaches could deliver practical, scalable systems.

Firms such as Amazon and Microsoft are exploring exotic states of matter in pursuit of more reliable qubits. These next-generation technologies are still in their early stages, though proponents argue they could eventually surpass today's quantum machines. For now, industry leaders continue refining and scaling the established architectures developed over years of laboratory research.

"Just because it's hard, doesn't mean it can't be done," Horvath said about the industry now on the cusp of a technological revolution.

Source: techspot.com
