Every few years, a technology arrives that forces a genuine rethink of what computing can do. Quantum computing is widely described as one of those technologies. It’s also one of the most consistently misunderstood — oscillating between breathless hype and dismissive scepticism, sometimes in the same week.
So let’s try to be precise about what it actually is, where it actually stands at the start of this decade, and why it’s worth paying attention to even if practical applications are still some distance away.
Classical Bits vs Quantum Qubits
Every classical computer — the laptop you’re reading this on, the servers running your enterprise applications — processes information as bits. A bit is binary: it’s either 0 or it’s 1. Everything a computer does, from rendering a webpage to running a machine learning model, ultimately reduces to sequences of those two states.
A quantum computer works differently. Its basic unit of information is the qubit — and a qubit can exist in a state of superposition, meaning it can be 0, 1, or a weighted combination of both at once, resolving to a definite 0 or 1 only when it is measured. This isn’t a metaphor or an approximation. It’s a property of quantum mechanics — the physics that governs the behaviour of particles at atomic and subatomic scales.
The second relevant property is entanglement: two qubits can become correlated such that measuring one immediately tells you the outcome of measuring the other, regardless of the physical distance between them. Einstein famously called this “spooky action at a distance.” It’s real, and it’s one of the properties that gives quantum computers their potential power.
Together, superposition and entanglement mean that a quantum computer with 53 qubits works across a state space of 2^53 (roughly nine quadrillion) possible amplitudes, while a classical register of 53 bits holds exactly one 53-bit value at a time. The computational advantage, for certain classes of problems, is not incremental — it’s exponential.
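To make the scale concrete, here is a back-of-the-envelope sketch in plain Python (illustrative only) of what it costs a classical machine just to store the full state of an n-qubit register, assuming the standard 16 bytes per complex amplitude:

```python
# Memory needed to hold the complete state vector of an n-qubit register
# on a classical machine, at 16 bytes per complex amplitude.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16  # 2^n amplitudes, 16 bytes each

for n in (30, 53):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB of amplitudes")
```

Thirty qubits already demand 16 GiB; 53 qubits demand over a hundred million GiB, which is why Sycamore-class devices sit at the edge of what classical simulation can follow.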
What Quantum Computers Are Actually Good At
The key phrase is “certain classes of problems.” Quantum computers are not universally faster than classical ones. They are specifically better at problems that involve exploring a very large number of possible states simultaneously — where classical computers have to check possibilities sequentially and the search space is enormous.
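A textbook illustration is unstructured search. Grover’s algorithm, one of the foundational quantum algorithms, finds a marked item among N possibilities in roughly (π/4)·√N oracle queries, where a classical scan needs about N/2 on average. The sketch below simply tabulates those standard query-count estimates:

```python
import math

# Standard query-count estimates for unstructured search over N items:
# a classical scan needs ~N/2 lookups on average, while Grover's
# algorithm needs roughly (pi/4) * sqrt(N) oracle calls.
for exp in (6, 9, 12):
    n = 10 ** exp
    classical = n // 2
    grover = math.ceil((math.pi / 4) * math.sqrt(n))
    print(f"N = 10^{exp}: classical ~{classical:,}, Grover ~{grover:,}")
```

A quadratic speed-up is more modest than the exponential one available for factoring, but at a trillion candidates it is the difference between half a trillion lookups and under a million.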
The most significant near-term applications fall into three broad areas.
Cryptography and security. Most modern public-key encryption relies on the computational difficulty of factoring very large numbers — a task that would, at today’s key sizes, take a classical computer longer than the age of the universe. A sufficiently powerful quantum computer running Shor’s algorithm could perform this factorisation dramatically faster. This has already triggered serious work on post-quantum cryptography standards, and it is one of the most concrete near-term reasons for organisations handling sensitive data to pay attention.
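For readers curious why period-finding breaks factoring, here is a toy sketch of the classical half of Shor’s algorithm. The quantum processor’s only job is to find the period r of a^x mod N efficiently; everything after that is ordinary arithmetic. The values N = 15 and a = 7 are deliberately tiny, chosen so the period can be brute-forced:

```python
import math

# Toy illustration of the classical half of Shor's algorithm. The quantum
# speed-up comes entirely from finding the period r of a^x mod N; here we
# brute-force r, which is only feasible for a tiny N like this one.
N, a = 15, 7  # toy example: N = 3 * 5, with a coprime to N

r = 1
while pow(a, r, N) != 1:  # find the order (period) of a modulo N
    r += 1

if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    p = math.gcd(pow(a, r // 2) - 1, N)
    q = math.gcd(pow(a, r // 2) + 1, N)
    print(f"period r = {r}, factors: {p} x {q}")  # period r = 4, factors: 3 x 5
```

For a 2,048-bit N the brute-force loop becomes hopeless for any classical machine; Shor’s quantum period-finding is what makes it tractable.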
Simulation and drug discovery. Quantum computers can simulate quantum systems — molecules, materials, chemical reactions — with an accuracy that classical computers fundamentally cannot match. The implications for pharmaceutical research and materials science are substantial: designing new drugs or new materials by accurately modelling molecular behaviour at a level of fidelity no classical approximation can achieve.
Optimisation. Many real-world business problems — supply chain routing, financial portfolio optimisation, logistics scheduling — involve searching for the best solution across a combinatorially large space of possibilities. Quantum algorithms show early promise for specific formulations of these problems, though practical advantage here is still being demonstrated.
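As a feel for the scale, consider a deliberately tiny, invented portfolio-selection problem: choose the subset of assets that maximises return while staying inside a risk budget. Brute force must touch every subset, and every additional asset doubles that count:

```python
from itertools import product

# Hypothetical toy optimisation with invented numbers: pick the subset of
# assets that maximises return subject to a risk budget. Brute force
# enumerates all 2^n subsets -- fine for n = 4, hopeless for n = 100.
returns = [0.12, 0.07, 0.15, 0.05]
risks = [0.30, 0.10, 0.40, 0.05]
risk_budget = 0.50

best = max(
    (bits for bits in product((0, 1), repeat=len(returns))
     if sum(b * r for b, r in zip(bits, risks)) <= risk_budget),
    key=lambda bits: sum(b * r for b, r in zip(bits, returns)),
)
print(f"best subset: {best}")  # 16 candidates here; 2^100 for 100 assets
```

Quantum optimisation research asks whether hardware can do meaningfully better than this kind of enumeration for structured instances; as noted above, that advantage is still being demonstrated.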
Where the Field Stands — Right Now
In October 2019, Google published a landmark paper in Nature claiming that its Sycamore processor — a 53-qubit superconducting quantum chip — had performed a specific calculation in 200 seconds that it estimated would take a state-of-the-art classical supercomputer roughly 10,000 years to complete. Google called this “quantum supremacy.”
The claim was significant and contested in roughly equal measure. IBM researchers countered that, with improved classical algorithms and enough disk storage, the same task could in principle be completed in a matter of days rather than millennia, questioning both the benchmark and the framing. The scientific community broadly acknowledged the milestone as meaningful while debating whether “supremacy” was the right word for it.
What the demonstration did confirm — regardless of the naming debate — is that quantum hardware is progressing on a real trajectory. We are now firmly in what researchers call the NISQ era — Noisy Intermediate-Scale Quantum — a period where quantum processors exist and can run meaningful experiments, but error rates remain high and qubit counts are still well short of what large-scale practical applications require.
IBM recently published a development roadmap with an ambitious long-term target of reaching one million qubits by 2030. That’s an aspirational goal on a long horizon — but the fact that major organisations are now committing to public roadmaps signals that this is no longer purely a research conversation.
The Companies Building It
The quantum computing landscape involves a mix of technology giants, specialist startups, and national research programmes — each pursuing different hardware approaches.
Google is pursuing superconducting qubit technology, building on the Sycamore work. Unlike IBM, it focuses primarily on internal milestones and research collaborations rather than broad public cloud access, so its announcements tend to arrive as peer-reviewed results rather than incremental product updates.
IBM has made quantum computing accessible through its cloud platform, IBM Quantum, which allows researchers, developers, and enterprises to run experiments and algorithms on real quantum hardware over the internet. IBM’s most advanced processor as we enter 2021 is the 65-qubit Hummingbird, released in September 2020. IBM’s public roadmap and open-source Qiskit development kit have made it the most accessible entry point for organisations beginning to explore quantum.
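As a flavour of how low the barrier to entry now is, here is a minimal Qiskit sketch (assuming Qiskit is installed, e.g. via pip install qiskit) that prepares a Bell state, the two-qubit circuit combining superposition and entanglement, and samples it on the bundled simulator:

```python
from qiskit import QuantumCircuit, Aer, execute

# Prepare a Bell state: the simplest circuit exhibiting both
# superposition and entanglement, then sample it 1,000 times.
qc = QuantumCircuit(2, 2)
qc.h(0)      # put qubit 0 into superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

result = execute(qc, Aer.get_backend("qasm_simulator"), shots=1000).result()
print(result.get_counts())  # ~half '00' and ~half '11'; never '01' or '10'
```

Swapping the simulator for one of IBM’s cloud-hosted backends runs the same circuit on real hardware, queue times permitting.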
Microsoft is pursuing a different hardware bet — topological qubits, which are theoretically more stable and error-resistant than superconducting alternatives. The approach is higher-risk but potentially higher-reward in the long term. Microsoft has also built Q#, a dedicated quantum programming language, to support software development for quantum systems.
IonQ uses trapped ion technology — individual atoms held in place by electromagnetic fields and used as qubits. The approach offers different stability and error characteristics than superconducting methods, and IonQ has been increasingly visible as one of the more credible specialist players in the space.
Honeywell has developed a trapped ion quantum computer drawing on its precision engineering heritage. The company has made notable claims about the quality — not just quantity — of its qubits, measured through quantum volume, a benchmark originally proposed by IBM that captures qubit count and error rates together.
Rigetti Computing focuses on superconducting qubits and offers cloud-based quantum access through its Forest platform, allowing developers to run hybrid quantum-classical algorithms without owning hardware.
Alibaba has been one of the leading quantum computing investors in China, with an active research programme under its DAMO Academy and a cloud platform for quantum experimentation — part of a broader national investment that makes China a significant player in the global race.
The Challenges That Remain
The potential is real. The current limitations are equally real — and understanding both is what separates informed attention from hype.
Decoherence is the fundamental engineering challenge. Qubits are extraordinarily sensitive to their environment. Any disturbance — a change in temperature, an electromagnetic fluctuation, vibration — can cause a qubit to lose its quantum state before the computation completes. Building systems that maintain coherence long enough to perform useful calculations at scale is the central unsolved problem.
Error rates remain high. Current quantum hardware makes mistakes frequently — the defining characteristic of the NISQ era. Quantum error correction exists in theory and in early implementations, but large-scale reliable computation is expected to demand thousands of physical qubits for every error-corrected logical qubit, a count and quality level no current machine approaches.
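A quick piece of arithmetic shows why error rates dominate the conversation. If each gate succeeds with probability (1 − error rate), the chance of a completely error-free run decays exponentially with the number of gates. The 0.5% per-gate figure below is an illustrative assumption, in the rough ballpark of current two-qubit gates:

```python
# If every gate succeeds with probability (1 - error_rate), the chance
# that an entire circuit runs cleanly decays exponentially with depth.
error_rate = 0.005  # illustrative 0.5% per-gate error

for gates in (10, 100, 1000):
    p_success = (1 - error_rate) ** gates
    print(f"{gates:>5} gates -> {p_success:.1%} chance of an error-free run")
```

At a thousand gates the clean-run probability is below 1%, which is why error correction, rather than raw qubit count alone, is the real gateway to large-scale applications.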
Software fragmentation is a real friction. Each hardware platform has its own programming model, making it difficult to write quantum algorithms that run portably across different systems. The classical computing world took decades to solve the equivalent problem. The quantum world is only beginning to work through it.
Why This Matters Now
The question worth sitting with is not “when will quantum computing be ready?” That framing produces either premature dismissal or premature excitement. The more useful question is: what should organisations with a long time horizon be doing right now?
The answers are relatively clear. Understanding which of your current computational problems are quantum-relevant is worth doing before quantum hardware matures, not after. Engaging with post-quantum cryptography — which is already being actively developed — is a concrete near-term action for any organisation handling sensitive data. And watching the hardware and software landscape for the moment when specific application classes become practical is the kind of ongoing signal-monitoring that separates organisations that are ready to move from the ones that are perpetually catching up.
Quantum computing is not arriving tomorrow. But the organisations paying attention today will be significantly better positioned when it does.
Which of your organisation’s current computational challenges — in optimisation, simulation, or security — would look fundamentally different with access to quantum-level processing power?
Let’s keep learning — together.