Google just built a computer that solved a problem in 5 minutes that would take our best supercomputers 10 septillion years … that’s not a typo. That’s a 10 with 24 zeros after it. The universe is only 13.8 billion years old, so we’re talking about a computer that just did something… cosmically impossible. And Google isn’t alone. Microsoft and Amazon are racing to crack quantum computing too, each with wildly different approaches. After decades of being hyped as “the future,” quantum computing suddenly feels like it’s arriving … all at once. So what changed? And should you care about computers that need to be colder than outer space just to work?
What is Quantum Computing?
What’s quantum computing (or QC for short)? Well…to many, it’s definitely QC: Quite Complicated. So if you want an in-depth lesson that gets into all the bits and pieces… or should I say, all the qubits and pieces… you can check out my video from a little while ago. Here’s the condensed version…
Classical computers use binary bits to encode information: 0s and 1s. Quantum computers are different because they use quantum bits, or qubits.1 Operations on qubits can exploit quantum physics weirdness like superposition and entanglement, which basically allows them to process much, much more data in a lot less time.
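If you like seeing ideas in code, here’s a toy sketch in plain Python (not a real quantum library) of that core difference: a qubit is a pair of amplitudes rather than a single 0 or 1, and a gate like the Hadamard can put it into superposition.

```python
import math

# Toy model: a qubit is a pair of complex amplitudes for the states |0> and |1>.
# A classical bit would always be exactly [1, 0] ("0") or [0, 1] ("1").
qubit = [1 + 0j, 0 + 0j]  # start in the definite state |0>

def hadamard(q):
    """Apply a Hadamard gate, which creates an equal superposition."""
    s = 1 / math.sqrt(2)
    return [s * (q[0] + q[1]), s * (q[0] - q[1])]

qubit = hadamard(qubit)

# On measurement, each outcome's probability is the squared magnitude
# of its amplitude: here, a 50/50 coin flip between 0 and 1.
probs = [abs(a) ** 2 for a in qubit]
print(probs)  # ~[0.5, 0.5]
```

Real quantum advantage comes from entangling many such qubits, which this two-line state can’t show, but the amplitude bookkeeping is the same idea.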
This is extremely relevant to performing complex tasks, ones that would take even supercomputers an astronomically long time to do.23 By making these problems solvable within minutes rather than years, QC has the potential to transform a ton of fields… if development can make it practical.4 QC could be used for everything from discovering superconducting materials to finally overcoming the limits on solar cell efficiency.5
But here’s the catch: quantum computers are incredibly fragile. They’re so sensitive, they make your moody teenager look emotionally stable. The qubits themselves have a tendency to collapse or flip states from stray magnetic fields, temperature fluctuations, even tiny vibrations.67 It’s like writing a marriage proposal in beach sand: you need your message to last long enough to be read, but it’s constantly under threat.
This tendency to collapse is called “decoherence.” With all these errors, quantum computers are really going through a phase. Current fault rates typically run 1% to 0.1% depending on the type of qubit and operation, meaning one out of every 100 to 1000 operations fails.8 That’s far from the one-in-10-billion error rates needed for practical applications.9
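To see why that gap matters, here’s a back-of-the-envelope calculation. Assuming each operation fails independently (a simplification, but good enough for the scale of the problem), a circuit of n operations runs cleanly with probability (1 − p)^n:

```python
# Back-of-the-envelope: if each operation independently fails with
# probability p, a circuit of n operations runs cleanly with probability
# (1 - p)^n. A simplifying assumption, but it shows the scale of the gap.
def success_probability(p, n_ops):
    return (1 - p) ** n_ops

# At today's ~0.1% error rates, even a modest 10,000-operation circuit
# almost certainly hits an error somewhere:
print(success_probability(0.001, 10_000))   # ~0.000045

# At the one-in-10-billion target, the same circuit is essentially safe:
print(success_probability(1e-10, 10_000))   # ~0.999999
```

Useful quantum algorithms need far more than 10,000 operations, which is why error correction, not raw qubit counts, is the real bottleneck.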
But we’ve been chipping away at quantum computing for decades. So, what’s happening now, and is it worth the hype?
What Just Happened?
Here’s where things get interesting. The last few months have seen major quantum computing announcements from tech giants. And if you weren’t paying attention, you’d be forgiven for wondering why Big Tech is suddenly very interested in potatoes…because everyone’s talking about chips. But these aren’t the kind you can dip in salsa.
In December 2024, Google introduced Willow. No, not my dog, but its latest quantum computing chip that pulled off a computation literally cosmic in scale.3
Then Microsoft revealed Majorana 1 in February.10 Named for the Majorana particles that are key to its design (which sound like something you’d need a prescription for), the chip uses an especially resilient type of qubit called a topological qubit. Microsoft claims it will solve “meaningful, industrial-scale problems in years, not decades.”1011
The very next week, Amazon debuted its new chip, Ocelot. Quantum computers may be a completely different kind of metal gear, but sharp shooting is just as important. It’s fitting, then, that Ocelot’s main strength is accuracy. Amazon claims that Ocelot’s feline form of architecture — and I do mean that literally — cuts the costs of quantum error correction “by up to 90%.”12
What’s fascinating is that each company is taking a radically different approach to the same fundamental problem: quantum error correction. Google is going big with more qubits, Microsoft is betting on exotic particles that may not even exist yet, and Amazon is getting clever with cats. Well, quantum cats.
Why Now?
So why are we seeing so many advancements now? Recent years have seen massive investment from tech giants, startups, and pharmaceutical companies. Meanwhile, advances in materials science and quantum algorithms have matured alongside improved computing power, creating the perfect storm for breakthroughs.1314
But what’s really interesting about this cornucopia of chips is that each company emphasizes the same breakthrough: quantum error correction. When it comes to errors, these computers really need to get their act together… or should I say, get their qubit together?
The problem isn’t getting an output from quantum computers — it’s getting the right one and being able to interpret it.5 That “being able to interpret it” part is key, because usually, bigger quantum systems create more errors than they solve. Until now.
The Three Approaches
Each of these companies chose a fundamentally different path to tackle quantum error correction, and understanding their strategies reveals just how challenging this problem really is.
Microsoft’s Majorana 1: The Quantum Safe
Microsoft is fascinated with Majorana particles. Think of them as quantum safes — incredibly good at protecting information from errors, but equally hard to crack open when you need that information back.10 It’s like having the world’s best security system that won’t let even you into your own house.
But there’s a big problem: Majorana particles might not actually exist. Or at least, Microsoft hasn’t convincingly proven they’ve created them. Leading theoretical physicist John Preskill stated that Microsoft had yet to release any performance data to back up its claims.15 Professor Jonathan Oppenheim of University College London told Fortune:
“There is a massive disconnect between the scientific article, and their public claims, but the most obvious one is that they haven’t shown that they have a topological qubit. The editors even took the rare step of highlighting this.”15
This isn’t Microsoft’s first rodeo with questionable Majorana claims. Back in 2018, Microsoft said it found strong evidence of these particles, only to retract that statement later citing “insufficient scientific rigor.”16
Still, if Microsoft’s approach works, it could be revolutionary. These topological qubits can be controlled through simple voltage pulses — basically just flipping a light switch.17 Microsoft envisions stringing together millions of qubits to tackle the most complex problems.1017
Amazon’s Ocelot: The Schrödinger Strategy
Of course, there’s more than one way to skin a cat, or count a qubit. Amazon’s Ocelot takes a different approach. Instead of novel architecture or brute-forcing qubit numbers, Amazon’s strategy is efficiency.12
Ocelot uses four transmons for stability, plus five cat qubits as its primary data qubits. Transmons are a common qubit type in superconducting quantum computing; they’re not the most powerful, but they’re notably noise-resistant.18 Cat qubits are cheekily named after Schrödinger’s famous thought experiment. Just like that theoretical cat that’s both dead and alive until you check, these qubits exist in two extremely different states at once. And because those states are so different… like hot and cold, or Taylor Swift and death metal… it’s really hard for errors to flip between them.1920
This design means that Ocelot doesn’t waste energy constantly checking for decoherence. The strategy requires only a tenth as many qubits per bit of information, making the chip both error resistant and energy efficient.1221
Amazon’s head of quantum hardware, Oskar Painter, estimates that a:
“Fully-fledged quantum computer capable of transformative societal impact would require as little as one-tenth of the resources associated with standard quantum error correcting approaches.”12
The hope is that this efficiency will make Ocelot easy to operate and scale to commercial production.19 However, like Microsoft’s claims, this remains mostly theoretical. Ocelot is just a proof-of-concept prototype with comparatively meager computing capability. Amazon admits that the next goal is to add more qubits, encode more information, and perform “actual computations.”19
Google’s Willow: The Brute Force Approach
Then there’s Google. Google’s trying to scale up its quantum computer… guess they’re really trying to tip the scales in their favor. Simply put, Google’s strategy is the more qubits, the better. Willow packs 105 superconducting qubits, a lot compared to everyone else.3 Well…except IBM.522
In quantum error correction, you stitch many physical qubits into a lattice called a surface code. The lattice as a whole acts as a single, more reliable logical qubit. Bigger lattices can tolerate more errors, so Google is going big. But here’s the catch: each added qubit is also another potential failure point. Usually, bigger lattices create more errors than they solve.23
But here’s where things get really interesting. Willow is the first quantum processor to demonstrate an exponential reduction in error rates as qubits increase. The bigger the lattice, the more error-resistant it becomes.24
This “below threshold” performance has eluded researchers since error correction was introduced in 1995. The breakthrough allowed Willow to perform that benchmark computation in under five minutes that would take today’s fastest supercomputers 10 septillion years.3 For context, the universe is only 13.8 billion years old.25 So calling this a radical improvement would be a bit of an understatement.
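Here’s a rough sketch of what “below threshold” means in practice. The numbers are made up for illustration, but the shape is the point: each time the lattice’s code distance grows by 2, the logical error rate gets divided by a suppression factor (Google reported a factor of roughly two for Willow).

```python
# Illustrative model of "below threshold" scaling: each time the
# surface-code distance d grows by 2, the logical error rate is divided
# by a suppression factor lam. lam > 1 means bigger lattices help;
# lam < 1 means they make things worse. Values below are hypothetical.
def logical_error_rate(eps_base, lam, d, d_base=3):
    return eps_base / lam ** ((d - d_base) / 2)

eps_base = 3e-3  # hypothetical error rate at the smallest lattice (d = 3)
lam = 2.0        # hypothetical suppression factor

for d in (3, 5, 7, 9):
    print(f"distance {d}: logical error rate {logical_error_rate(eps_base, lam, d):.2e}")
```

With lam above 1, errors shrink exponentially as the lattice grows; with lam below 1 (the pre-Willow reality), the same growth makes things exponentially worse.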
However, we’re still in the realm of theory and proof-of-concepts. These tests mostly showcase Willow’s quantum advantage over classical computers in very specific, artificial benchmarks. They’re pretty far removed from the kind of tasks you’d actually want a quantum computer doing commercially.26 And Willow, like many quantum chips, needs to be kept super cool — colder than what the kids set the thermostat to when you’re not looking.27
Other Developments
And there are more chips on the horizon. Physicists at the University of Oxford have set a new global benchmark for the accuracy of controlling a single quantum bit, achieving the lowest-ever error rate for a quantum logic operation: just 0.000015%, or one error in 6.7 million operations.28 That means you’re more likely to get struck by lightning than Oxford’s logic gate is to goof up. And better yet, the experiment was conducted at room temperature and without magnetic shielding, meaning that Oxford’s easy-going chip probably doesn’t need to invest in all the protection that quantum computers usually need.28
This is an important milestone for single qubit logic gates. However, quantum computing requires both single and two-qubit gates to work together. Oxford cautioned that while their work is a big step forward for single-qubit gates, even two-qubit gates still have very high error rates, about 1 in 2,000. That’s a much bigger hurdle to jump, so even though this is clearly the next step for their research, it won’t be an easy task.28
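A quick bit of arithmetic on the numbers from Oxford’s announcement shows just how lopsided that hurdle is:

```python
# Error rates from Oxford's announcement: single-qubit gates fail about
# once per 6.7 million operations; two-qubit gates about once per 2,000.
single_qubit_error = 1 / 6_700_000
two_qubit_error = 1 / 2_000

gap = two_qubit_error / single_qubit_error
print(f"two-qubit gates are ~{gap:,.0f}x more error-prone")  # ~3,350x
```

Since useful algorithms lean heavily on two-qubit gates (that’s where entanglement happens), closing that three-orders-of-magnitude gap is the real prize.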
Just days after Oxford’s announcement, the University of Vienna revealed an advancement in machine learning, aided, of course, by quantum computing. This was achieved thanks to a photonic quantum processor. And while that sounds like something out of Tony Stark’s armory, it’s just a processor that uses photons as qubits for quantum computation. Unlike traditional quantum processors based on superconducting circuits or trapped ions, photons have low decoherence and work just fine at room temperature.29
The really exciting thing here is that Vienna’s experiment demonstrates that even these small-sized quantum processors can still do some impressive things and, in the right circumstances, outperform current algorithms.30 And as a nice bonus, photonic platforms consume less energy than standard computers. With AI eating a frankly disturbing amount of energy, and its footprint only set to get larger, using less energy to run these kinds of algorithms is a good thing.31
The Reality Check
Now, before you start planning your quantum-powered gaming rig… we need to talk about reality. There are serious challenges to overcome before quantum computing can hit usable scales, let alone the commercial space.
The biggest question mark is around the software side of quantum computers. While the hardware side of QC is advancing quickly, the software needed to decode and decipher the results appears to be lagging behind. Error correction algorithms… or as I call it, autocorrect for atoms… still need work.
At the end of the day, we have to remember that quantum computers are very specialized machines.5 They’re not meant to replace classical computers, so don’t expect one to dethrone another any time soon … or ever. Instead, QC will open up new possibilities where today’s computers fall short.
The billion-dollar question: when will it actually hit the market?
To answer that, let’s turn to NASA’s handy-dandy technological readiness level (TRL) scale.32 QC implementations vary widely in their readiness, but most are still in the lab with proof-of-concepts on those lab benches. By definition, that puts most approaches at 3 or 4 on NASA’s TRL scale, though some companies claim higher levels for specific implementations.33
However, I think the speed and diversity of the breakthroughs bode well for the future, and I’m not alone. According to a 2023 report by consulting firm Arthur D. Little, 75% of the 500 QC experts polled said that they expect quantum computing to reach TRL 8 or 9 by 2032. Keep in mind, though, that these levels aren’t guaranteed indicators of “widespread commercial availability or significant market penetration.”3334 I guess we’ll have to let the chips fall where they may.
Here’s my take: We don’t know what we don’t know. Sometimes figuring out what’s possible requires going beyond theory and into practical application, and having the hardware ready allows for that. The diversity of approaches — Google’s brute force, Microsoft’s exotic particles, and Amazon’s efficiency focus — suggests that at least one of these paths might lead to practical quantum computing.
That said, the timeline remains uncertain. Whether we see meaningful quantum applications in 5 years or 15 years depends not just on solving the hardware challenges, but on developing the software ecosystem, training quantum programmers, and finding the killer applications that justify the enormous complexity and cost.
- IBM – What is quantum computing? ↩︎
- USC – Quantum computer outperforms supercomputers in approximate optimization tasks ↩︎
- Google – Meet Willow, our state-of-the-art quantum chip ↩︎
- Wikipedia – Quantum computing ↩︎
- YouTube – Why I Left Quantum Computing Research ↩︎
- Wikipedia – Quantum error correction ↩︎
- Microsoft Azure – What is a qubit? ↩︎
- Microsoft Quantum – Quantum error correction ↩︎
- IEEE Spectrum – More Is Better in Error-Resilient Quantum Computer ↩︎
- Microsoft – Microsoft’s Majorana 1 chip carves new path for quantum computing ↩︎
- DataCentre Magazine – Majorana 1: Microsoft Set to Accelerate Quantum Computing ↩︎
- Amazon – Amazon Web Services announces a new quantum computing chip ↩︎
- Time Magazine – The Quantum Era has Already Begun ↩︎
- Forbes – Quantum Computing Has Arrived; We Need To Prepare For Its Impact ↩︎
- Fortune – Microsoft’s quantum computing breakthrough questioned by experts ↩︎
- Nature – Retraction Note: Quantized Majorana conductance ↩︎
- Interesting Engineering – How Microsoft is rewriting the rules of reality with quantum computing ↩︎
- Wikipedia – Transmon ↩︎
- MIT Technology Review – Amazon’s first quantum computing chip makes its debut ↩︎
- Nature – Hardware-efficient quantum error correction via concatenated bosonic qubits ↩︎
- Wall Street Journal – Amazon Unveils Its First Quantum Computing Chip ↩︎
- Scientific American – IBM Releases First-Ever 1,000-Qubit Quantum Chip ↩︎
- Google Research – Making quantum error correction work ↩︎
- Arizona State University – Preparing for the Quantum Future: Google’s New Quantum Chip and What Executives Need to Know ↩︎
- Wikipedia – Age of the universe ↩︎
- CNBC – Google claims quantum computing milestone — but the tech can’t solve real-world problems yet ↩︎
- Sogeti Labs – Beyond The Hype: Why Google’s Willow Alone Does Not Bring You Closer to Practical Applications ↩︎
- Science Daily – Sharper than lightning: Oxford’s one-in-6.7-million quantum breakthrough ↩︎
- Science Daily – Photonic quantum chips are making AI smarter and greener ↩︎
- University of Vienna – Quantum computers boost machine learning algorithms ↩︎
- MIT Technology Review – We did the math on AI’s energy footprint. Here’s the story you haven’t heard ↩︎
- NASA – Technological Readiness Levels ↩︎
- Arthur D. Little – Quantum Summer or Winter? ↩︎
- University of Bristol – The five platform problem: which quantum technology will win? ↩︎