The Qubits Are Here
Stage B of DARPA’s Quantum Benchmarking Initiative was awarded in November, but you were distracted.
I’m as taken with the AI revolution as anyone. I spend weekends building entire applications on platforms like Base44 that solve problems I’ve been staring at for years. The internal tech I’ve built for myself has quadrupled my capacity. For product managers who know enough to debug code, structure an assignment and build out real-life user stories, this is the golden age. I am living in it.
But what truly has my attention right now is quantum.
You’d be forgiven if you missed the headlines. There weren’t many. The most significant validation of commercial quantum viability in history landed in November 2025 to almost no fanfare. No breathless cable news segments. No viral threads. The algorithmic attention economy was busy arguing about chatbots.
Meanwhile, eleven companies cleared Stage B of DARPA’s Quantum Benchmarking Initiative, the most rigorous quantum evaluation program ever conducted by the United States government. The goal: determine whether a fault-tolerant, utility-scale quantum computer can be built by 2033.
Several of these companies believe they’ll get there well before that.
Some Background
A classical computer thinks in bits. Yes or no. On or off. It works through problems sequentially, asking itself a long string of binary questions to arrive at an answer.
A quantum computer does something fundamentally different. Instead of working through that string, it leverages superposition to take trillions of guesses simultaneously, exploring an enormous landscape of possibilities all at once, then uses interference to make the right answer the most probable one at readout. It doesn’t walk to the conclusion. It jumps.
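You can watch superposition happen in a few lines of code. The sketch below is a minimal illustration using the open-source Qiskit library (a convenience choice on my part; none of the DARPA companies’ stacks are implied). A Hadamard gate puts one qubit into an equal superposition, and a thousand measurements split roughly 50/50, because until you measure, the qubit holds both answers at once.

```python
# Minimal superposition demo with Qiskit (pip install qiskit qiskit-aer).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1, 1)
qc.h(0)           # Hadamard gate: equal superposition of |0> and |1>
qc.measure(0, 0)  # measurement collapses the superposition to a classical bit

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)  # roughly {'0': 500, '1': 500}
```

One qubit holds two states at once; n qubits hold 2^n. At 40 qubits that is already over a trillion simultaneous possibilities, which is where the trillions of guesses come from.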
The catch is that quantum mechanics operates in probabilities. Sometimes the answer isn’t the most probable one. Errors creep in, and for a long time they crept in more often than not. The field needed a mechanism for error correction, a way to verify and stabilize results across millions of quantum operations. For years, this looked like it might be the wall that kept quantum in the laboratory forever.
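The classic intuition for why error correction works is the repetition code: store one bit as three and let a majority vote outvote a single error. Quantum codes are subtler, because the no-cloning theorem forbids simply copying a qubit, so real schemes like the surface code measure error syndromes instead. But the arithmetic of why redundancy wins is the same, and a toy classical sketch shows it:

```python
# Toy repetition code: an analogy for error correction, not a quantum code.
import random

def transmit(bit, p_flip=0.10):
    """Send one bit through a channel that flips it with probability p_flip."""
    return bit ^ (random.random() < p_flip)

def send_protected(bit, p_flip=0.10):
    """Send three copies and decode by majority vote."""
    copies = [transmit(bit, p_flip) for _ in range(3)]
    return int(sum(copies) >= 2)

trials = 100_000
raw = sum(transmit(0) for _ in range(trials)) / trials
protected = sum(send_protected(0) for _ in range(trials)) / trials
print(f"raw error rate:       {raw:.3f}")        # ~0.100
print(f"protected error rate: {protected:.3f}")  # ~0.028, i.e. 3p^2 - 2p^3
```

A 10% error rate drops to about 2.8% with just three copies, and stacking more redundancy drives it lower still, provided the physical error rate starts below a threshold. Getting real hardware below that threshold is, roughly speaking, what the breakthroughs of the last two years demonstrated.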
It wasn’t. Error correction turned out to be a surmountable engineering problem, and in the last two years, multiple companies have made dramatic breakthroughs. That’s a large part of what DARPA’s evaluation validated. These aren’t theoretical architectures anymore. These are machines correcting their own errors at scale.
The other wall was physical. Quantum computing requires cryostats, cooling systems that bring processors to temperatures colder than deep space. Qubits are fragile things, and they need those conditions to maintain the quantum states that make computation possible. Cryostats get exponentially more expensive and complex the bigger you build them. More computing power means a bigger cryostat. It’s a scaling wall, and it’s the reason quantum has remained a laboratory curiosity for so long. You could build a quantum computer. You just couldn’t build one big enough to be useful.
One of the eleven companies that cleared DARPA’s evaluation is Photonic, headquartered in Vancouver, British Columbia. They looked at that wall and walked around it.
Instead of building a bigger cryostat, Photonic builds smaller ones and links them together over standard telecom fiber. The quantum operations between qubits are near-instantaneous, and the photonic connections bridging the cryostats travel at the speed of light, limited only by the fiber’s negligible signal loss. When they need more computing power, they add another cryostat to the network. No bigger machine. No exponential cost curve. More nodes on existing fiber. They’ve turned quantum computing into a networking problem. And networking problems, as the last thirty years of the internet have shown us, are problems we know how to solve.
Photonic’s approach could deliver commercial, fault-tolerant qubits as a service within eighteen months. Not selling a quantum computer to the handful of institutions that can afford one. Leasing quantum computing to businesses the way AWS leases cloud storage. They just closed CAD $180 million from Microsoft, RBC, and TELUS. Microsoft had already invested in a previous round and came back. That looks to me like smart money doubling down.
The implications are not speculative. They are engineering problems with timelines attached.
Drug discovery. The molecular simulations required to develop new pharmaceuticals take years on classical computers. Applications like AlphaFold have accelerated the process, but the calculations remain exponentially complex. A fault-tolerant quantum computer could collapse years of simulation into seconds. Not faster drug discovery. A fundamentally different kind, modeling molecular interactions at a fidelity that classical computing cannot achieve regardless of how much hardware you throw at it.
Cryptography. The encryption securing virtually every financial transaction, medical record, and government communication on earth is built on mathematical problems that classical computers cannot solve in any reasonable timeframe. Quantum computers can. Google warned in February that adversaries are already harvesting encrypted data today, storing it for the day a quantum machine can crack it open. The migration to post-quantum cryptography is not a response to a future threat. It is a response to a present one.
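The hard problem underneath RSA, still the workhorse of much of that encryption, is integer factoring. Classically, the cost of factoring explodes with key size; Shor’s algorithm, run on a fault-tolerant quantum computer, does it in polynomial time. Here is a toy sketch of the classical side, with two well-known small primes standing in for the 300-plus-digit primes real keys use:

```python
def trial_division(n):
    """Classical factoring: cost grows exponentially in the bit length of n."""
    if n % 2 == 0:
        return 2, n // 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return None  # n is prime

# Toy RSA-style modulus built from the 10,000th and 100,000th primes.
# Real keys use primes hundreds of digits long; trial division on those
# would outlast the universe. Shor's algorithm would not.
p, q = 104729, 1299709
print(trial_division(p * q))  # (104729, 1299709) -- feasible only at toy scale
```

The harvest-now, decrypt-later pattern Google flagged is exactly this asymmetry: ciphertext stolen today is a bet that factoring gets cheap tomorrow.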
Climate modeling. Quantum could run atmospheric simulations that finally keep pace with the climate itself, modeling millions of variables simultaneously rather than relying on the approximations current models require.
And beyond these, industries that don’t have names yet. Every previous expansion of computing capability, from mainframes to the internet to mobile to cloud to AI, created entirely new categories of human activity. Quantum will do the same. We just can’t see them yet, the same way nobody in 1993 could have described social media.
AI is getting all the attention, deservedly so. But on a longer timescale, quantum will be what this decade is remembered for.