
Written by Dr. Paul Terry, CEO of Photonic Inc.
CEO Paul Terry addresses some of the most pressing questions facing the quantum computing industry today, from what it will take to build truly scalable systems, to where we’re most likely to see the first commercially valuable applications. With decades of experience in large-scale computing and networking, Terry shares how Photonic’s architecture tackles key challenges like long-distance entanglement, fault-tolerant design, and integration with existing telecom infrastructure. He also outlines the technical milestones to watch for – indicators that quantum computing is approaching a genuine inflection point.
The Quantum Industry's Next Challenge
I believe scale is the biggest challenge facing both quantum computing and networking. One way to think of scale is that the marginal cost of adding one more unit should trend to zero. If it does not, the technology can get bigger but will eventually hit upper limits. As an example, the internet has scaled to its current size because the cost of adding a router became almost nothing.
Photonic has approached this by sidestepping the issues that challenge other platforms, using a unique qubit modality: the T centre, an optically linked silicon spin qubit with exceptional connectivity. It is the basis of our Entanglement First™ architecture, which can perform both compute and networking functions, and is inherently compatible with existing telecom networks.
Ultimately, most of the challenges in quantum computing boil down to how they affect our ability to build systems that can grow without limit. The demand for classical compute capacity has yet to peak, and there is no reason to believe quantum computing will be any different.

Image Credit: Photonic Inc.
Upcoming Applications for Quantum Computing
Once the quantum industry clears the scale hurdle, the applications of the technology are highly exciting. In the relatively near term, quantum computers will be able to tackle chemistry-based challenges in areas like materials science, drug discovery, and climate change. It will start with smaller-scale quantum systems generating AI training data, used to accelerate models built for the kinds of problems that classical systems simply cannot do.
The industry started with NISQ systems, and now the focus is shifting to fault tolerance, but still within small-scale systems. From the start, Photonic has been dedicated to building quantum computers that will create commercial value by providing solutions to real-world problems. These applications often require a minimum of 400 to 2,000 application-grade logical qubits, which is where we have put our focus: on building large-scale, networkable, fault-tolerant systems. Photonic’s systems engineering approach to building a quantum computing platform positions the company to meet the needs of a variety of end users, across applications. To that end, we have created an architecture that will deliver the performance and scale to support large-scale quantum operations, while also addressing manufacturability, maintainability, serviceability, reliability, and energy consumption – all costs that cannot be overlooked.
Why Photonic’s Quantum Hardware is Competitive
Most quantum companies are built on architectures that use proximity-based entanglement: entanglement generated by having qubits physically very close together – whether in a fixed topology or by physical movement of qubits. However, placing qubits too close together will also increase crosstalk, which imposes a practical limit on qubit density.
There is another way to create entanglement, and it is the one for which Alain Aspect, John F. Clauser, and Anton Zeilinger won the 2022 Nobel Prize in Physics: “experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.”
At Photonic, we use entangled photons to avoid proximity-based limitations. Entanglement is created between the qubits in two T centres without the need for them to be on the same chip (or even in the same room): each emits a telecom photon. When these photons interfere at a beam splitter and are measured, the entanglement is projected back onto the qubits that emitted them. This creates entanglement between qubits that are physically distant, enabling any-to-any qubit connections across chips, systems, and networks.
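To make the mechanism concrete, here is a minimal toy-model sketch in Python (NumPy) of this kind of photon-mediated entanglement swapping: each spin emits a photon it is entangled with, the two photons are projected onto a Bell state (standing in for idealised beam-splitter interference and detection), and the two distant spins are left entangled. It is a textbook illustration under idealised, lossless assumptions, not a description of Photonic’s actual hardware or protocol.

```python
import numpy as np

# Toy model of remote entanglement via photon interference (idealised, lossless).
# Qubit ordering in the joint state: (spin_A, photon_A, spin_B, photon_B).

# Each emitter produces a spin-photon Bell pair (|00> + |11>) / sqrt(2).
spin_photon_pair = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Joint state of the two independent pairs, reshaped to a rank-4 tensor.
state = np.kron(spin_photon_pair, spin_photon_pair).reshape(2, 2, 2, 2)

# A coincidence detection after the beam splitter heralds projection of the two
# photons onto the singlet |Psi-> = (|01> - |10>) / sqrt(2).
psi_minus = np.array([[0, 1], [-1, 0]]) / np.sqrt(2)

# Contract the photon indices (axes 1 and 3) with <Psi-| to obtain the
# unnormalised post-measurement state of the two spins.
spins = np.einsum('apbq,pq->ab', state, psi_minus.conj())

p_herald = float(np.sum(np.abs(spins) ** 2))  # heralding probability (1/4 in this ideal model)
spins = spins / np.sqrt(p_herald)             # normalised spin-spin state

print("heralding probability:", p_herald)
print("spin-spin state (the singlet, up to a global sign):")
print(spins)
```

In this idealised model the two spins end up maximally entangled even though they never interact directly; only the photons they emitted meet at the beam splitter.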
Additionally, Photonic leverages decades of silicon manufacturing experience to print large numbers of these T centres on silicon, connect them, and create entanglement across the system on demand. This enables a highly parallel, GPU-like quantum architecture.
Because of the architecture’s high connectivity, we can use quantum LDPC error-correction codes (including the new SHYPS family of codes announced last year) that bring down physical overheads, allowing for more computing with fewer physical qubits. Instead of needing hundreds to thousands of physical qubits per logical qubit, as surface codes do, our networked architecture with SHYPS error-correction codes requires only 10–30 physical qubits per logical qubit. These kinds of foundational decisions enable us to build an architecture that can reach commercial scale from the ground up.
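As a rough illustration of why that overhead matters at the 400 to 2,000 logical-qubit scale discussed above, the short sketch below simply multiplies the figures out. The surface-code range here is an assumed illustrative band for “hundreds to thousands”, and the 10–30 range is the figure quoted above; neither is a measured result for any specific machine.

```python
# Back-of-the-envelope physical-qubit counts for the logical-qubit targets
# mentioned in this article. The overhead ranges are illustrative assumptions.

logical_targets = [400, 2000]  # application-grade logical qubits

overheads = {
    "surface code (illustrative: hundreds to thousands)": (200, 1500),
    "qLDPC / SHYPS-style (10-30, as quoted above)": (10, 30),
}

for name, (low, high) in overheads.items():
    for n_logical in logical_targets:
        print(f"{name}: {n_logical} logical -> "
              f"{n_logical * low:,} to {n_logical * high:,} physical qubits")
```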
Integrating Quantum with Current Infrastructure
Quantum computers won’t exist only as stand-alone systems: they’ll need to be integrated into current classical computing and data centre infrastructure. Compatibility with existing infrastructure is therefore a key requirement for any qubit platform and architecture, hence the need for a qubit that emits photons at telecom wavelengths. Building a system that conforms to existing standards for data centres and telecom infrastructure increases the utility of that system by decreasing the barriers to widespread adoption.
In terms of implementation, Photonic already has a collaboration with TELUS, a Canadian telecommunications technology company, to test Photonic’s quantum networking on their advanced fibre-optic network.
What Next?
There will be many incremental, internal milestones along the way, but ultimately, the most important metric is how quickly systems capable of running useful applications can be deployed. Innovation in error correction remains a key focus to bring that day closer, alongside efforts to increase the number of qubits being deployed. Development also continues in computer science, advancing the algorithms and compilers needed to run a program as a service on a quantum computer.
One way to think about the quantum computing market is that most of the real applications people care about – chemistry, materials, catalysts, biotech, some AI training-data generation, and later finance and combinatorial optimisation – sit in the 400 to 2,000 logical-qubit range. Once that volume of application-grade logical qubits becomes available to enough end users for the problems they need to solve, we will start to see real traction. Another thing to consider is that not all users will want to own a quantum computer. Instead, they may only need access to it for a minute, an hour, or a day. As such, we’re in partnership with Microsoft to provide access for short-term use as a cloud service through their Azure Quantum platform.
Quantum computing has already made a lot of meaningful progress, but true commercial value will depend on achieving scalable, fault-tolerant systems that integrate seamlessly with today’s infrastructure. We believe Photonic’s distributed approach – rooted in telecom-wavelength qubits, photon-mediated entanglement, and efficient error-correction – is the most viable path to get there.
About Paul Terry

Dr. Paul Terry leads the team at Photonic. He is a seasoned entrepreneur, engineer, and angel investor specializing in disruptive technologies. Paul advises VCs and governments on economic, technical, health, and national defense strategies.
During his career, Paul has founded or been a founding employee at six successful Canadian companies with a cumulative valuation of more than $10B. Past roles include CEO of PHEMI, CTO Canada of Cray Supercomputing (OctigaBay), CTO of Abatis (Ericsson), and Director of Strategy at Newbridge Networks (Alcatel). Paul holds a 1st Class Honors Degree in Physics/Engineering, and a PhD in Engineering from Liverpool University. He holds an MBA from Cranfield University.