In this interview, AZoQuantum speaks to Scott Genin, Vice President of Materials Discovery at OTI Lumionics, about the frontier of quantum simulations in materials science. Scott shares how his team navigates the challenges of current quantum hardware to unlock scalable, real-world applications.
Can you share how your expertise bridges the fields of chemical engineering and materials discovery, and what drew you to quantum computing?
Yes, my background is a bit unusual for my position at OTI and my interest in quantum computing. My interest is in experimental design and the optimization of sampling routines using models and statistics. In chemical engineering, experiments and tests are very expensive, so the ‘move fast and break things’ approach doesn’t work; a few failures could result in running out of funds. As a consequence, we have to ask the question: “What are the optimal experiments to run given the current knowledge?” This is where modeling becomes very important, since it is much less expensive than running experiments. The challenge, though, is that because OTI focuses on developing new and novel materials, we don’t have large datasets with which to train ML models. We have to take a first-principles approach, since we need to understand the underlying physics of the materials. Quantum computers hold a lot of promise in that they are able to simulate materials more efficiently than classical computers.
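As a loose illustration of choosing the optimal next experiment given current knowledge (a generic sketch, not OTI’s actual workflow; all variable names and numbers are invented), the snippet below fits a cheap surrogate model to a handful of hypothetical measurements and proposes the next experiment with an upper-confidence-bound rule that balances predicted performance against model uncertainty:

```python
import numpy as np

# Hypothetical data: material yield measured at a few process temperatures.
temps = np.array([300.0, 340.0, 380.0, 420.0])
yields = np.array([0.52, 0.67, 0.71, 0.60])

# Fit a simple quadratic surrogate model to the few expensive data points.
model = np.poly1d(np.polyfit(temps, yields, deg=2))

# Candidate experiments we could run next.
candidates = np.linspace(300, 420, 25)
pred = model(candidates)

# Crude uncertainty proxy: grows with distance from the nearest measured point.
dist = np.min(np.abs(candidates[:, None] - temps[None, :]), axis=1)
sigma = 0.002 * dist  # assumed noise scale, purely illustrative

# Upper-confidence-bound acquisition: favor candidates that are either
# predicted to perform well or are far from existing data (informative).
ucb = pred + 2.0 * sigma
next_temp = candidates[np.argmax(ucb)]
print(f"Proposed next experiment: T = {next_temp:.1f} K")
```

In practice this role is played by far more sophisticated surrogate models (Bayesian optimization, first-principles simulation), but the structure is the same: model what is known, then spend the experimental budget where it buys the most information.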
What are some of the key bottlenecks of quantum computing that you see in materials discovery, and what solutions are being developed to overcome them?
There are bottlenecks both in running quantum chemistry simulations on quantum computers and in applying quantum simulations to materials discovery. Current quantum computers lack the fidelity required to run the very long circuits that quantum chemistry demands. Improved error correction and better architectures should help resolve these issues, but the interconnectivity between qubits needs to improve at the same time, which may prove quite difficult. There is also an underlying assumption in the quantum computing community that demonstration examples, such as simulating a chain of hydrogen atoms or a ring-opening reaction in the STO-3G basis set, would translate into a useful result, but such ideas are very misleading. Materials simulations on a quantum computer need to be as accurate as possible, since they compete with cheap conventional quantum chemistry algorithms such as DFT.
Your team has successfully conducted 160-qubit simulations, which is a major milestone given the noise and error rates in current quantum processors. What made these simulations possible, and what lessons did you learn from running computations at this scale?
We ran these simulations on our quantum emulator, which uses classical computing. The emulator does not try to mimic the noise of a universal quantum computer; it simulates the materials using the same logic as a quantum computer, but executes that logic as efficiently as possible. This allows the emulator to simulate a large number of qubits with very deep circuits. Our recent innovation was a method for optimizing over 200,000 parameters in the circuit, as opposed to just simulating random combinations.
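To see why optimizing circuit parameters beats sampling random combinations, consider a toy variational problem (a generic sketch, not OTI’s QCC algorithm): minimize the energy of a small Hamiltonian over a single ansatz angle. With one parameter, random sampling still works; at 200,000 parameters, the search space makes blind sampling hopeless, while gradient-guided optimization remains tractable:

```python
import numpy as np

# Toy 2x2 "molecular" Hamiltonian (one qubit) -- purely illustrative.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta):
    # Parameterized ansatz state Ry(theta)|0> and its energy expectation.
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi

# Strategy 1: sample random parameter values blindly.
rng = np.random.default_rng(0)
random_best = min(energy(t) for t in rng.uniform(0, 2 * np.pi, 50))

# Strategy 2: gradient descent with finite-difference gradients.
theta, lr, eps = 0.1, 0.2, 1e-6
for _ in range(200):
    grad = (energy(theta + eps) - energy(theta - eps)) / (2 * eps)
    theta -= lr * grad

exact = np.linalg.eigvalsh(H)[0]  # exact ground-state energy for comparison
print(energy(theta), random_best, exact)
```

The gradient-based search converges to the exact ground-state energy; the same logic, applied per parameter, is what makes optimization over hundreds of thousands of circuit parameters feasible where random search is not.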
Can you share any insights into how quantum simulations are already making an impact in different areas of application, such as next-gen materials for electronics?
Quantum simulations run on a quantum computer or quantum simulator outside of OTI are not making any impact on next-gen materials. To be as direct as possible, material simulations that can run on current quantum computers are not relevant. Only OTI Lumionics’ Qubit Coupled Cluster (QCC) simulations on our emulator have demonstrated any simulation of industrial relevance. To be as specific as possible, the only chemical simulations of any relevance demonstrated on a quantum computer are classified as non-variational, and through our peer-reviewed publications, OTI Lumionics has shown that non-variational quantum computing simulations of chemical systems can be easily replicated on a classical computer at scale. Other quantum computing simulations have either been performed on huge supercomputing clusters or have been resource estimations for variational quantum chemistry algorithms.
What are some of the most promising strategies you see emerging to make quantum simulations more reliable and scalable?
We see a lot of surface-code and error-correction improvements aimed at producing a stable algorithmic qubit, as well as new architectures such as those using bosonic modes. While I think the surface-code developments for superconducting quantum computers are promising, I suspect that in the long run, architectures that are natively stable will be better, since error correction algorithms can still be layered on top of them.
About the Speaker

Scott Genin is the Vice President of Materials Discovery at OTI Lumionics, where he manages the computational design and synthesis of novel materials for organic light emitting diodes. As part of his role, he leads the development of algorithms to simulate molecules and material properties on quantum computers. Scott has a B.A.Sc. in Chemical Engineering and Chemistry from Queen’s University and a Ph.D. from the University of Toronto. He worked in the pharmaceutical industry prior to joining OTI.
Disclaimer: The views expressed here are those of the interviewee and do not necessarily represent the views of AZoM.com Limited (T/A) AZoNetwork, the owner and operator of this website. This disclaimer forms part of the Terms and Conditions of use of this website.