Integrating quantum simulations into multiscale models for materials development

Julian van Velzen, Head of Capgemini’s Quantum Lab, describes what happens in a major application area – materials development and chemical processes – when quantum computing, multiscale modeling, artificial intelligence, and classical computing are brought together in a business-and-academia partnership program

In The Karate Kid – we’re talking here about the original film from 1984 – there is a sequence in which the young hero, Daniel, has asked his elderly Japanese neighbor, Mr Miyagi, to teach him karate. Mr Miyagi gives him household chores to do: sanding the decking, waxing cars, painting the house and the fence. After four days the boy loses his temper. “You said you were going to teach me karate!” Mr Miyagi demonstrates that the repetition of the chores has helped Daniel to learn defensive blocks as a form of muscle memory.

It’s a great example of extrapolation, of deriving bigger lessons from smaller things – and also of doing the groundwork. Just as you can’t learn karate without some hardwired practices, so you can’t enter new territory in science or business without an understanding of the fundamental principles at work.

Let’s take R&D in materials development as a case in point. Researchers who start with incomplete knowledge or untested assumptions are obliged to proceed by trial and error. But if they can harness advanced techniques such as multiscale modeling and quantum computing to develop insights into materials at the atomic level, and if they can combine those techniques with other, more established processes, they can steer their simulations and digital-driven designs with far greater confidence.

How can this be done? What progress is being made?

Artificial intelligence is not enough

The questions we’re considering here fall within the scope of Capgemini’s strategic university research partnerships initiative. This planned and coordinated program involves bilateral collaborations in which Capgemini teams work alongside world-class universities on selected key challenges. Dedicated team leaders take forward these bilateral projects with the goal of producing high-quality outputs that are useful not just in terms of pure research, but in terms of practical, real-world benefits.

Universities play a critical role here by working at the boundary between what is theoretically possible and what is not yet computationally tractable in industry. As part of this program, and also as part of a broad, multi-team series of projects exploring the potential and application of quantum computing, Capgemini has been working with King’s College London (KCL) to develop a closer understanding of how quantum simulations can be integrated into multiscale models in the context of the development of new materials.

A key focus of the collaboration is quantum embedding: combining accurate quantum-mechanical treatments of the most chemically important region (for example, a catalytic active site) with more scalable classical models of the surrounding environment. This makes it feasible to study chemical reactions, including surface and heterogeneous catalysis, with a level of fidelity that would otherwise be out of reach.
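To make the embedding idea concrete, here is a minimal, purely illustrative sketch of a subtractive (ONIOM-style) embedding scheme, in which the total energy is assembled as E = E_low(full) + E_high(active) − E_low(active). The "solvers" below are placeholders with arbitrary per-atom energies; in a real workflow, the high-level method would be a correlated quantum-chemistry (or quantum-hardware) calculation and the low-level method a cheap classical model. This is not the specific method used in the Capgemini–KCL collaboration.

```python
# Toy sketch of subtractive (ONIOM-style) quantum embedding.
# All energies here are fictitious placeholders, chosen only to
# illustrate how the three calculations are combined.

def low_level_energy(atoms):
    """Cheap classical estimate: a made-up per-atom energy."""
    return -1.0 * len(atoms)

def high_level_energy(atoms):
    """Accurate treatment of the active region (placeholder)."""
    return -1.2 * len(atoms)

def oniom_energy(full_system, active_region):
    """E = E_low(full) + E_high(active) - E_low(active)."""
    return (low_level_energy(full_system)
            + high_level_energy(active_region)
            - low_level_energy(active_region))

surface = ["Pt"] * 20          # hypothetical catalyst slab (environment)
site = ["Pt", "Pt", "C", "O"]  # chemically active site, treated accurately
full = surface + site

print(oniom_energy(full, site))
```

The point of the subtractive form is that the expensive high-level method only ever sees the small active region, while the cheap low-level method captures how that region sits inside its environment.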

Materials development, particularly where reaction chemistry and catalysis are involved, is a prime example of innovation that takes place in a world of large datasets. So it’s often automatically assumed that this is an area that can be tackled by the modern magic bullet: artificial intelligence (AI).

And yes, that’s true. But it’s true only to a certain extent. Artificial intelligence can indeed explore vast datasets and recognize patterns – but it can’t understand the scientific behavior of a material in real-world environments. It can’t distinguish between genuinely trail-blazing discovery and rediscovery. And it can’t justify the candidates it identifies with the rigor this discipline requires.

Bringing AI together with quantum computing…

A different approach is needed – one that isn’t solely predicated on the use of AI. One such approach is sometimes termed ‘first principles modeling’. Also referred to as ab initio modeling, it uses quantum mechanics to predict material properties and molecular behavior more accurately. By relying on physics, this framework ensures a more reliable foundation for experimentation – but it also provides other important benefits:

  • Validation of AI predictions: Quantum mechanical methods such as density functional theory (DFT) and coupled-cluster theory can provide a physics-based validation of AI-generated candidates before any investment in costly experiments;
  • Exploration of novel domains: Quantum mechanics can be used to model systems where limited or no training data already exists, opening pathways to true innovation;
  • High-quality training data: Simulations informed by quantum mechanics help establish robust datasets that improve AI accuracy over time;
  • Mechanistic understanding: Unlike black-box AI predictions, quantum methods explain why a material or molecule behaves as it does, enabling smarter, more informed experimentation.

By using quantum mechanics as the foundation for discovery, first principles modeling is reliable to a degree that AI can’t deliver on its own. Quantum computing promises to go even further, offering new ways to interrogate material behavior that remain challenging for today’s classical techniques.

There is a caveat, though: like AI, quantum computing can’t do everything. Current quantum hardware remains limited by noise, error rates, and qubit connectivity issues that restrict the size and complexity of problems that can be addressed. Understanding these limitations and when they might be overcome is essential for developing realistic expectations about quantum computing’s near-term impact on materials discovery.

In the short to medium term, the most promising application of quantum computing is to address problems where quantum effects are significant, particularly in strongly correlated systems such as transition-metal compounds, high-temperature superconductors, and complex magnetic materials. These systems, where electron-electron interactions create behaviors that emerge from genuine quantum entanglement, represent challenges that classical methods have struggled to address effectively.

In addition, quantum algorithms may offer advantages for simulating dynamics, excited states, and reaction pathways, especially when the chemistry happens in complex environments such as solid surfaces, interfaces, or catalytic materials – areas where classical methods often require significant approximations. The ability to model these phenomena more accurately could provide insights into photocatalysis, energy harvesting materials, and the chemical reaction mechanisms that drive many technologies.

… and bringing quantum together with classical computing

Quantum computing can achieve even more if it is put to work alongside traditional computing as part of a hybrid approach.

For example, quantum-centric supercomputing (QCSC), a term used to describe tightly coupled quantum–HPC workflows, can comprise classical pre-processing, quantum computation, and classical post-processing in an integrated workflow. By formalizing this relationship, QCSC creates a framework where problems can be decomposed into components best suited to different computational architectures.

For materials simulation, this decomposition might involve using classical methods to handle molecular fragments where electron correlation is less significant, while deploying quantum resources for strongly correlated regions where classical approximations break down. In practice, this can be implemented through embedding frameworks that define how information flows through scales between the quantum and classical parts of the calculation, and how accuracy can be improved systematically. It also provides a natural route to testing methods on emulators and near-term hardware, while keeping an eye on longer-term scalability. This targeted application of quantum computing to specific subproblems, rather than attempting to quantum-compute entire systems, represents a practical path to near-term impact even with limited quantum resources.
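The decomposition described above can be sketched as a simple routing rule. In the illustrative code below – an assumption-laden toy, not a real QCSC implementation – classical pre-processing has already partitioned the system into fragments with an invented "correlation" score; a two-level toy Hamiltonian stands in for the quantum computation, and a mean-field-like placeholder stands in for the classical solver.

```python
# Sketch of a QCSC-style hybrid workflow (illustrative only): strongly
# correlated fragments go to a "quantum" solver, the rest to a cheap
# classical solver, and post-processing sums the results.
import math

def classical_solver(fragment):
    """Mean-field-like placeholder for weakly correlated fragments."""
    return -0.5 * fragment["n_electrons"]

def quantum_solver(fragment):
    """Exact ground-state energy of a two-level toy Hamiltonian
    [[a, b], [b, c]], standing in for a quantum computation."""
    a, b, c = fragment["a"], fragment["b"], fragment["c"]
    return 0.5 * ((a + c) - math.sqrt((a - c) ** 2 + 4 * b ** 2))

def hybrid_energy(fragments, correlation_cutoff=0.5):
    """Route each fragment to the best-suited solver, then recombine."""
    total = 0.0
    for frag in fragments:
        if frag["correlation"] > correlation_cutoff:
            total += quantum_solver(frag)
        else:
            total += classical_solver(frag)
    return total

fragments = [
    {"correlation": 0.1, "n_electrons": 10},              # environment
    {"correlation": 0.9, "a": -1.0, "b": 0.5, "c": 0.0},  # active site
]
print(hybrid_energy(fragments))
```

The design point is the cutoff: quantum resources are spent only where classical approximations break down, which is exactly the "targeted application to specific subproblems" the text describes.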

It’s not about attempting to use the latest and most exciting technologies for everything. It’s about using the right vehicle and taking the best route to the desired destination. It’s about pragmatism – and it’s about results.

A foundation for the future

The hybrid approach described in this paper is not only about today; it’s also about preparing for tomorrow. Why? Because at the moment, the quantum computing methods that form part of that hybrid are still in development.

By building atomistic and multiscale models now, organizations will be able to unlock the full potential of quantum computing when the hardware is ready. Quantum will supercharge these simulations, but many breakthroughs can already be delivered with current technology: quantum chemistry on classical devices, molecular dynamics, and multi-physics simulations. This coordinated approach is changing the way R&D can be done. Capgemini’s role is to connect these scientific advances to the industrial workflows, data platforms, and decision-making processes that organizations already use today.

When young Daniel in The Karate Kid was sanding the decking, waxing cars, and painting the fence, he was of course completing short-term tasks – but he was also establishing a foundation on which he could build his ambitions.

Similarly, Capgemini and its academic and commercial partners are transforming R&D to be quantum-ready, while delivering tangible value today.