Quantum Simulations of New Materials for the 21st Century
We are surrounded by a multiplicity of materials, from metals and alloys to crystals, glasses, and ceramics; from polymers and plastics to organic and living-derived substances; and let’s not forget natural materials like stone and exotic materials like aerogel.
The amazing thing to me is that all these materials are formed from different combinations of the same small group of elements. For example, while living organisms and other objects can contain traces of many elements, a core group does the heavy lifting; only six elements—carbon (C), hydrogen (H), oxygen (O), nitrogen (N), phosphorus (P), and sulfur (S)—make up over 95% of the mass of most living things.
Similarly, only eight elements—oxygen (O), silicon (Si), aluminum (Al), iron (Fe), calcium (Ca), sodium (Na), potassium (K), and magnesium (Mg)—make up more than 98% of the Earth’s crust.
As an aside, there are currently 118 confirmed chemical elements in the periodic table. These range from hydrogen (element 1) to oganesson (element 118). The reason I say “currently” is that there are ongoing attempts to synthesize additional elements, but we can worry about that later.
Although there may appear to be a vast number of materials available to us, the ones we see that are naturally occurring, coupled with the ones we’ve created in our laboratories, represent only a tiny fraction of the possible combinations and permutations of atoms. And even this is only the tip of the iceberg, as it were. For example…
…I’ve said it before, and I’ll say it again, one of the books at the very top of my “must-read” recommendations is The Disappearing Spoon and Other True Tales of Madness, Love, and the History of the World from the Periodic Table of the Elements by Sam Kean. As far as I’m concerned, this is a real page-turner.
I was introduced to so many new concepts by this book that it made my head spin. For example, “superatoms” are clusters (between 8 and 100 atoms of one element) that have the amazing ability to mimic single atoms of different elements. As Sam says in his book: “For instance, thirteen aluminium atoms grouped together in the right way do a killer bromine: the two entities are indistinguishable in chemical reactions. This happens despite the cluster being thirteen times larger than a single bromine atom and despite aluminium being nothing like the lacrimatory poison-gas staple. Other combinations of aluminium can mimic noble gases, semiconductors, bone material like calcium, or elements from pretty much any other region of the periodic table. The clusters work like this…”
Everything seemed so simple when I was taught chemistry and physics at high school. For example, electrons “orbiting” an atom’s nucleus occupy distinct energy “shells.” The first shell can hold two electrons, the next shell can hold eight, and… then it gets more complicated.
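For those who, like me, enjoy seeing these things spelled out, here’s a minimal Python sketch of the 2n² shell-capacity rule from those high school lessons (the “it gets more complicated” part, the Aufbau filling order, is cheerfully ignored here):

```python
# A minimal sketch of the 2n^2 electron shell capacity rule.
# Real atoms fill orbitals in the Aufbau order (1s, 2s, 2p, 3s, 3p,
# 4s, 3d, ...), which is where "it gets more complicated" comes in.

def shell_capacity(n: int) -> int:
    """Maximum number of electrons in shell n (the 2n^2 rule)."""
    return 2 * n * n

for n in range(1, 5):
    print(f"Shell {n}: up to {shell_capacity(n)} electrons")
# Shell 1: up to 2 electrons
# Shell 2: up to 8 electrons
# Shell 3: up to 18 electrons
# Shell 4: up to 32 electrons
```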
Each element is defined by the number of protons (hydrogen has one, helium has two, etc.). By default, each atom has the same number of negatively charged electrons as it has positively charged protons. So far, so good. However, each atom would ideally like to see its outermost shell full of electrons. The problem is that you can’t just add or remove electrons (well, you can, but the atom doesn’t like it). The solution is for atoms to bond together.
The two main types of bonds that stick in my mind are covalent bonds (valence bonds), where atoms share pairs of electrons, and ionic bonds, where electrons are transferred from one atom to another, resulting in charged ions that attract each other and stick together. (Let’s not muddy the waters with things like hydrogen bonds, metallic bonds, dipole-dipole interactions, and things of that ilk).
So far, so good, but… I used to think that the electrons in the outermost shell would be the ones that were always on display to the outside world. Sam taught me otherwise. It seems that the transition metals and rare earth elements “hide” (or “shield”) their chemically active electrons beneath filled outer orbitals. In the case of the rare earths, the 4f electrons are so well shielded by the filled outer s- and p-orbitals that they often don’t participate directly in bonding. In addition to resulting in counterintuitive chemical reactions, this explains why these elements are so hard to tell apart.
Humans like to give things labels, such as dividing things into “ages,” like the Stone Age, Bronze Age, and Iron Age. Of course, nothing is simple because these epochs occurred at different times for different groups of people.
Also, ages can overlap and run alongside each other. For example, consider the Industrial Age (a.k.a. Industrial Revolution, ~1750–1900s), the Electrical Age (~1880s–early 20th century), the Information Age (a.k.a. Digital Age, ~1950s–present), the Internet Age (a.k.a. Networked Age, ~1990s–present), and the Artificial Intelligence Age (~2010s–onward).
Now, this is where things get interesting, because there’s a strong case to be made that we’re entering (or are already in) a Materials Age; that is, a time when our greatest leaps come from new materials enabling new capabilities.
Some examples of what we’re talking about are things like graphene and 2D materials (e.g., ultra-thin, ultra-strong, highly conductive), metamaterials (i.e., materials engineered to have properties not found in nature), biomaterials (e.g., smart prosthetics, biodegradable plastics, self-healing polymers), quantum materials (e.g., superconductors, topological insulators), and carbon composites and aerogels (lighter, stronger, more efficient).
The impacts of such materials will be far-reaching, solving energy storage challenges (e.g., batteries and supercapacitors), revolutionizing medical tech (e.g., smart implants, drug delivery), transforming electronics (e.g., flexible circuits, wearable tech), and enabling green energy (e.g., solar cells, hydrogen storage).
The interesting thing about all this is that artificial intelligence (AI) has a significant role to play with respect to developing these new materials. However, there’s a problem… a fly in the soup and an elephant in the room, as it were (I never metaphor I didn’t like).
I was just chatting with Scott Genin, who is VP of Materials Discovery at OTI Lumionics. I’m afraid to say that, despite his winsome smile, Scott is not a nice man. He made my head hurt (it’s still aching as we speak).
Scott Genin in the Lumionics lab (Source: OTI Lumionics)
Scott explained that the current “gold standard” method to simulate solid-state materials is to employ Density Functional Theory (DFT). This is a quantum mechanical modeling method used extensively in physics, chemistry, and materials science to investigate the electronic structure (i.e., the behavior of electrons) of atoms, molecules, and solids. The problem is that DFT-based simulations don’t always accurately reflect experimental data. A lot of the time, you end up “tweaking” parameters to make things work, which explains why, for the same material, some recent DFT simulations predict superconductivity while others do not.
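To give a feel for what this looks like in practice, below is a bare-bones DFT calculation using the open-source PySCF package. To be clear, this is a generic illustration of my own devising, not OTI Lumionics’ workflow; the molecule, basis set, and exchange-correlation functional are all arbitrary choices on my part, and the functional is exactly the sort of “tweakable” parameter Scott was grumbling about.

```python
# A toy DFT calculation with the open-source PySCF package.
# Illustrative only: the molecule (water), basis set, and
# exchange-correlation functional are arbitrary choices, and real
# materials work involves periodic solids, not single molecules.
from pyscf import gto, dft

# Build a water molecule (coordinates in Angstroms).
mol = gto.M(
    atom="""O  0.000  0.000  0.000
            H  0.757  0.586  0.000
            H -0.757  0.586  0.000""",
    basis="6-31g",
)

mf = dft.RKS(mol)      # restricted Kohn-Sham DFT
mf.xc = "pbe"          # the "tweakable" part: choice of functional
energy = mf.kernel()   # self-consistent field solve
print(f"Total energy: {energy:.6f} Hartree")
```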
I’m reminded of the Drake equation, which, as described by Wikipedia, is “a probabilistic argument used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way Galaxy.”
The thing about the Drake equation is that you can tweak the coefficients to achieve almost any result you desire. To some extent, the same can be said for DFT simulations. At its core, DFT simplifies the incredibly complex many-body Schrödinger equation (which describes how electrons behave). The point is that it’s a simplification, albeit a mind-bogglingly complex one (if you’ll forgive the oxymoron). In particular, the exact exchange-correlation functional (the piece that captures how electrons interact) is unknown, so every practical DFT calculation rests on an approximation to it. Thus, the problem with training AI using DFT data is that the AI will inherit all the problems associated with that data, leading it to invent materials that don’t perform as expected.
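To make the “tweak the coefficients” point concrete, here’s a back-of-the-envelope rendition of the Drake equation, N = R* × fp × ne × fl × fi × fc × L. The two parameter sets below are illustrative numbers of my own choosing, not anyone’s published estimates; the point is how wildly the answer swings.

```python
# Back-of-the-envelope Drake equation: N = R* * fp * ne * fl * fi * fc * L.
# The point: defensible-looking inputs span many orders of magnitude in
# output. Both parameter sets below are illustrative, not authoritative.
from math import prod

def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Estimated number of communicative civilizations in the galaxy."""
    return prod([r_star, f_p, n_e, f_l, f_i, f_c, lifetime])

optimistic  = drake(r_star=3, f_p=1.0, n_e=0.2, f_l=1.0, f_i=1.0, f_c=0.2, lifetime=1e9)
pessimistic = drake(r_star=1, f_p=0.2, n_e=0.1, f_l=0.1, f_i=0.01, f_c=0.01, lifetime=100)

print(f"Optimistic:  {optimistic:,.0f} civilizations")   # 120,000,000
print(f"Pessimistic: {pessimistic:.2e} civilizations")   # 2.00e-05
```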
This leads us to the first area where the guys and gals at OTI Lumionics distinguish themselves. “Ab initio” is Latin for “from the beginning” or “from first principles.” In computational physics and chemistry, ab initio methods are techniques that model systems using basic physical laws without relying on empirical or fitted parameters. That’s what the chaps and chapesses at OTI Lumionics are doing.
But wait, there’s more, because they’ve developed a sophisticated suite of quantum computing algorithms that leverage their ab initio models to accurately simulate solid-state materials. This enables precise prediction of properties such as the band gaps of semiconductor combinations, as well as the refractive indices and optical emission wavelengths of organic light-emitting diode (OLED) materials.
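OTI Lumionics’ actual algorithms are proprietary (and quantum), so I can’t show them here, but to illustrate what “predicting an optical emission wavelength” means computationally, here’s a generic classical stand-in: compute the lowest excitation energies with TDDFT in PySCF and convert them to wavelengths. The molecule and functional are, once again, arbitrary picks on my part.

```python
# Generic illustration of "predict an optical emission wavelength":
# compute the lowest excitation energies with TDDFT (a classical method,
# NOT OTI Lumionics' quantum approach) and convert them to wavelengths.
from pyscf import gto, dft, tdscf

mol = gto.M(atom="C 0 0 0; O 0 0 1.128", basis="6-31g")  # carbon monoxide
mf = dft.RKS(mol)
mf.xc = "b3lyp"
mf.kernel()

td = tdscf.TDDFT(mf)
td.nstates = 3
td.kernel()

HARTREE_TO_NM = 45.563  # lambda(nm) = 45.563 / E(Hartree)
for i, e in enumerate(td.e, start=1):
    print(f"State {i}: {e:.4f} Ha -> {HARTREE_TO_NM / e:.1f} nm")
```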
What? You want more? Well, what about the fact that everyone’s enthused about quantum computing, even though most of us have never actually used one (or even seen one)? One of the advantages that’s often touted for quantum computers is that they can arrive at conclusions in a fraction of the time of their classical computing counterparts.
As an aside, if you want to learn more about quantum computing, my friend Duane Benson offers a free weekly Quantum Edge newsletter (see also my Want to Understand Quantum Computing? Cool Beans Blog).
And we’re back… The term Coupled Cluster (CC) refers to a method used to calculate the electronic structure of molecules. It’s highly accurate and particularly good at describing systems with electron correlation — that is, how electrons avoid each other due to their mutual repulsion. CCS (Coupled Cluster Singles) includes single excitations (one electron jumps to a different orbital), CCSD (Singles + Doubles) adds double excitations (two electrons jump), and CCSD(T) (Singles + Doubles + approximate Triples) adds a perturbative estimate of triple excitations (three electrons jumping at once).
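For the curious, here’s what climbing that coupled-cluster ladder looks like on a classical machine using PySCF. This is illustrative only; the molecule and basis set are my own arbitrary picks, and OTI’s quantum-algorithm formulation is a different beast entirely.

```python
# The coupled-cluster ladder on a classical machine, via PySCF.
# CCSD covers singles + doubles; ccsd_t() adds the perturbative (T)
# triples correction. Molecule and basis are arbitrary illustrative picks.
from pyscf import gto, scf, cc

mol = gto.M(atom="N 0 0 0; N 0 0 1.098", basis="cc-pvdz")  # N2
mf = scf.RHF(mol).run()       # Hartree-Fock reference

mycc = cc.CCSD(mf).run()      # singles + doubles
e_t = mycc.ccsd_t()           # perturbative triples correction

print(f"HF energy:      {mf.e_tot:.6f} Ha")
print(f"CCSD energy:    {mycc.e_tot:.6f} Ha")
print(f"CCSD(T) energy: {mycc.e_tot + e_t:.6f} Ha")
```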
Scott informs me that, when simulated on classical computers, the CC algorithms can get very disturbed if two orbitals are close in energy, oftentimes causing them to fail to converge, which is not a good thing. By comparison, quantum computers, which essentially consider all states at the same time, don’t seem to have any problem converging on a solution. Pretty cool, eh?
Now, here’s the kicker… which is that Scott and his compatriots have come up with a way of emulating a quantum computer using a classical machine. And I’m not talking about using a supercomputer—just a reasonably hefty workstation. We’re talking about runtimes measured in hours or days (not millions of years as the quantum computing folks might try to imply). The great thing is that when affordable quantum computers eventually come online (say in five years or so), these algorithms can be ported directly over to the quantum world.
There’s so much more to say, but I can say no more. Did I mention that Scott made my head hurt? Well, if you want to learn more about how OTI Lumionics’ technology can be applied to your next-generation materials problems, I’m sure that Scott will be delighted to make your head hurt, too. In the meantime, as always, I welcome your comments (but not your questions because I am a bear of little brain).
April 24, 2025
Max Maxfield