Published 1 July 2025 by Benjamin Skuse
Beyond Chemistry: Physics and High-Performance Computing Make an Appearance

With the vast majority of talks at #LINO25 dedicated to all things chemistry or chemistry-adjacent, and most Laureates and young scientists strongly bonded to those same topics, attending a talk given by someone with a different perspective can be refreshing.
Three talks from beyond chemistry stand out – two parallel Agora Talks on Monday, 30 June 2025 delivered by physicists Klaus von Klitzing and Reinhard Genzel, and computer scientist Jack Dongarra’s evening Heidelberg Lecture on Wednesday, 2 July.
A Double Physics Whammy
Both regular attendees at the Lindau Nobel Laureate Meetings, von Klitzing and Genzel are enthusiastic and absorbing speakers, capable of winning over a crowd no matter what their background or familiarity with the science. “This is the first time that I’m speaking at a chemistry meeting here in Lindau,” began von Klitzing. “I adjust my talk a little bit to chemistry, but science makes no difference between physics and chemistry.”
von Klitzing’s specialism seems somewhat esoteric at face value, but it is fundamentally important to physics, chemistry and all sciences when you dig a little deeper. He was awarded the Nobel Prize in Physics in 1985 “for the discovery of the quantized Hall effect”. American physicist Edwin Hall discovered the ordinary Hall effect in 1879. It arises when a magnetic field is applied at right angles to a current flowing through a conductor, creating a measurable voltage difference across the material. If you slowly increase the strength of the magnetic field, this voltage steadily increases too.
In his Nobel Prize-winning work, von Klitzing repeated the Hall effect experiment, but with electrons confined to an incredibly thin layer inside a semiconductor device cooled to almost absolute zero. This time, when von Klitzing slowly increased the strength of the magnetic field, the Hall resistance jumped up like steps on a staircase – a typically quantum trait, and in fact an entirely new quantum phenomenon.
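As a rough sketch of the contrast (standard textbook relations rather than anything quoted in the talk): classically, the Hall resistance climbs smoothly with the applied field, whereas in von Klitzing’s two-dimensional samples it locks onto flat plateaux.

```latex
% Classical Hall effect: the transverse (Hall) resistance grows linearly
% with the magnetic field B (n_s is the sheet carrier density, e the
% elementary charge).
\[ R_{xy} = \frac{B}{n_s e} \]
% Quantum Hall effect: the same resistance is instead pinned to discrete
% plateaux, independent of the sample's material or geometry.
\[ R_{xy} = \frac{h}{i e^2}, \qquad i = 1, 2, 3, \dots \]
```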
During his Agora Talk, von Klitzing did briefly talk about how he discovered the quantum Hall effect (“by accident”, he rather modestly said), but he really wanted to speak about what this discovery led to – a revolution in metrology, the science of measurement.
The Quantum Metrology Revolution
In the late 19th and early 20th Centuries, physics titans James Clerk Maxwell and Max Planck (1918 Nobel Prize in Physics) were already questioning the wisdom of basing humanity’s system of measurement on objects that can change, a practice that dated back to the Ancient Egyptian ‘royal cubit’ – the length of the Pharaoh’s forearm from the elbow to the tip of the middle finger, plus the width of their palm.
Both Maxwell and Planck called for science’s system of measurement to be based instead on fundamental and natural constants that are the same everywhere, for all time. “[Planck] was fascinated by these fundamental constants,” said von Klitzing. “He even developed some natural units based on fundamental constants… [though they were] not very useful for practical application.”
Over time, metrologists did manage to base time and distance on fundamental aspects of the universe. Since 1967, the second has been defined by fixing the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom at 9,192,631,770 Hz. And since 1983, the metre has been defined by fixing the speed of light in vacuum at 299,792,458 m/s.
However, up until quite recently other important units were far from fundamental. The kilogram, for example, was defined from a single platinum–iridium alloy object: the International Prototype Kilogram (IPK). If a mote of dust landed on the IPK, its mass would be slightly higher and, by definition, all masses would need to be revised. In theory, any change to the IPK would have knock-on effects for other fundamental units too. “In chemistry, you have the mole,” said von Klitzing. “When I went to school and university, 1 mole was the number of atoms in 12 grams of carbon-12.” This meant that if the kilogram varied slightly, the mole varied slightly too.
What sparked change was how the quantum Hall effect related to another unit that had no fundamental basis – the unit of electrical resistance: the ohm. In a quantum Hall system, the resistance at each plateau is precisely the von Klitzing constant divided by a whole number. This constant is defined in terms of Planck’s constant, h, and the elementary charge, e. Soon, metrologists realised that if Planck’s constant and the elementary charge were deemed fundamental and fixed, this could provide a direct, quantum-mechanical basis for the ohm that anyone could realise, just by making a quantum Hall measurement.
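In symbols (a standard statement of the relationship, with the numerical value that follows from today’s fixed h and e):

```latex
% The von Klitzing constant, now exact because h and e are fixed by
% definition in the revised SI; any laboratory that measures a quantum
% Hall plateau realises the ohm directly through it.
\[ R_K = \frac{h}{e^2} \approx 25\,812.807\ \Omega, \qquad
   R_{\text{plateau}} = \frac{R_K}{i}, \quad i = 1, 2, 3, \dots \]
```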
This reasoning led to interest in looking for fundamental constants on which to fix all the remaining base units. Fast-forward to 2018 and von Klitzing was front and centre when the decision was made at the 26th General Conference on Weights and Measures – the governing conference established by the Metre Convention – to adopt the most sweeping change to the International System of Units (SI) since its inception: to finally define all base units by immutable constants of nature.
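For reference, the revised SI that came into force on 20 May 2019 rests on seven defining constants whose values are now fixed exactly, and from which every base unit follows:

```latex
% The seven defining constants of the revised SI (exact by definition).
\begin{align*}
\Delta\nu_{\mathrm{Cs}} &= 9\,192\,631\,770\ \mathrm{Hz} && \text{(caesium-133 hyperfine frequency)} \\
c   &= 299\,792\,458\ \mathrm{m\,s^{-1}} && \text{(speed of light in vacuum)} \\
h   &= 6.626\,070\,15 \times 10^{-34}\ \mathrm{J\,s} && \text{(Planck constant)} \\
e   &= 1.602\,176\,634 \times 10^{-19}\ \mathrm{C} && \text{(elementary charge)} \\
k_{\mathrm{B}} &= 1.380\,649 \times 10^{-23}\ \mathrm{J\,K^{-1}} && \text{(Boltzmann constant)} \\
N_{\mathrm{A}} &= 6.022\,140\,76 \times 10^{23}\ \mathrm{mol^{-1}} && \text{(Avogadro constant)} \\
K_{\mathrm{cd}} &= 683\ \mathrm{lm\,W^{-1}} && \text{(luminous efficacy at 540 THz)}
\end{align*}
```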
Though adopting this new measurement system had little to no effect on daily life, it finally made the SI units fit for the 21st Century and beyond. “In my life, [that 2018 conference] was the most important conference,” he recalled. “This was really very emotional and historically great, because all different countries had to stand up and agree – so really, quantum metrology united all countries.”
Our Galaxy’s Heart of Darkness
While a large cohort of young scientists were dipping their toes into the fascinating world of quantum metrology in the Main Hall, it was standing room only in the Conference Rooms, where an equally large crowd had gathered to hear from Genzel.
Genzel and Andrea Ghez were awarded a share of the 2020 Nobel Prize in Physics – alongside black hole theorist Roger Penrose – for their discovery of a supermassive black hole at the heart of our galaxy.
A supermassive black hole had been suspected to lurk at the centre of the Milky Way throughout the latter half of the 20th Century, but it was Genzel and Ghez who, from the 1990s onwards, pioneered the search for direct signatures of this extreme object, dubbed Sagittarius A*. They reasoned that the motions of gas and stars in its immediate vicinity would bear tell-tale imprints consistent with black hole theory.
With Genzel using the four 8-metre telescopes of the European Southern Observatory’s Very Large Telescope in Chile, and Ghez the Keck telescopes on Mauna Kea in Hawaii, they produced consistent results revealing that the mass at Sagittarius A* – some 4 million times that of the Sun – is packed into a region no more than about 125 times the Earth–Sun distance in size. The explanation? A supermassive black hole.
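That mass figure follows essentially from Kepler’s third law applied to the stars whirling around Sagittarius A*. As a back-of-the-envelope illustration (using rough published values for the star S2, not numbers quoted in the talk), an orbit with a semi-major axis of roughly 1,000 AU completed in about 16 years implies:

```latex
% Kepler's third law in convenient units: M in solar masses,
% a in astronomical units, P in years.
\[ M \simeq \frac{a^3}{P^2}
      \approx \frac{(1000\ \mathrm{AU})^3}{(16\ \mathrm{yr})^2}
      \approx 4 \times 10^{6}\ M_\odot \]
```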
Molecules in Space

Though he freely admitted he is far from a chemist, Genzel took the time to link his interest in astronomy with the interests of his audience. As a result, his Agora Talk was like an advert for astrochemistry. “Most galactic structure is molecules because only if you have molecules can you cool down and make it dense enough so that self-gravity starts dominating,” he said. “If we look into molecular clouds, where new stars are forming, we see hundreds and hundreds of lines of fairly complex molecules – about 200 interstellar molecules have been seen in interstellar space.”
He then detailed how these molecules form in what was once thought to be the barren interstellar medium. Deep inside these clouds, atoms are shielded from destructive ultraviolet radiation by the surrounding gas and dust. Some of these atoms are nonetheless ionised by very high-energy cosmic rays, which allows them to be drawn together by the electromagnetic force to form molecules.
“Molecules like H2+ can then interact with other molecules over a longer range and form more complicated molecules,” Genzel continued. “In the end of all of this, you may even get something as complicated as amino acids in interstellar space; small amounts, but still, that’s the story – our origins start in these dark clouds which we see when we look at the sky.”
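A commonly cited version of this cosmic-ray-driven chemistry (a standard textbook chain, not a reaction sequence quoted in the talk) starts with molecular hydrogen:

```latex
% Cosmic-ray ionisation of H2 followed by a fast ion-molecule reaction;
% the resulting H3+ then passes protons to other species, seeding the
% more complex chemistry seen in dark clouds.
\begin{align*}
\mathrm{H_2} + \text{cosmic ray} &\;\rightarrow\; \mathrm{H_2^+} + e^- \\
\mathrm{H_2^+} + \mathrm{H_2} &\;\rightarrow\; \mathrm{H_3^+} + \mathrm{H}
\end{align*}
```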
Responsibly Reckless High-Performance Computing
Though half of 2024’s Nobel Prize in Chemistry went to Demis Hassabis and John M. Jumper for the AI model AlphaFold2, and the same year’s Physics Prize went to John J. Hopfield and Geoffrey Hinton for foundational work on machine learning with artificial neural networks, the fact that there has never been a dedicated mathematics or computer science Prize has been a bone of contention for decades.
The importance of these disciplines to modern society was the motivation behind founding the Heidelberg Laureate Forum (HLF) in 2013. Broadly based on the Lindau Nobel Laureate Meetings, each year the HLF brings together the recipients of the most prestigious awards in mathematics and computer science with around 200 selected young researchers from all over the world.
The two meetings are now partners with close ties: each year a Nobel Laureate presents a Lecture at the HLF, and a Heidelberg Laureate presents one at the Lindau Nobel Laureate Meeting. At #LINO25, attendees were treated to a talk from 2021 A.M. Turing Award winner Dongarra.

Dongarra has been front and centre of high-performance computing for decades, and has seen tremendous change in that time. In fact, since 1993 his ‘LINPACK Benchmark’ has been the gauge used to track these changes through the Top500 list of the world’s fastest supercomputers.
“The number one machine back in 1993 was at Los Alamos National Lab and was used for nuclear weapons simulation,” he explained. “This computer – the thing I’m using to run this slideshow and that I use mainly to read emails – when I run the benchmark, it’s outperforming that machine at Los Alamos National Laboratory, and it would have been number one on the list until 1996; an incredible situation.”
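For a sense of what the benchmark actually measures, here is a rough Python sketch of the core idea – timing a dense linear solve and converting the standard operation count into a flop rate. It is my own illustration (problem size and code are hypothetical), not the official LINPACK/HPL implementation:

```python
import time
import numpy as np

# Rough LINPACK-style measurement (a sketch, not the official benchmark):
# time a dense solve of Ax = b and convert to Gflop/s using the standard
# operation count of roughly 2/3*n^3 + 2*n^2 for LU factorisation + solve.
n = 4000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)      # dense LU factorisation and triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
print(f"{flops / elapsed / 1e9:.1f} Gflop/s on a {n}x{n} dense system")
```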
These changes have also led Dongarra to reappraise the Benchmark itself: “The Benchmark is no longer strongly correlated with real applications,” he confirmed. For today’s supercomputers modelling climate, catalysis, molecular dynamics and other problems dominated by sparse matrices, Dongarra and collaborators have devised a new benchmark: the High Performance Conjugate Gradients (HPCG) Benchmark.
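HPCG stresses the memory-bound, sparse operations that dominate such applications. A minimal sketch of the kind of kernel it exercises – my own illustration on a simple 1-D model problem, not the official HPCG code – is a conjugate gradient loop built around sparse matrix–vector products:

```python
import numpy as np
import scipy.sparse as sp

# Build a sparse tridiagonal (1-D Poisson-like) test matrix, the kind of
# stencil-based system that HPCG-style workloads exercise (hypothetical example).
n = 100_000
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format="csr")
b = np.ones(n)

# Plain conjugate gradients: the work is dominated by sparse
# matrix-vector products, which are memory-bound rather than compute-bound.
x = np.zeros(n)
r = b - A @ x
p = r.copy()
rs_old = r @ r
for _ in range(1000):
    Ap = A @ p
    alpha = rs_old / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    rs_new = r @ r
    if np.sqrt(rs_new) < 1e-8:
        break
    p = r + (rs_new / rs_old) * p
    rs_old = rs_new
```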
With HPCG, they have exposed high-performance computing’s “dirty little secret”: “A number of machines in the top 10 get less than 1% of the theoretical peak,” Dongarra elucidated. “Think of it like, you might buy a race car and that race car has a speedometer that goes up to 200 kilometres per hour, but you get 2 kilometres per hour – you’re not going to be very happy with that result!”
Dongarra himself is attempting to fix part of this problem, focusing on algorithmic efficiency. He and collaborators have designed “responsibly reckless algorithms” that achieve high accuracy by performing the bulk of the operations in less accurate 32-bit arithmetic, then postprocessing and refining the 32-bit solution into a 64-bit accurate one – a process at least twice as fast as doing everything in 64-bit arithmetic.
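The textbook form of this idea is mixed-precision iterative refinement. The sketch below (my own Python/NumPy illustration, not Dongarra’s code) does the expensive factorisation once in 32-bit arithmetic, then uses cheap 64-bit residual corrections to recover full accuracy:

```python
import numpy as np
import scipy.linalg as la

# Sketch of mixed-precision iterative refinement: factorise once in fast,
# low-precision (32-bit) arithmetic, then refine with 64-bit residuals.
rng = np.random.default_rng(1)
n = 2000
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Expensive O(n^3) factorisation done in low precision.
lu, piv = la.lu_factor(A.astype(np.float32))
x = la.lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)

# Cheap O(n^2) refinement steps done in 64-bit restore accuracy.
for _ in range(5):
    r = b - A @ x                                      # residual in float64
    d = la.lu_solve((lu, piv), r.astype(np.float32))   # correction in float32
    x += d.astype(np.float64)

print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```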
He believes ideas like this can be combined with other innovative approaches in order to design machines for the problems they are tasked with solving. “I’d like to think in the future we would co-design our machines,” he mused. “We’d have the architects get together with the algorithm guys, with the computational scientists, with the software people, to design machines that everybody optimises to get improved performance, and a very energy-efficient system that scales to the kinds of problems that we want to do.”