The Future is Bright for Lithium-Ion Batteries

Lithium-ion batteries. Credit: Andrey Klemenkov/iStock 

Since they were first commercially introduced in 1991, lithium-ion batteries have become an integral part of modern technology and, consequently, of our modern way of life.

They power almost every smartphone, laptop, and tablet sold around the world today. And their role will likely prove even more important in the future, as electric vehicles (EVs) are still an emerging market. Such vehicles – which include not only electric cars, but also electric motorcycles, buses, and trucks – are bound to replace conventional petrol-fuelled ones, further driving the demand for high-density lithium-ion (Li-ion) batteries.

Batteries are also under-exploited in power supply systems, especially in combination with photovoltaics and wind power, where they are poised to massively reduce carbon emissions. There is arguably no other piece of materials science that has touched the way of life of everyone on this planet like Li-ion batteries have.

This achievement was made possible by the work of John B. Goodenough, M. Stanley Whittingham and Akira Yoshino. Their pioneering work was recognised by the Royal Swedish Academy of Sciences, which in 2019 awarded them the Nobel Prize in Chemistry “for the development of lithium-ion batteries”. (Learn more about their research in this previous blog post.)

The future seems bright for lithium-ion energy storage, but what can we expect?

The EV market is poised to grow to $567 billion by 2025. Credit: MikesPhotos/Pixabay

 

Why Li-Ion Batteries are Amazing Energy Storage Devices

The Li-ion battery (LIB) works much like other batteries. Its major difference, however, is that the electrode materials are not consumed by the cell’s chemical reactions to the same extent: lithium ions simply flow from the negative anode to the positive cathode during discharge, and in the opposite direction during charging.

The main reason why LIBs are so popular is their impressive energy density (100-265 Wh/kg or 250-670 Wh/l, depending on the number of lithium ions the electrodes can hold per unit of surface area). This enables mobile devices to draw their power from a very small space. LIBs also offer short charging times and can run a high number of discharge cycles before their capacity fades, compared with other battery technologies such as nickel-cadmium or nickel-metal hydride.
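
To get a feel for what those densities mean in practice, here is a back-of-the-envelope sketch in Python; the 60 kWh pack size is an assumption chosen for illustration:

```python
# What do the quoted energy densities imply for an EV battery pack?
# The 60 kWh pack size is an assumed, illustrative value.

PACK_ENERGY_WH = 60_000  # 60 kWh (assumption)

for wh_per_kg in (100, 265):   # gravimetric range quoted above
    print(f"{wh_per_kg} Wh/kg -> {PACK_ENERGY_WH / wh_per_kg:.0f} kg of cells")

for wh_per_l in (250, 670):    # volumetric range quoted above
    print(f"{wh_per_l} Wh/l -> {PACK_ENERGY_WH / wh_per_l:.0f} litres of cells")
```

At the top of the quoted range, the same pack weighs less than half as much and occupies little more than a third of the volume — which is exactly why density improvements matter so much for mobile applications.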

The main shortcoming of LIBs is safety. LIBs tend to overheat and can be damaged beyond repair at high voltages. In extreme cases, Li-ion power systems can even combust, as observed with the Samsung Galaxy Note 7 smartphone, whose battery defect caused some phones to catch fire. A similar problem led to the temporary grounding of the entire Boeing 787 fleet. Nowadays, manufacturers are required to implement sophisticated safety mechanisms that limit the voltage and internal pressure.

 

The Future of Li-Ion Energy Storage

The largest market for Li-ion batteries has traditionally been portable electronic devices, but demand for LIBs in transportation is also growing rapidly. As electric vehicles close in on conventional cars in terms of price and range, it may only be a matter of time before most or all road transportation is electric — powered by LIBs, of course. Today, it is not uncommon for an EV to travel 360-450 kilometres on a single charge, and as energy density improves, so will driving range, making EVs ever more viable.
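
Range follows directly from pack energy and consumption. A minimal sketch, where both figures are assumptions picked for illustration:

```python
# Rough EV range estimate: usable pack energy divided by consumption.
pack_kwh = 60          # assumed usable pack energy
kwh_per_100km = 16     # assumed average consumption of a mid-size EV

range_km = pack_kwh / kwh_per_100km * 100
print(f"~{range_km:.0f} km per charge")   # ~375 km, in line with the figures above
```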

Fast charging is another key aspect. Dr. Chao-Yang Wang, professor at Pennsylvania State University, and collaborators used a special setup to charge a LIB to 80% in 10 minutes without damaging it. “The 10-minute trend is for the future and is essential for adoption of electric vehicles because it solves the range anxiety problem,” Wang said in a press release.
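
To see why 10-minute charging is hard, consider the average power it implies; again, the pack size is an assumed value:

```python
# Charging 80% of a pack in 10 minutes implies a very high average power.
pack_kwh = 60                  # assumed pack size
energy_kwh = 0.8 * pack_kwh    # the 80% charged in the experiment
hours = 10 / 60                # 10 minutes

power_kw = energy_kwh / hours
print(f"average charging power: {power_kw:.0f} kW")   # 288 kW
```

Sustaining hundreds of kilowatts without overheating the cells is precisely the challenge such fast-charging setups have to address.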

The impact of LIBs in transportation also extends to aerospace applications, from drones to satellites. The Israeli firm Eviation is working on a prototype of a completely electric aircraft that will be able to carry nine passengers for up to approximately 1,000 km at an altitude of 3,000 m and a speed of 440 km/h — all powered by batteries.

LIBs will also prove essential in tackling climate change by supplying vehicles and households with renewable energy. Renewable generation depends on environmental factors: solar panels produce no power at night, nor do turbines in low wind. Researchers are now racing to find the most cost-effective way to store that energy and make it price-competitive with fossil-fuelled plants.

 

Credit: elxeneize/iStock

Already, batteries produced in new factories in China, the U.S., Thailand and elsewhere are driving prices down tremendously: they have plunged 85% since 2010. If this trend continues, the electricity grid of the future may well be largely supported by energy storage systems based on Li-ion batteries. LIBs could also accelerate energy decentralisation as more people pair storage systems with rooftop solar.
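
For a sense of what an 85% drop means year on year, assuming the figure spans 2010 to 2019 (the end year is an assumption):

```python
# An 85% price drop over ~9 years corresponds to a steady annual decline of:
years = 9                 # assumed span, 2010-2019
remaining = 1 - 0.85      # 15% of the 2010 price remains

annual_change = remaining ** (1 / years) - 1
print(f"{annual_change:.1%} per year")   # about -19% per year
```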

Keep in mind that where there is a need for technology, a demand for power follows. This also includes the world of miniature electrical devices. Substantial advances have been made in integrating LIBs into miniaturised medical devices like hearing aids or low-power implantable devices used for glucose sensing, neurostimulation, drug delivery, and more.

 

A Finite Resource

Li-ion batteries have tremendous potential to help transition the world towards a 100% renewable future.

However, such a transition needs to be carried out responsibly. Lithium is sometimes referred to as ‘white petroleum’, a nod to the fact that it is a finite resource with a major environmental impact. If lithium and other battery raw materials are mined using poor management practices, the result can be significant carbon emissions and lasting environmental damage. By the year 2025, lithium demand is expected to soar to 1.3 million metric tons of LCE (lithium carbonate equivalent) — that’s over three times today’s levels.
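
The implied growth rate is steep. A quick check, assuming ‘today’ means roughly 2020 and that demand exactly triples by 2025 (both assumptions):

```python
# If lithium demand triples over five years, the implied annual growth is:
years = 5
growth = 3 ** (1 / years) - 1
print(f"~{growth:.0%} per year")   # roughly 25% per year
```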

Towards this goal, it is important to minimise our dependence on cobalt, introduce battery collection and recycling schemes, exploit novel concepts such as second-life batteries that exhaust their full cycle life before reaching the recycling plant, shift lithium extraction away from hard rock towards brine, and promote market growth in order to take advantage of economies of scale.

Today, Li-ion batteries are already mainstream and mean big business. And in the future, all projections suggest the technology is heading only one way — up. For instance, MIT’s Yet-Ming Chiang claims there are three times as many scientists working in battery research in the US as there were just ten years ago. With all these researchers working on solving the biggest limitations faced by LIBs, innovation is bound to happen. Perhaps the best use of LIBs is still sitting in a lab somewhere, waiting to be discovered.

A Rechargeable World: 2019 Nobel Prize in Chemistry

Today, the Royal Swedish Academy of Sciences announced the 2019 Nobel Laureates in Chemistry. John B. Goodenough, M. Stanley Whittingham and Akira Yoshino received the Prize “for the development of lithium-ion batteries”.

John B. Goodenough, M. Stanley Whittingham and Akira Yoshino, 2019 Nobel Laureates in Chemistry, Copyright: Nobel Media. Illustration: Niklas Elmehed

From the popular scientific background of the Royal Swedish Academy of Sciences:

“An element rarely gets to play a central role in a drama, but the story of 2019’s Nobel Prize in Chemistry has a clear protagonist: lithium, an ancient element that was created during the first minutes of the Big Bang. […]

Lithium’s weakness – its reactivity – is also its strength. In the early 1970s, Stanley Whittingham used lithium’s enormous drive to release its outer electron when he developed the first functional lithium battery. In 1980, John Goodenough doubled the battery’s potential, creating the right conditions for a vastly more powerful and useful battery. In 1985, Akira Yoshino succeeded in eliminating pure lithium from the battery, instead basing it wholly on lithium ions, which are safer than pure lithium. This made the battery workable in practice. Lithium-ion batteries have brought the greatest benefit to humankind, as they have enabled the development of laptop computers, mobile phones, electric vehicles and the storage of energy generated by solar and wind power.”

Read more about the 2019 Nobel Prize in Chemistry here.

How to Weigh an Atom: Francis W. Aston’s Mass Spectrograph

Francis W. Aston with the first mass spectrograph, which was set up in the Cavendish Laboratory at the University of Cambridge, UK, in 1919.

Francis W. Aston was a man of many talents, from glass blowing to playing the piano, as well as being possibly the only surfing Nobel Laureate – he learned to surf in Honolulu in 1909. But it’s for his achievements as an experimental scientist extraordinaire that the physicist and chemist is better known.

This year, it’s 100 years since Aston built his first mass spectrograph, a device capable of measuring the relative masses of individual atoms and molecules. His spectrograph, together with the findings he made with it, was to win him the 1922 Nobel Prize in Chemistry and launch the field of mass spectrometry.

Born in 1877 in the midlands of England, Aston studied chemistry and physics at Mason College in Birmingham. In a time when advances in physics were coming thick and fast, the undergraduate was particularly thrilled by Röntgen’s discovery of X-rays using a Crookes tube in 1895.

It inspired him to investigate the electrical discharge of gases in the tubes after he graduated. His choice was to be fortuitous. Joseph J. Thomson, one of Britain’s leading physicists, shared Aston’s fascination, and in 1909, he invited Aston to work as his assistant at the University of Cambridge. Thomson had heard of the talented experimentalist through a mutual acquaintance.

Francis William Aston (1877-1945)

Thomson had already been awarded the Nobel Prize in Physics, in 1906, for his work on gas discharge tubes. He had correctly deduced that cathode rays were a stream of negatively charged sub-atomic particles – electrons. He then turned his attention to the simultaneously produced positive rays. This work was to pave the way for the new field of mass spectrometry.

Thomson used electric and magnetic fields to deflect the rays, recording the deflections on photographic plates placed in their path. The set-up produced traces in the shape of parabolas on the plates, as the particles comprising the rays were deflected through a range of angles due to a spread in their velocities.

More crucially, however, rays from different elements hit the plates at different locations. Their unique signatures were a consequence of their charge and mass – these fundamental properties determined how much the ions comprising the rays were deflected by the fields.
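
In modern notation, the parabolic traces can be reconstructed as follows (a standard textbook treatment, not Thomson’s own formulation). The electric field deflects an ion of charge q, mass m and velocity v by an amount proportional to qE/mv², and the magnetic field by an amount proportional to qB/mv. Eliminating the unknown velocity links the two deflections:

$$x \propto \frac{qE}{m v^{2}}, \qquad y \propto \frac{qB}{m v} \quad\Longrightarrow\quad y^{2} \propto \frac{q}{m}\,\frac{B^{2}}{E}\,x.$$

For a fixed charge-to-mass ratio, the spread of velocities traces out a parabola, and each value of q/m falls on its own curve — which is how different species could be told apart.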

Aston was to make several contributions to the set-up that improved its performance significantly, and, in 1912, the pair took advantage of them to investigate naturally occurring neon. Unexpectedly, the gas produced two parabolas instead of one. It was the first evidence of multiple isotopes in non-radioactive elements but, at that point, the concept of an isotope was still very new, and Thomson had significant doubts that this was what they had measured.

Aston set out to investigate the intriguing finding further, attempting, largely unsuccessfully, to separate the two species. His work was then interrupted by World War I, and he only returned to the lab in 1919. By then, the concept of the isotope had been widely accepted, strengthening the suspicion that both species were indeed isotopes of neon.

Consequently, Aston began designing a more powerful device to provide convincing evidence of the two isotopes – his prize-winning mass spectrograph. It was the first of three he was to develop, each an order of magnitude more accurate than the previous one.

His spectrograph still used electrostatic and magnetic fields, but Aston changed their orientation and applied them sequentially at different locations. The result was an electromagnetic ‘lens’ that focussed the rays generated by a given element onto a single point instead of a parabola. The more intense spots enabled superior measurements.

In the same year he returned to the lab, he began experiments using his new spectrograph, quickly confirming the existence of neon’s two isotopes with masses of 20 and 22 to an accuracy of one in a thousand. They were the first of 212 isotopes that he discovered in his career – he was to dominate the field.

Aston also demonstrated that isotope masses only occurred as (approximately) integer values: his whole number rule. The finding was an important contribution towards understanding the structure of the atom. It gave rise to an early model of the atomic nucleus that contained electrons and protons and whose mass varied according to the number of protons. At that time, the neutron had not yet been discovered.

But there was more: his meticulous measurements also demonstrated small but significant deviations from the whole number rule. They were due to the binding energy of the atomic nucleus, a concept fundamentally important in nuclear power and nuclear weapons. Aston went on to investigate more deeply with his second mass spectrograph. In his prize acceptance speech in Stockholm in 1922, he presciently recognised it to have profound implications – good and bad – for the human race.
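
Those deviations reflect the nuclear mass defect: a nucleus weighs slightly less than the sum of its constituent particles, the difference having been released as binding energy according to E = mc². A standard worked example using modern values, for helium-4:

$$\Delta m = 2m_{p} + 2m_{n} - m_{\mathrm{He}} \approx 0.0304\ \mathrm{u}, \qquad E_{B} = \Delta m\,c^{2} \approx 0.0304 \times 931.5\ \mathrm{MeV} \approx 28.3\ \mathrm{MeV}.$$

A discrepancy of 0.03 u in a mass of roughly 4 u is just under one percent — precisely the scale of deviation that Aston’s one-in-a-thousand accuracy could resolve.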

Whether he anticipated quite how widely mass spectrometry would be used today, across the sciences, in research, industry and beyond, is another question. After commercial mass spectrometers became available in the 1940s, the method became a staple technique for chemists identifying and characterising molecules. Since the 1980s, the Nobel Prize-awarded ionisation techniques of electrospray ionisation (ESI) and soft laser desorption (SLD) have also enabled the analysis of large biomolecules, making mass spectrometry an invaluable tool for biologists.

A diverse range of applications includes art conservation, drug testing, explosives screening at airports, environmental and climate monitoring, pharmaceutical development and palaeontology. In medicine, mass spectrometry is routinely used to screen newborn babies for metabolic disorders, while intelligent scalpels that help surgeons determine whether they have removed all of a tumour from a patient are also under development. The list goes on, making mass spectrometry another classic example of how, in basic science research, you really never know where it all might end up.

Cold Fusion, Polywater & N-Rays: Notable Scientific Blunders Throughout History

Dan Shechtman during his lecture at #LINO19. Photo/Credit: Christian Flemming/Lindau Nobel Laureate Meetings

For his seventh Lindau lecture, Israeli materials scientist Dan Shechtman decided to tackle a little-discussed phenomenon in research: scientific blunder. Instead of focusing on successful findings that have stood the test of time, he told the fascinating stories of three scientific discoveries that gained widespread popularity and attention before collapsing like a house of cards.

Shechtman was the sole recipient of the 2011 Nobel Prize in Chemistry for his controversial findings on quasicrystals, or matter which contains an ordered structure that does not repeat itself. His results took two years to get published in a peer-reviewed journal, and once they were released in 1984, other researchers in the field immediately began to doubt them. Double-Nobel Laureate Linus Pauling was particularly vocal about his scorn, even remarking at a conference, “There are no quasicrystals, just quasi-scientists.”

In the end, researchers around the world were able to replicate his results, and he was not guilty of committing a scientific blunder. Unfortunately, the same cannot be said for the scientists working on the discoveries described in Shechtman’s talk.

Young scientists listening to Shechtman’s lecture. Picture/Credit: Christian Flemming/Lindau Nobel Laureate Meetings

“Scientific blunder has to do with bad science, but I define it further… A respectful scientist is involved, because bad science is all around us, but when a prominent scientist makes a prominent mistake, that is something else,” he said. “Also, the media is involved — so the people in the streets hear about this new fantastic science until it dies.”

His first example was the 1903 discovery of N-rays, thought to be a novel form of radiation. Prosper-René Blondlot, a professor of physics at the University of Nancy in France, was studying the polarisation of X-rays when he saw changes in the brightness of an electric spark in a spark gap placed in an X-ray beam. He called the newly discovered emission “N-rays”, and it eventually became the subject of over 300 published scientific articles.

However, many prominent physicists were unable to reproduce Blondlot’s results. Finally, American physicist Robert W. Wood traveled to Blondlot’s laboratory in France to examine the experiments firsthand. Wood took out an essential prism from one of the experimental setups, and yet the laboratory assistant continued to observe N-rays. He even placed his hand in the way of the supposed emission, and the experiment still detected N-rays. Wood reported his findings in the journal Nature, and the belief in the existence of N-rays quickly died off.

Shechtman’s second example involves a new form of polymerised water that had a higher boiling point, a lower freezing point, and the viscosity of syrup. Polywater, as it was later dubbed, was discovered by Soviet physicist Nikolai Fedyakin in 1961 during experiments that looked at the properties of water sealed in quartz capillary tubes. Boris Derjaguin, a Soviet chemist, studied polywater in more detail after hearing about Fedyakin’s finding. A few years later, English and American scientists began to take notice and investigate polywater for themselves.

“During the early ’70s, about 100 papers [on polywater] appeared in the scientific literature every year,” Shechtman said. “Derjaguin claimed that polywater was a stable form of liquid water, and eventually all water would transform into polywater.”

The media latched onto the discovery as well, speculating about what would happen to humankind if the oceans and our sources of drinking water morphed into thick polywater soup. But some had their doubts, most notably American physicist Denis Rousseau, who demonstrated that polywater had the same properties as his own sweat. Later, Derjaguin himself analysed 25 samples of polywater for purity and confirmed that the “new form of water” was simply a mix of water and dissolved particles from the quartz capillary tube.

Cold fusion, the last example cited by Shechtman in his lecture, is different from the first two scientific blunders because it still has a small community of researchers who continue to work towards making the nuclear reaction a reality. Is it still a scientific blunder? Yes, said Shechtman, because most people still believe it cannot happen.


“It is a controversial subject because here there are believers and non-believers. But if it sounds too good to be true, it probably is,” he said.

In 1989, electrochemists Martin Fleischmann and Stanley Pons announced to the media that they had developed a process of achieving cold fusion. In an earlier experiment, Fleischmann had seen how the rare metal palladium could absorb unusually large amounts of hydrogen. He thought perhaps if palladium was loaded up with a huge amount of hydrogen, the atoms would squish together and lead to fusion. They tried this, and after several hours, claimed the setup produced heat and neutrons — in other words, evidence of cold fusion.

The announcement created a huge wave of excitement both within the scientific community and amidst the general public. Once groups around the world started to attempt their own cold fusion experiments without success, the initial enthusiasm was quickly replaced by disappointment and skepticism. A little over a month later, the New York Times declared cold fusion dead.

“All the Nobel Laureates that you find here, almost forty of them, produce results that are reproducible,” Shechtman said. “Everybody with the right equipment can repeat and receive exactly the same results. This is not the case with cold fusion.”

How Biomimicry Leads to New Discoveries

After returning home from a five-year voyage around the world on the HMS Beagle, English naturalist Charles Darwin began to formulate what would eventually become one of the best substantiated theories in the history of science. The theory of evolution, first published in his seminal 1859 book On the Origin of Species, states that populations change over the course of generations through the process of natural selection. Individuals best suited to their environment are more likely to survive and reproduce, while those not suited to their environment are less likely to do so.

What Darwin may not have predicted, however, was that humankind would later harness the power of evolution for its own benefit. In particular, researchers inspired by his groundbreaking biological theory learned to mimic the process of natural selection in the lab — a method called “directed evolution” — to develop customised proteins of interest. Today, these proteins can manufacture everything from biofuels to pharmaceuticals. In 2018, Frances H. Arnold, George P. Smith, and Sir Gregory P. Winter won the Nobel Prize in Chemistry for their revolutionary work on directed evolution.

Billions of years of evolution have brought an incredible degree of complexity to life on Earth. Not only are scientists and engineers trying to master evolution itself, but they also want to mimic the high-performance results of natural evolution — for instance, ultra-strong and lightweight spider silk, tail regeneration in lizards, and the excellent aerodynamics of birds. Disciplines such as bionics and biomimetics take inspiration from the biological methods and systems found in nature to tackle human problems, while also furthering our understanding of physics, engineering, and technology.

Termite Mounds, Electric Eels, and Gecko Ears

Bionics is the science of constructing artificial systems that have some of the characteristics of living systems. For instance, the idea behind Velcro came to Swiss engineer and amateur mountaineer George de Mestral after a hike in the woods. He noticed the burrs that stuck fast to his clothes with tiny hooks and wondered if he could mimic the design for a commercial application. Eight years later, de Mestral introduced Velcro to the world, the now-ubiquitous “zipper-less zipper” consisting of two complementary strips with tiny hooks and loops.

Biomimetics is a closely related field that has a more general focus, expanding the concept of emulating nature to less science and technology-driven areas such as product design, architecture, and art. An office building in Zimbabwe, for example, has an internal climate control system originally inspired by the structure of termite mounds. These insects build vertical mounds out of soil, saliva, and dung that maintain a constant internal temperature despite huge fluctuations in outside temperature. The Eastgate Centre in Harare has no conventional air conditioning or heating system, yet stays regulated year-round with a ventilation system modeled after the many heating and cooling vents of a termite mound.

Both fields have made significant strides towards our understanding of physics and engineering, while also providing innovative solutions. In 2017, a team led by biophysicist Michael Mayer of the University of Fribourg created a new type of soft, foldable battery inspired by the electric eel. The fish has 6,000 cells called electrocytes that store power like tiny batteries and can simultaneously discharge in order to deliver a strong electric shock.


Mayer and his colleagues mimicked the eel’s electrocytes by printing out rows of hydrogel dots on sheets of plastic, alternating a dot containing sodium chloride with one containing only water. They printed a second sheet of plastic with two more types of hydrogel dots, one of which would allow the passage of only cations and one that could only conduct anions. When these two sheets are pressed together, the system generates power. Such a power source could one day power implantable or wearable devices, like a pacemaker or biological sensor.
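
The underlying principle is the series addition of many small cell voltages, just as in the eel. A minimal sketch of that arithmetic; both numbers below are assumed, illustrative values rather than figures from the study:

```python
# Series stacking: many small per-unit voltages add up, as in the eel's electrocytes.
# Both values below are assumptions for illustration, not taken from the study.

v_unit = 0.15      # volts per repeating hydrogel unit (assumption)
n_units = 4000     # number of units connected in series (assumption)

total_v = v_unit * n_units
print(f"stack voltage: {total_v:.0f} V")   # 600 V - the order of an eel's shock
```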

Late last year, a group led by materials scientist Mark Brongersma of Stanford University developed a new photodetector inspired by the unique ears of geckos. Larger animals sense the direction of sounds from the intensity and timing differences between the waves arriving at their two ears. But small animals like the gecko — whose ear-to-ear spacing is shorter than audible sound wavelengths — can’t triangulate the location of noises in the same way. Instead, the lizards have a small tunnel through their heads that measures the way incoming sound waves bounce around in order to figure out directionality.

Brongersma and his colleagues created a similar system for detecting the angle of incoming light with a sub-wavelength photodetection pixel, typically only 1/100th the thickness of a hair. The system uses two closely spaced silicon nanowires and was the first to demonstrate the feasibility of detecting light directionality with such a small setup. Basically, when light hits the photodetector at an angle, the wire closest to the source interferes with the waves impinging on the other. The first wire to detect the light sends the strongest current, and after some calculations that involve comparing the current in both wires, the directionality of the light source can be mapped out.
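
The read-out idea — compare two currents, recover an angle — can be caricatured numerically. In the toy sketch below, the sinusoidal response and the coupling constant are assumptions chosen for illustration, not the device’s measured characteristics:

```python
import math

# Toy model: two nanowire photocurrents vary in a complementary way with the
# angle of incoming light. The sinusoidal response below is an assumption
# chosen for illustration, not the measured device characteristic.

def currents(angle_deg, coupling=0.5):
    a = math.radians(angle_deg)
    i1 = 1.0 + coupling * math.sin(a)   # wire nearer the light source
    i2 = 1.0 - coupling * math.sin(a)   # shadowed wire
    return i1, i2

def estimate_angle(i1, i2, coupling=0.5):
    # Invert the assumed response: current contrast -> angle.
    contrast = (i1 - i2) / (i1 + i2)
    return math.degrees(math.asin(contrast / coupling))

i1, i2 = currents(25.0)
print(f"recovered angle: {estimate_angle(i1, i2):.1f} deg")   # 25.0
```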

The Past and Future of Biomimicry

Other examples that showcase the sheer innovation of bionics and biomimetics abound, particularly when it comes to fields like physics, engineering, and technology. Mimicry of nature isn’t a new idea — in fact, Leonardo da Vinci studied the anatomy of birds to design a flying machine back in the 15th century — yet it shows no signs of slowing down in terms of contributions to new ideas and inventions. Even though humans have evolved to possess large, complex brains and advanced societies, bionics and biomimetics are proof that we still have so much to learn from the natural world around us.

The Periodic Table: An Appreciation

Today marks the 150th anniversary of Mendeleev’s periodic table, an outstanding achievement whose influence on how we understand the world around us and its constituent elements is hard to overstate.

Scientists love to classify things, be it organisms, proteins or physical and biological processes. Classification not only makes sense of the world and explains what we already know but also provides frameworks within which to add new findings to pre-existing knowledge. In the history of science, there is perhaps one classification system that stands out by virtue of its sheer brilliance and its influence on how we understand the physical world: the periodic table, which celebrates its 150th birthday this year and is the result of a lengthy game of solitaire by the Siberian-born chemist Dmitri Mendeleev.

The story goes that after years of attempting to classify the elements Mendeleev hit upon the strategy of creating cards on which he had written the name of a chemical element together with its atomic weight. For three days and three nights, he stayed up playing “chemical solitaire” with the cards. What was the system? How did they all fit together? Finally, on the fourth day he dozed off and had a dream. And just like that it came to him – a table incorporating all of the known elements in a manner that reflected their atomic weights as well as their chemical properties.

 

Early periodic table from 1871, © Wikimedia Commons

In popular discourse, the periodic table as we know it today and the principles underpinning it are generally regarded as being the work of only Mendeleev, but it is not quite as simple as that.

By the middle of the 19th century, over 50 elements had been discovered and isolated. However, as Nobel Laureate Linus C. Pauling emphasised many years later, there was as yet no overarching system for classifying the elements and representing their relationships with each other. The first attempts in this direction were smaller-scale groupings of elements based on their atomic weights, first in groups of three by J.W. Döbereiner and then into larger groups by other scientists. One obstacle to a truly systematic representation of the chemical elements was the lack of uniformity in how scientists were calculating atomic weights around the 1850s. Building on a unified system proposed by Stanislao Cannizzaro, one early periodic system was that of A.-E.-B. de Chancourtois, whose ‘telluric screw’ successfully grouped elements with similar chemical properties into straight lines. Another who made great strides towards cracking the chemical code of the elements was the British chemist John Newlands, who arranged the elements by increasing atomic weight. Indeed, Newlands came close to the crux of the issue by noting that there was a periodicity underlying the properties of elements.

The German chemist Julius Lothar Meyer came tantalisingly close to constructing ‘the’ periodic table. His table also listed the elements in order of atomic weight, and he too noted that there was a periodicity to the characteristics of the elements. In fact, his table was strikingly similar to Mendeleev’s, and had Meyer not published his periodic table one year after Mendeleev, his name might be much better known today.

That honour instead goes to Mendeleev who hailed from Siberia and came from a large family with 16 siblings. After completing his university training in Saint Petersburg, he became an academic, and, as the story goes, wrote a chemistry textbook as he could find none that was suitable for his purposes. It was around this time that he too became obsessed with classifying the chemical elements. 

 

A commemorative stamp from the U.S.S.R. to mark the 100th anniversary of Mendeleev’s periodic table. © iStock/Veronika Roosimaa

Mendeleev’s table would surely not have achieved such renown if it were only a classification of the elements that were already known. Its power lies also in its predictions. Like Meyer, Mendeleev deliberately left space for additional, not yet discovered elements. Unlike Meyer, however, he provided great detail about the properties of those elements, which, he suggested, could be predicted based on the principles of his table. Indeed, when these elements were discovered in later years, his predictions proved remarkably prescient. Even more strikingly, a whole class of elements, the noble gases, was not even known when Mendeleev formulated the periodic table. Upon their discovery, a breakthrough for which William Ramsay was awarded the Nobel Prize in Chemistry in 1904, Ramsay realised that there was space for them in Mendeleev’s nascent periodic table. Mendeleev agreed and added them to a later version. The power of the table was also evidenced by the fact that Mendeleev himself realised that the properties of several elements could be better described by moving them away from the positions indicated by their apparent atomic weights.

The full periodic table as we know it today was only finalised with additional important contributions from other scientists, most notably from H.G.J. Moseley, whose method for precisely measuring atomic number helped resolve remaining uncertainty about the position of certain elements in the table.

All of this surely warranted a Nobel Prize for Mendeleev? Amazingly, no — but he did come very close, having been nominated for both the 1905 and 1906 prizes in chemistry. He lost out on both occasions, perhaps because his discovery was by that time already regarded as too old.

Nobel Prize in Chemistry 2018 – The Evolutionary Tale of Enzymes and Antibodies

This blog post is part of a series of articles on the scientific research that led to this year’s Nobel Prizes. The official Nobel Prize Award Ceremony will take place in Stockholm on 10 December 2018.

Evolution and natural selection – those are the driving forces that turned simple single-celled organisms into the complex beings, made up of trillions of cells, that we are today. But how could anyone ever ‘harvest’ the ‘power of evolution’? This year’s Nobel Laureates in Chemistry did just that.

The award went to three highly distinguished scientists – half of it was shared by George P. Smith, Professor Emeritus at the University of Missouri-Columbia, and Sir Gregory P. Winter, Professor at the University of Cambridge, UK. The other half was awarded to Frances H. Arnold, biochemist at the California Institute of Technology (Caltech). According to the Nobel Foundation, Frances Arnold received the prize “for the directed evolution of enzymes”. George P. Smith and Sir Gregory Winter were lauded “for the phage display of peptides and antibodies”.

It was in the early 1990s at Caltech that Frances Arnold, born in 1956 in Pittsburgh, first developed the method she has now been awarded the Nobel Prize for: ‘directed evolution’, which is used to generate proteins that are not found in nature. Proteins are the building blocks of life, and enzymes, the catalysts involved in pretty much every biochemical reaction, are made up of proteins. Thus, the function of an enzyme, and therefore of a biochemical reaction, depends on the proteins involved. The structure and function of proteins, in turn, depend on the underlying DNA sequence.

In order to generate a specific protein or enzyme function, Arnold induces DNA mutations within the DNA sequence of a protein whose function is related to her target product. The altered sequence is then introduced back into a host organism or cell culture, where it is translated into a new protein with an altered function. These altered functions are then tested, and the entire process is repeated over and over again until a protein, and thereby an enzyme, with the desired characteristics and functions is generated.

What makes her method stand out is that she is neither trying to design a new enzyme DNA sequence or protein structure from scratch, nor waiting for random mutations to introduce alterations that might or might not be useful (as is the case in natural evolution). Rather, by specifically inducing mutations at areas of interest, she harvests the random power of evolution for her own ends. Furthermore, by using rapidly proliferating bacteria as hosts for her newly synthesised DNA sequences, she is able to quickly assess whether the proteins do in fact exhibit the desired functions – without needing to fully understand the complex interdependence of protein structure and function. Since its introduction in 1993, the method has been refined several times and is now the gold standard for developing new catalysts.
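
The mutate-screen-repeat loop at the heart of directed evolution is easy to caricature in code. The following Python sketch is a toy model — a random string standing in for a protein and a made-up fitness function standing in for the laboratory screen — not Arnold’s actual protocol:

```python
import random

random.seed(42)
ALPHABET = "ACDEFGHIKLMNPQRSTVWY"   # the 20 amino-acid one-letter codes
TARGET = "MKTAYIAKQR"               # invented 'ideal' sequence for the toy screen

def fitness(seq):
    # Toy screen: fraction of positions matching the invented target.
    return sum(a == b for a, b in zip(seq, TARGET)) / len(TARGET)

def mutate(seq, rate=0.1):
    # Random point mutations, loosely mimicking error-prone PCR.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)     # random start
for generation in range(200):
    library = [parent] + [mutate(parent) for _ in range(50)]  # mutant library
    parent = max(library, key=fitness)                        # screen, keep the best
    if fitness(parent) == 1.0:
        break
print(f"after {generation + 1} rounds: {parent} (fitness {fitness(parent):.2f})")
```

Each round mirrors one cycle of mutagenesis, expression and screening; keeping the parent in the library ensures fitness never decreases, just as a real screen would retain the current best variant.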


Another method to produce large amounts of peptides, proteins or antibodies is the so-called ‘phage display’, first developed by George Smith, born in 1941 in Connecticut. The coat proteins of bacteriophages (or ‘phages’ for short), which are viruses that infect bacteria, can be altered to ‘display’ specific proteins. In 1985, Smith first introduced a protein sequence into a phage, and the respective protein was subsequently displayed on its surface. Such displayed proteins serve as binding sites for peptides or other proteins, and when peptides or proteins bind to them, their binding dynamics and protein-protein interactions can be studied. Moreover, one phage can display several different binding sites. Thus, large libraries of proteins can be screened and selected for their functions and interactions, a process somewhat analogous to natural selection, in which the ‘fittest’ versions with the most appropriate functions survive or are ‘selected’.
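
That selection step, known as ‘biopanning’, enriches strong binders round after round. Here is a toy numerical sketch; the affinity values and the selection rule are invented for illustration and do not model any particular experiment:

```python
import random

random.seed(1)

# Toy phage library: each phage displays a peptide with an invented binding
# affinity between 0 (no binding) and 1 (strong binder).
library = [random.random() for _ in range(100_000)]

def pan(phages):
    # One round of biopanning: each phage binds the immobilised target with
    # probability equal to its affinity; unbound phages are washed away, and
    # the bound ones are amplified back up to the original pool size.
    bound = [a for a in phages if random.random() < a]
    return random.choices(bound, k=len(phages))  # amplification step

for round_no in range(1, 4):
    library = pan(library)
    print(f"round {round_no}: mean affinity {sum(library) / len(library):.2f}")
# mean affinity climbs from ~0.5 towards 1 as binders are enriched
```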


Finally, if the DNA sequence of a protein is known, the corresponding protein can be displayed on the surface of a phage. If a peptide then binds to this surface protein, that specific binder can be isolated and its genetic code sequenced; in this way a new antibody can be identified and subsequently produced in the lab.

Last but not least, Gregory P. Winter, born in 1951 in Leicester, UK, developed this intricate method of phage display even further to produce new pharmaceuticals. Thanks to his improvements, many antibodies intended for human use no longer needed to be synthesised in mice or other host species, thereby minimising cross-reactions and allergic reactions. Winter was so successful with his new antibody development tool that he founded ‘Cambridge Antibody Technology’ in 1989 – one of the first biotech companies involved in antibody engineering. The first pharmaceutical antibody developed with this method was approved in 2002. It is an antibody against TNF alpha, a component of the inflammatory response, and is used to treat rheumatoid arthritis, psoriasis and inflammatory bowel diseases. It was the world’s top-selling pharmaceutical in 2017, with sales of over USD 18bn.

However, phage display as used by Winter is not only useful for broad diseases that affect all patients in a similar way. It has also become a very promising approach to treating (metastatic) cancers: phage display is used to create and select synthetic antibodies that target the surface proteins of a tumour. These are then made into synthetic receptors for T-cells collected from the patient, another essential part of the immune system. The patient’s own immune system is thus able to recognise and fight the cancer cells, an approach particularly useful for treating metastatic cancers.

Additional information: A new series of Mini Lectures on DNA and its replication and modification can be watched in our mediatheque.

Thomas A. Steitz 1940–2018

Thomas Steitz with young scientists at the Lindau Nobel Laureate Meeting 2018. Photo/Credit: Christian Flemming/Lindau Nobel Laureate Meetings

The Council and Foundation Lindau Nobel Laureate Meetings deeply mourns the loss of laureate Thomas Steitz, who sadly passed away on 9 October 2018 at the age of 78. He received the 2009 Nobel Prize in Chemistry for his studies on ribosomes.

Steitz completed his Ph.D. in biochemistry and molecular biology at Harvard University in 1966. After research stays in Europe, he moved back to the US. He was a Sterling Professor of Molecular Biophysics and Biochemistry and Professor of Chemistry at Yale University.

Thomas Steitz participated in four Lindau Nobel Laureate Meetings, most recently in 2018. The Council and Foundation extend their deep sympathies to Thomas Steitz’ family.

2018 Nobel Prize in Chemistry

2018 Nobel Laureates Frances H. Arnold, George P. Smith and Sir Gregory P. Winter. Illustration: Niklas Elmehed. Copyright: Nobel Media AB 2018.

On Wednesday, 3 October 2018, the Royal Swedish Academy of Sciences awarded the Nobel Prize in Chemistry 2018 to Frances H. Arnold “for the directed evolution of enzymes” and to George P. Smith and Sir Gregory P. Winter “for the phage display of peptides and antibodies”.

Find out more about the 2018 Nobel Prize in Chemistry here.

The Power of Evolution: Nobel Prize in Chemistry 2018

Today, the Royal Swedish Academy of Sciences announced the 2018 Nobel Laureates in Chemistry. Frances H. Arnold (USA) received one half of the prize “for the directed evolution of enzymes”; the other half of the prize was awarded to George P. Smith (USA) and Sir Gregory P. Winter (UK) “for the phage display of peptides and antibodies”.

2018 Nobel Laureates Frances H. Arnold, George P. Smith and Sir Gregory P. Winter. Illustration: Niklas Elmehed. Copyright: Nobel Media AB 2018.

From the press release of the Royal Swedish Academy of Sciences:

“The power of evolution is revealed through the diversity of life. The 2018 Nobel Laureates in Chemistry have taken control of evolution and used it for purposes that bring the greatest benefit to humankind. Enzymes produced through directed evolution are used to manufacture everything from biofuels to pharmaceuticals. Antibodies evolved using a method called phage display can combat autoimmune diseases and in some cases cure metastatic cancer.”

Read more about the 2018 Nobel Prize in Chemistry here.