Published 22 March 2018 by Melissae Fellet
Regulating Synthetic Biology When Its Risks Are Unknown
Synthetic biology uses tools from genetic engineering to design, create, and assemble living organisms with a particular function. Picture/Credit: artoleshko/iStock.com
In the fast-moving world of synthetic biology, discoveries are closely tied to their social implications. Synthetic biologists use tools from genetic engineering to design, create and assemble — sometimes from scratch — living organisms with a particular function. Commercial kits let high school students create bacteria that smell like bananas, a company uses engineered yeast to produce an anti-malaria drug on a large scale, and researchers have created a microbe that alters bee behaviour.
These applications range from the playful to the practical, but the tools of synthetic biology might eventually be used in other ways. The potential environmental, public health and national security risks of an uncontrolled release of a synthetic organism are unknown, making it difficult to imagine ways to regulate the products of synthetic biology.
But that doesn’t stop researchers and social scientists from trying: About 120 people attended a session about the relationship between technology development and risk governance in synthetic biology on the morning of the third day of the American Association for the Advancement of Science annual meeting in mid-February. Political debates surrounding genetically modified organisms provide clues to potential sticking points in synthetic biology regulation, and scientists can contribute to policy discussions.
Risks unknown
During the session, Andrew Ellington, a biochemist at the University of Texas, Austin, described one of his projects that could have future national security implications. Ellington and his colleagues wanted to engineer a microbe to influence bee behaviour. If successful, the researchers imagined feeding Africanized bees customised probiotics to make them less aggressive or giving probiotics to pollinating bees that encourage foraging at pesticide-free plants to prevent colony collapse disorder.
Ellington and his team engineered microbes to produce L-dopa, a precursor of dopamine, a chemical signal in the brain that affects learning. Then they fed the engineered microbe to bees and gave the bees an electric shock at the same time they released a specific smell. Bees carrying the engineered microbe in their guts learned to anticipate the shock, extending their stingers at the smell slightly faster than normal bees. They also retained that learned association longer than normal bees.
Ellington said his team wants to test these dopamine-producing microbes in mice — and eventually humans — as a potential treatment for Parkinson’s disease. But testing behaviour-altering microbes in humans raises large social questions: Could these engineered organisms be a potential security threat? What are the health and environmental risks if those organisms become uncontrolled?
Existing regulatory pathways analyse risks from nanomaterials, genetically engineered crops and chemical weapons by considering the health and environmental impacts should these materials spread through the air, water or soil. These systems are effective when the hazards are well understood, the risks of exposure are known and the materials can be controlled.
But traditional risk assessments don’t apply to synthetic biology, says Igor Linkov, who leads the Risk and Decision Science Team at the US Army Corps of Engineers. Synthetically engineered organisms are often designed to pass their genetic changes to their offspring. This means their uncontrolled release could spread genetic information that impacts other species.
Lessons from GMOs
Clues to important components of effective regulatory systems for synthetic biology can be found in the history of genetically modified organisms. In 2014, Jennifer Kuzma, an expert in governance of emerging technologies at North Carolina State University, tracked the evolution of policies involving GMO insects and plants and found that the pacing of the policy process did not allow time for public participation. Environmental and consumer groups responded by pressuring policy makers, forcing them to take actions that advanced the regulatory process into new phases.
But sometimes, hurdles in a regulatory process keep new products from reaching consumers. Thomas Bostick, Chief Operating Officer at Intrexon, shared the company’s experience trying to bring a genetically modified salmon to market in the US. The AquAdvantage salmon contains an added growth-hormone gene, so it grows to market weight in half the time of conventional salmon while eating 75% less feed and staying healthy without vaccines or antibiotics. This genetically engineered salmon could also be produced in inland facilities, eliminating the disease and parasites that spread into surrounding marine ecosystems from the open cages of typical salmon farms.
The larger of these genetically engineered salmon expresses a growth hormone that allows it to reach market size in half the time of its sister. Picture/credit: AquaBounty Technologies
But much of the salmon’s 20-year development has been held up in regulatory battles, Bostick says. The US Food and Drug Administration (FDA) has approved the AquAdvantage salmon as safe to eat, but it can’t yet be sold in stores. In 2016 and 2017, a bill specifically requiring that the AquAdvantage salmon be labelled as a genetically engineered organism caused the FDA to block imports of any genetically engineered salmon. Now the FDA is waiting for the US Department of Agriculture to decide on labelling requirements before it can allow the salmon to be sold in stores. The regulatory system needs flexibility to accept innovation, Bostick says.
Participatory and anticipatory governance
Kuzma offered several ways to improve risk assessments for synthetic biology applications. First, the policy process needs to operate from a middle ground that respects the knowledge and process of science while also acknowledging the value-based concerns of citizens. The system also needs a combination of participation and anticipation.
There are several models for rulemaking processes that could be useful for synthetic biology. Kuzma described one that she developed with Christopher Cummings of Nanyang Technological University in Singapore, which could help policymakers evaluate synthetic biology risks. The researchers interviewed 48 synthetic biology experts about case studies of four research projects: biomining using highly engineered microbes in situ, “cyberplasm” for environmental detection, de-extinction of the passenger pigeon, and engineered plant microbes to fix nitrogen on non-legumes.
For each case study, the interviewees scored how much information was available in each of eight categories: (1) human health risks, (2) environmental health risks, (3) unmanageability, (4) irreversibility, (5) the likelihood that a technology will enter the marketplace, (6) lack of human health benefits, (7) lack of environmental benefits and (8) anticipated level of public concern.
Then the researchers plotted the average score for each category in a given case study on an octagonal chart to show the relationships among all the categories. A deficit of information about environmental or human health risks might suggest more research is needed in that area, while low health risks combined with high public concern might warrant an outreach campaign. The same tool could also be used early in regulation development to gauge the perspectives of concerned citizens.
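The averaging step of such a model is simple to sketch. The snippet below is a hypothetical illustration, not Kuzma and Cummings’ actual tool: it assumes a 0–10 scale and made-up expert scores, averages each of the eight categories across experts, and produces the per-category profile that an octagonal (radar) chart would display.

```python
# Hypothetical sketch of the scoring step in a model like Kuzma and
# Cummings': each expert rates how much information is available
# (assumed 0-10 scale) in eight categories; per-category averages form
# one case study's profile for an octagonal (radar) chart.
from statistics import mean

CATEGORIES = [
    "human health risks",
    "environmental health risks",
    "unmanageability",
    "irreversibility",
    "market entry likelihood",
    "lack of human health benefits",
    "lack of environmental benefits",
    "public concern",
]

def category_profile(expert_scores):
    """Average scores category by category across all experts.

    expert_scores: list of dicts mapping category name -> score.
    Returns a dict mapping category name -> mean score.
    """
    return {c: mean(s[c] for s in expert_scores) for c in CATEGORIES}

# Illustrative, made-up scores from three experts for one case study:
scores = [
    dict(zip(CATEGORIES, [3, 4, 6, 7, 5, 2, 3, 8])),
    dict(zip(CATEGORIES, [4, 5, 5, 6, 6, 3, 2, 7])),
    dict(zip(CATEGORIES, [2, 4, 7, 7, 4, 2, 4, 9])),
]

profile = category_profile(scores)
print(profile["public concern"])
```

In this toy profile, high public concern paired with a low score for human health risks would, following the model’s logic, point toward an outreach campaign rather than more toxicology research.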
Questions about how to regulate synthetic biology are not just for social scientists to wrestle with. Communicating the policy implications of research, along with their associated uncertainties, is part of responsible conduct for researchers worldwide, says the IAP, a global network of science academies, in a 2012 report. “There’s a role for the scientific community to be involved with discussions about implications, concerns, and governance from the conception of research to funding to execution,” says Katherine Bowman of the National Academies of Sciences, Engineering, and Medicine.