Michael Imperiale, professor of microbiology and immunology at the University of Michigan. Photo: University of Michigan Medical School

Significant advances in life sciences research, including emerging techniques such as CRISPR, have been making waves in both the scientific community and the general public in recent years.

But these new tools and developments, coupled with the rapid pace of knowledge-sharing in the 21st century, open the door for terrorists or rogue nation states to exploit scientific findings meant to be beneficial and misuse them to create bioweapons. Research with this potential is known as dual use research of concern (DURC).

As an example, Science reported in July 2017 that a team of Canadian researchers had synthesized horsepox virus, a relative of smallpox, from genetic pieces ordered through the mail. The experiment cost only about $100,000. Smallpox took decades and billions of dollars to eradicate, and while horsepox itself is not known to be harmful to humans, the same synthesis approach could potentially be used to recreate smallpox.

To prevent such catastrophic events, the U.S. does have some policies and restrictions in place. Currently, 15 pathogens and toxins and seven types of experiments fall under the DURC policy, which means the U.S. government monitors and restricts research involving them because it is at high risk of being misused for harmful purposes.

The National Academies of Sciences, Engineering, and Medicine was recently called upon to review current U.S. policies and practices governing the conduct and distribution of DURC in the life sciences.

The report’s main conclusion? There are multiple shortcomings and “fragmented” regulations surrounding DURC.

“Optimizing policies that encourage scientific openness while in appropriate cases limiting the dissemination of research results that might be misused is a difficult challenge,” said Harold Varmus, professor at Weill Cornell Medicine, and co-chair of the committee that wrote the report. “We hope that our report will inform future discussions and policies on managing dual use research.”

To complete the report, the committee first gathered information during a meeting held July 11-12, 2016, and again at a public workshop on Jan. 4, 2017. They commissioned papers on a variety of topics, including biosafety and biosecurity, internal approaches to biosecurity, ethics, and export control. They also listened to expert presentations and engaged in both public and private deliberations.

Raising awareness
The final report, titled “Dual Use Research of Concern in the Life Sciences: Current Issues and Controversies,” identified multiple policy shortcomings, particularly with regard to synthetic biology and emerging techniques like CRISPR, which allow biologists to edit genes or create entirely new life forms from scratch.

Restrictions on DURC in the life sciences are enforced only at institutions conducting research with federal funding, so studies done by private companies or “do-it-yourself” communities do not have to abide by the restrictions, even if the research meets the definition of DURC.

Another major concern among the committee is the apparent lack of awareness of DURC issues among life scientists. Individuals training to become life scientists are rarely introduced to the topic in a systematic way. Training programs across all education levels don’t typically include courses, or even discussions, about dual use research of concern unless the trainees work with a select agent. And even in those instances, the focus is on biosafety rather than security issues, the report states.

“In a number of cases, the scope of these programs includes biosecurity and enables these particular communities to develop sophisticated views about these issues. Expanding these programs beyond a focus solely on specific pathogens could increase the ability of the broader research community to take greater responsibility for safeguarding dangerous information in ways that do not impede scientific advances,” the committee suggested.

Michael Imperiale, professor of microbiology and immunology, and associate vice president for research at the University of Michigan, echoed this statement.

“One concern I have is that the current U.S. policy is focused on the list of agents and types of experiments, and I think there may be experiments outside this area of coverage that also present concern,” Imperiale said.

Imperiale is a trained virologist and has been involved in biosecurity policy for more than 10 years.

“Ideally, one would like to avoid regulations and get all the stakeholders in agreement that additional thought needs to be given to experiments [that] fall into this realm, from conception to publication. We cannot wait until the work is ready to be published because we don’t have a good mechanism to deal with it at that late stage,” Imperiale told Laboratory Equipment.

Sam Weiss Evans, currently a visiting research fellow with the Program for Science, Technology and Society at the Harvard Kennedy School, told the committee that the strongest change will come from promoting, among the next generation of academic leaders in emerging technologies, the view that science and security are not mutually exclusive, and then from supporting efforts to achieve institutional change in the training of students.

Evans cited a few programs as successful examples, one being the Synthetic Biology Leadership Excellence Accelerator Program (Synbio LEAP), which brings a network of people from academia, industry and government into broad discussions about responsible innovation and stewardship within synthetic biology.

The International Genetically Engineered Machine (iGEM) competition, another example, involves more than 6,000 students from 40 countries. Its Human Practices Committee works closely with the FBI and other organizations to design methods that both make students aware of security concerns in their work and structure the type of work they are allowed to do so that it avoids the most likely security-sensitive areas, such as through a policy on the development of gene drives.

Other researchers whose papers were commissioned for the report noted that many universities have a code of conduct ensuring that all research staff working with DURC-listed agents are committed to the ethical and responsible conduct of science. Expanding these types of codes could strengthen a “culture of awareness” among researchers.

Key conclusions
The report concluded that there is a lack of national consensus, framework and commitment for assessing the risks and benefits of DURC experiments. The rise of open-access journals and preprints further demonstrates the need for more discussion of how to handle and share the results of DURC experiments.

The committee “hopes these findings provide a baseline for the development of principles that will, in turn, lay the framework for government policy for managing the conduct of DURC research and the dissemination of information about its results by federal agencies, the research community and the international scientific community.”
