Transcript of How to Science, episode 9, with Professor Michael Imperiale.

 

Michael Imperiale: My name is Mike Imperiale. I am a professor of Microbiology and Immunology at the University of Michigan.

The story I would like to tell you today concerns a controversy over research with dangerous infectious agents.

Lapses in biosafety and biosecurity in various laboratories came to light between 2012 and 2014. For example, a vial of variola virus—which is the causative agent of smallpox—was found in a closet at the National Institutes of Health. That vial of smallpox should not have been there: After smallpox was declared eradicated, all of the existing stocks of smallpox were supposed to have been destroyed, except at two places—the CDC in Atlanta and a lab in Russia—which were permitted to keep stocks of the virus. So that was really a biosecurity breach.

The biosafety lapses were more just carelessness, where people weren’t following proper protocol. The CDC accidentally shipped some H5N1 virus to a laboratory when they were supposed to be shipping H1N1, which is much less pathogenic. Things like this raise people’s concerns: if a lot of laboratories are making viruses more pathogenic, and one of those viruses were to accidentally escape the lab, that could lead to a very dangerous pandemic.

I became interested in this when I first learned about a concept called the “dual-use dilemma.” “Dual-use” was originally a military term, used to describe technologies that had both military and civilian uses. In the life sciences it has taken on a much broader definition: work that has beneficial applications, but whose results may also be misused.

The dilemma, then, is: Should we be putting restrictions on that beneficial work because we’re concerned about potential misuse? There is always the potential that someone may take that same technology and those same pieces of information and try to use them to cause harm.

I was appointed by the government to a national advisory board called the National Science Advisory Board for Biosecurity, which advises the government on dual-use issues.

Things really came to a head in the fall of 2011. What happened then involved a highly pathogenic avian influenza strain called H5N1.

This strain was starting to emerge in poultry in China and Southeast Asia. What was frightening was that when humans were getting infected with this strain of the virus, the mortality rate was very high—it was about 60 percent. To put that in perspective, the 1918 influenza pandemic was the worst that we know about, and the mortality rate was about 2 percent. When we have seasonal influenza outbreaks every year, the mortality rate of those is probably about 0.1 percent.

The way that influenza viruses spread is through the respiratory route. If I’m infected and I sneeze, that puts particles out into the air that the next person can inhale and then become infected with the virus. What was happening with H5N1 was that humans were coming into direct contact with infected birds, and that’s how they were getting infected. But there was no evidence at that time that those humans were spreading it to other humans.

So the question was: Can this virus start to spread from human to human? If it did, one can imagine that it could cause a pretty dangerous pandemic.

Some researchers decided to ask the question experimentally: Can we make H5N1 influenza transmissible in mammals?

Obviously, you can’t do the experiment in humans. So the experiment was done in ferrets, which is the accepted experimental animal that most closely mimics humans with respect to influenza. The experimental setup is to infect a ferret in one cage, take a naive ferret in another cage, and put those cages close enough to each other that if the first ferret sneezes, those particles can make their way into the second cage, where the naive ferret can inhale them and become infected.

Two papers reporting the transmissibility of the H5N1 virus were first submitted in 2011. The U.S. government learned about this, and they asked the advisory board to weigh in as to whether those papers should be published. The concern here was: Is this a biosecurity risk? Might someone misuse this information?

We decided that this information was dangerous enough that the overall results should be published, but that the exact mutations that allowed these viruses to be transmissible should be withheld.

When we made that recommendation, there was a lot of pushback. There was pushback from the authors; there was pushback from the WHO (the World Health Organization), which felt that this information was important for pandemic preparedness; and there was pushback from the journals themselves, which were uncomfortable redacting information. The publishers felt that it was really their job to present the entire story to their readership, and I think they felt they were being put in an awkward position by the government asking them not to publish this. We arrived at the redaction decision as a bit of a compromise: there was enough concern that someone might take the information and run with it in a bad way that we wanted to put a limit on what information got out there.

So the authors were asked to revise the papers, and we voted to recommend publication of both papers in their revised forms, along with comments as to why it was important to be publishing this information.

A colleague of mine at Johns Hopkins University, Arturo Casadevall, and I wrote a series of articles in which we tried to make the point that the whole concept of dual-use research needed to be reconsidered, because it’s too fuzzy a concept and because waiting until the work reaches the publication stage is really too late in the game. These types of experiments are going to become more common, and hopefully we will have a better process in place to deal with them.

 

Monica Dus: This reminds me a lot of discussions people were having about nuclear power, because it’s cleaner energy compared to fossil fuels in general, but there are also clear risks that are potentially catastrophic. Can you draw some parallels between the two, since we’re still battling this dual-use dilemma and this tension between knowledge and potential risk? Is there anything we can learn from discussions people have had about nuclear power and how to do experiments safely?

MI: I think the difference with respect to the life sciences is twofold. Number one is that the resources necessary to develop a nuclear program, if you want to deliberately cause harm, are very, very large. It’s expensive and requires very, very high levels of expertise.

In the life sciences, the resources necessary to make an influenza virus are not that high. The expertise is not at that same level.

You can imagine someone, some group, or some nation-state that wants to cause harm reconstructing a virus in the laboratory. The technology for producing an influenza virus is pretty straightforward. It doesn’t require highly sophisticated laboratory skills—it does require some expertise, but it’s not rocket science, I guess you could say. It’s also easier to hide: someone could do this in their garage or in their basement. You can’t develop a nuclear weapon in your garage or your basement, so I think that’s one difference.

The other difference, I think, is that humans have more of an inherent or innate fear of infectious agents.

MD: In light of what you just said, how do we do these experiments—this “dangerous science”—in a safe way?

MI: I think if it is, for example, an infectious agent, we need to have an infrastructure in place that will, number one, prevent it from accidentally getting out. And number two, if we really think it’s a biosecurity risk, provide some sort of physical security, such as locks and those sorts of things.

We also have to make sure that the individuals who are doing the work are properly trained, because that may be the weakest link—human error.

The other thing I think we have to remember is that oftentimes when we’re doing experiments in the laboratory, we get unexpected results, and we may unexpectedly isolate something that has a gain of function. So I think we have to be aware of that possibility and know what we’re gonna do when that happens. We need to stop and think, "Are we gonna continue these experiments? How are we gonna publish the results? Who do we need to talk to, to get an informed opinion about the way to move forward?"

MD: In terms of the board, there were two members of the public representing all of us as the general public. What was their take on how much transparency there should be in public research?

MI: I should talk a little about the composition of the biosecurity board. There were scientists; individuals with backgrounds in science policy; lawyers; and two members of the public who were not working on these things day in and day out, who brought a different perspective. So the expertise on the board was quite diverse.

I think we all realized we had a common goal here, which was to do what’s best.

Scientific research needs to be transparent because, number one, that allows other laboratories to verify and reproduce the work that was done. And number two, that work will act as a foundation for further research.

In this particular case, the argument for publishing the work rested on a few things. Number one was that, as I mentioned earlier, it was just important to know: Is it possible that this virus could become transmissible? Some pathogens are highly contagious; measles is probably the best example of that. Other pathogens are not very contagious. In the case of the avian H5N1, the strains that are circulating in birds cannot be transmitted from one human to another without heroic efforts.

The second thing is: What can we learn about the determinants of transmissibility? That can maybe allow us to do better surveillance when these viruses are infecting poultry or other birds. It also allows us to ask questions about whether and how best to approach developing vaccines against this particular strain.

The public is paying for this research, and the public is the beneficiary of the research, so I agree with that. We have an obligation to the public to ensure that we are doing what’s right as well as what’s best.

MD: I guess what I'm hearing from you is that, in many ways, we haven’t really resolved this dual-use dilemma.

MI: I think, in many ways, we have not. I think biotechnology is rapidly advancing. It’s hard to keep up policy-wise and oversight-wise with the new advances. The board continues to meet; I know it’s still discussing these types of issues, and hopefully it’s going to continue to provide recommendations to the U.S. government on how to proceed here. Having said that, it’s an advisory board, and the government could have decided to ignore our advice. Fortunately, they did not.