Transcript
Cold open [00:00:00]
Alison Young: One of the things that’s important to remember is that microbiology is a relatively new science. It’s a young science compared to chemistry and radiological sciences. Those scientists seem to be much more open to the scrutiny of their practices than those working in microbiology labs — who, for much of the history of microbiology, because there were not ways to keep them safe, were often catching their experiments. Some of these scientists took great pride in how many times they had become infected, because they were doing this for the greater good.
Luisa Rodriguez: I remember finding it striking in the book, reading about these cases where scientists, working before a bunch of better safety practices, would basically brag, as you said — like, “I’ve gotten TB four times already” — and it was almost a battle scar that they wore with pride.
Alison Young: Also there’s just not a culture of tracking these kinds of infections. There never has been a culture of that. To this day, there are no universal tracking systems for these kinds of illnesses or accidents in labs.
Luisa’s intro [00:01:13]
Luisa Rodriguez: Hi listeners, this is Luisa Rodriguez, one of the hosts of The 80,000 Hours Podcast.
In today’s episode, Alison Young walks me through some of the highest-profile lab leaks we know about.
We cover:
- The most egregious biosafety mistakes made by the CDC, and how Alison uncovered them through her investigative reporting.
- The Dugway Life Sciences Test Facility case, where live anthrax was accidentally sent to labs across the US and several other countries over a period of many years.
- The time the Soviets had a major anthrax leak, and then hid it for over a decade.
- The 1977 influenza pandemic, which may have been caused by a vaccine trial gone wrong in China.
- And the last death from smallpox, caused not by the virus spreading in the wild, but by a lab leak in the UK.
I was completely blown away by these biosecurity mishaps and lab leaks. I’d never heard of most of them, and perhaps naively thought modern biosecurity practices would make sure the kinds of mistakes Alison describes never happened, or at least happened exceedingly infrequently. But that just seems wrong.
While we don’t discuss all of the things going wrong in the world that allow these mistakes to happen in the interview, I took quite a few discouraging lessons from the episode:
- First, I would’ve thought I could trust an institution like the CDC to be particularly cautious and thoughtful about biosecurity issues. It’s not, and it needs to be fixed.
- You can’t trust microbiology labs in general to follow biosafety level requirements — the requirements labs have to follow in order to work with certain kinds of pathogens — or to fix problems as they come up, so we should expect plenty of lab leaks. That’s true even in BSL-4 labs: labs working with the most dangerous pathogens, and with the strictest biosafety requirements.
- The problem is not that people don’t care about keeping the public safe from lab leaks: it’s that the microbiology community sees biosecurity measures as onerous and superfluous. That, plus a culture of martyrdom, means they take biosecurity measures less seriously than chemists and nuclear physicists, for example.
Perhaps we’ll dive even deeper into some of these systemic issues that make lab leaks so common in a future episode, but for now, I hope this episode makes clear to you what it made clear to me: lab accidents happen disturbingly often, and it’s not OK.
Without further ado, here’s Alison Young.
The interview begins [00:04:02]
Luisa Rodriguez: Today I’m speaking with Alison Young. Alison is an award-winning investigative journalist and a professor at the University of Missouri School of Journalism who’s been uncovering lab leaks and accidents at prestigious US labs for over 15 years. Her new book, which I found very haunting — Pandora’s Gamble: Lab Leaks, Pandemics, and a World at Risk — describes a number of extremely disturbing safety breaches at some of these prestigious labs. Thanks so much for coming on the podcast, Alison.
Alison Young: Thank you for having me.
Luisa Rodriguez: I hope to eventually talk about how common lab leaks are and why they happen. Before reading the book, I knew that lab accidents happened, but I had the naive intuition that they weren’t that common — and I guess to the extent that they did happen sometimes, that they were handled responsibly, to make sure that there weren’t more widespread consequences. Learning about some of the specifics of these accidents made it much more visceral to me how careless and irresponsible even really well-intentioned researchers and institutions can be with regards to biosafety.
Investigating leaks at the CDC [00:05:16]
Luisa Rodriguez: But first, let’s talk about the CDC, which is an institution that you’ve investigated quite a lot. A lesson I took from your book comes from some of this investigation: basically that even extremely respected institutions make huge, egregious biosafety mistakes.
We don’t have time to talk about every single accident and leak that you’ve investigated at the CDC — and honestly, that’s a testament to how many there have been — but I do want to talk about some of the examples. Before we get into those examples, can you give some background to how you even got into this work investigating these leaks at the CDC?
Alison Young: Like most people, I had never really given a whole lot of thought to the safety of biological research labs before I started covering the Centers for Disease Control and Prevention. I was, at the time, the CDC beat reporter for the Atlanta Journal-Constitution. And in the course of doing my initial reporting, I got a tip that there had been an hour-long power outage at their brand-new $214 million high-containment laboratory facility. And at the time, this was considered and being promoted as one of the most advanced of these facilities in the entire world.
It turned out there had been a lightning strike and it had knocked out the power to the building. The reason that’s important is that there are a variety of systems, in particular the airflow systems that keep negative air pressure in these buildings, that get knocked out. But more troubling than that, as I started digging in on the story, was that there were backup generators that should have come on — they should have guaranteed the continuity of power in this facility. And I ultimately obtained internal records that showed that the CDC’s own engineering officials had warned years earlier that the way they were configuring these backup generators was putting this building at risk, and yet they went ahead with the design anyway.
Luisa Rodriguez: And was this basically your first time investigating the CDC?
Alison Young: It was my first time investigating labs at the CDC. So that was my first introduction to problems with labs. And over time, I kept getting more tips from people inside the agency who were concerned about the safety practices in the facilities. And probably the most eye-opening of those early accidents that I looked into was when I got a call from someone at the agency who wanted to alert me that one of the BSL-3 labs in this same building, called Building 18, was sealed with duct tape.
Biosafety level 3 labs are supposed to have negative airflow to keep the pathogens inside. This particular lab had had a malfunction of its air handling system about a year earlier. It was working with an organism that causes a disease called Q fever. This is a bacteria that takes very few organisms to cause infection. And the lab had a situation where the airflow systems had essentially reversed and blew outward outside of this lab into what was considered to be a clean corridor — so this would be into an area where people would unwittingly be walking by in their normal street clothes, not wearing any respiratory or other protections, because they would assume that they were safe in that area. So it had blown positive. And in that incident, there were multiple people who had been potentially exposed. They all were given antibiotics, and nobody was ultimately infected.
Luisa Rodriguez: And then, if I’m understanding correctly, someone who was, I guess, whistleblowing called you to tell you that a year later that BSL-3 lab was still sealed with duct tape. Is that right?
Alison Young: Correct. It was duct tape. And it actually gets worse from there. So I start putting in inquiries to the CDC press office asking them why, in a $214 million building that is supposedly, again, one of the most high-tech ones in the world, why a BSL-3 lab is sealed with duct tape. And they ultimately took me to go see this lab. I remember walking across the CDC campus escorted by multiple officials, and we went into this high-tech Building 18, and stood out in front of this BSL-3 lab. And there was this door, and it was still sealed with duct tape.
I said to the people who were escorting me, “Why is this sealed?” And they looked at me and they said, “The air handling system has functioned properly. There have been no other problems since that incident a year earlier.” And I said, “But that doesn’t make sense. Then why is it still sealed with duct tape?” And they said, “Well, we could remove it. It’s just a precaution. It’s just an enhancement.” It just didn’t make any sense.
And as I write in the book, it was one of those moments that I have thought about a lot over the years. It goes to some of the things that are just huge disconnects between what the public assumes about these facilities and how they sometimes operate.
Luisa Rodriguez: Yeah. Where the assumption is that they’re sensible, responsible people who are honest with themselves about the risks they’re taking, and conscientious when taking risks that might jeopardise people’s health.
Alison Young: And one of the things that was said to me at the time when I was writing about this duct tape for the Atlanta newspaper is that if the regulatory side of CDC had encountered the duct tape in any other facility, it would have been written up: it would not have been allowed. And that may well be the case. But part of the problem here is that there is no transparency and no public accountability. Policymakers have no idea whether or not the agencies that are overseeing this research are applying the rules equally to all facilities — their own versus others.
One of the adages in the world of health is that “sunlight is the greatest disinfectant.” There is no sunlight when it comes to the safety of biological research. And I will point out that it was only because of the articles we did about the duct-taped door that the door was ultimately replaced with a properly sealed door.
Luisa Rodriguez: Yeah, that’s extremely unsettling. OK, so that was kind of a light bulb moment for you. And then from there, it sounds like you did a bunch more investigative work, partly fueled by the fact that people, including people in the CDC, were so unnerved by some of the things happening there that they’d call you.
Actually, I loved reading about how they got in touch with you, because it was all sorts of ways and it felt like I was honestly reading a spy novel. Do you mind talking about that?
Alison Young: Sure. It was really a fascinating time to cover that agency, because the CDC is full of people who are really true believers. They could make a whole lot more money if they went to work in industry in some way. These are people who got into public health because they want to save the world; they really want to make a difference. And there were a number of people who were concerned about a wide range of things, labs just being one of them, who really saw the work that I was doing in the news media as a way of shining light and hopefully improving the agency.
So I would get all kinds of tips in a variety of ways. Some of them came through the mail. Various people had codenames. There was one person who did dead drops under park benches. I’ve met with people in empty stadium seats. It was just kind of a wild time, because these are people who are involved in a scientific endeavour. They weren’t used to doing spy-like things. And it was pretty amazing, the creative ways they came to get information to me.
Luisa Rodriguez: I want to get to some of the things that they were telling you, but I’m curious: What exactly were the potential repercussions for them?
Alison Young: They were all concerned about the potential damage to their careers or repercussions if they were to be found talking about this publicly, especially to a journalist. One of the things that has happened over time is — and this is not about Democrats or Republicans — with each passing administration that has been in office in the United States, there has been increased control over whether workers in federal agencies are allowed to talk to the news media. And I would say that those controls are even tighter today in 2023 than they were 15 years ago, when I first started reporting on the CDC and other federal agencies. So they were very concerned about that.
And then ultimately, as a result of some of my earlier reporting at USA Today, after I left the Atlanta paper, CDC really started looking inward at its culture of safety under then CDC Director Tom Frieden. They asked a blue-ribbon panel of safety experts to come and really do a deep dive on the agency’s safety culture, and one of the things that they found is that there was, rightly, a concern about the reporting of accidents, even internally at the agency.
Luisa Rodriguez: OK, so there was a culture that made it so hard for some of these scientists to go through channels at the CDC to report these things that they ended up going to you.
High-profile CDC accidents [00:16:14]
Luisa Rodriguez: Let’s talk about a few more of the things you uncovered. So there’s the duct tape on the door. Are there other examples that you remember particularly vividly from this period of investigation?
Alison Young: Yeah, definitely. So that same building with the duct tape, Building 18, has had over the years a whole series of these airflow problems: whether it’s other power outages, which it also has had, or the airflow systems not working properly — either being neutral or, on occasion, going positive.
Maintaining that negative airflow has been a challenge. And one of the things I’ve learned over time is that it’s one of the more complex systems in these labs. Not just in Building 18: getting that airflow to work properly is something that is very difficult for these facilities. So there have been numerous incidents of that: puffs of air coming out of a lab where people are walking by and all of a sudden they feel this puff of air, enough that in one case it blew someone’s hair. Luckily, that particular lab was not hot at the time. So there have been those kinds of things.
One of the worst periods of high-profile accidents for CDC occurred in 2014. And during that period, it started with an incident where dozens of workers at the CDC were potentially exposed to live anthrax, where the CDC was killing anthrax spores and they thought they were dead, but they had used a different kind of method and it turned out they weren’t all dead. Yet they had already been transferred to other lower-level labs where people weren’t using appropriate PPE to protect themselves from those spores. So that was a big deal. That happened in 2014.
And then shortly thereafter, there was another incident that involved a mixup with a very deadly strain of avian influenza virus. In some ways, this other incident was even more serious. It involved a worker at CDC, an experienced scientist who had two different strains of flu viruses that they were working with. It’s unclear exactly what happened, but they shipped off a specimen of what was supposed to be a pretty benign strain of flu virus to another federal lab also in Georgia. When the other lab started its experiments with some poultry, they were shocked, because the poultry started dying and that should not have happened. They started looking at the virus that they had gotten from CDC, and found that it actually was not the virus they thought they were getting, and that it was this very deadly strain of avian influenza.
Part of the challenge in this particular case was figuring out what happened, and the CDC actually did some pretty good detective work in trying to work out what went wrong here. The experienced scientist would have had to have done one of two bad things in preparing these specimens: either they cross-contaminated the specimens by working with them together in a biosafety cabinet at the same time, which is something you shouldn’t do; or they didn’t disinfect the safety cabinet between strains, since you’re supposed to work with the less dangerous strain first, decontaminate, and then do the more dangerous one later.
What CDC did, that I thought was pretty fascinating, is they looked at the keycard swipes of the scientist and they calculated how many minutes it would take to do everything properly to prepare these specimens. And they basically found there was no way in the time that the scientist went in and out of the facility that they could have done everything that they were supposed to do. And ultimately, the scientist admitted to investigators that they were in a hurry trying to get to a meeting.
Luisa Rodriguez: So this is another case that’s not the kind of mistake that, when you look at it, feels like it could happen to anyone. It feels like some real human negligence and carelessness to knowingly not follow procedures because they were in a hurry to get to a meeting.
Alison Young: Yeah. And even in the end, the researcher didn’t admit that they did anything wrong, but they did admit to being in a rush.
The other thing is that this particular incident was outrageous to the CDC director at the time, Dr Tom Frieden. When I interviewed him for the book, he talked at length about how it was not only all of the mistakes that were made in the handling of it; it was also the internal delays in even reporting that this incident had happened, both within the division and all the way up the line. It took an incredibly long time for those who knew about the incident to report it.
I would also note that even though the CDC and the USDA, which was where the chickens died, are the two agencies that run the Federal Select Agent Program, neither the lab that received the cross-contaminated sample nor the CDC which sent it reported it to the arms of their agencies that regulate this particular virus under the Federal Select Agent Program. So once again, it raises the questions about the oversight mechanisms.
Luckily, I will say, under Dr Frieden’s leadership this particular incident was disclosed to the public: he did insist that reports get released about it. And it’s a rare moment in time that, with these 2014 incidents, the agency did put out reports for the public about them.
Luisa Rodriguez: Yeah, good for him. It sounds like he did care a lot about these bad safety practices getting better under his watch.
Let’s talk about a few of the other cases you investigated. One that really shocked me was the case of the forgotten vials of variola, or smallpox. Can you explain what happened there?
Alison Young: Sure. Around the same time the CDC was having all kinds of incidents in 2014, in the middle of all of that, there was a cold storage room on the campus of the National Institutes of Health, just north of Washington DC, where they were moving around some old cardboard boxes. And they look inside and they see all of these little, tiny, very fragile vials from decades ago that are labelled in typewriter print with various pathogens’ names on them. And it’s powdered material. And as they’re going through these glass vials, they see some that are labelled as variola.
Luisa Rodriguez: Which, just to be totally clear, variola is the pathogen that causes smallpox. So go on: they found vials of smallpox in a box in a storage room?
Alison Young: Exactly. In an unlocked storage room. So this should have been incredibly concerning, because smallpox is incredibly deadly. It has been eradicated from the planet and smallpox virus is only supposed to be found under treaties in two labs in the world: one is in Russia, and the other is a specific lab on the campus at the Centers for Disease Control and Prevention in Atlanta. So these vials shouldn’t have been in this cold storage room at NIH.
What was also concerning was how they responded to it when they found these vials. Ultimately, it was one scientist, by themselves, who basically picked up the cardboard box and walked it down the corridors of this building at the NIH and across the street and into another building. All the while, they’re hearing this clink, clink of these fragile old vials hitting each other as they’re walking along.
The FBI report that I read of the incident criticised the scientist and just the whole handling of this box, because when it was properly catalogued, in the end, there was a vial that had broken inside this box — and once again, the world got lucky, and it was not smallpox virus, it was some sort of a tissue sample. But as the FBI report noted, had that been the freeze-dried smallpox specimen, there was nothing really protecting the person who was carrying it.
Luisa Rodriguez: It’s so disturbing. And so many things went wrong there. There was the fact that the vials were left there in the first place. There was the fact that the people transporting them didn’t use the right PPE, that they didn’t treat them with caution.
Alison Young: And there are also the biosecurity aspects of this. You would hope that everyone working around really very dangerous pathogens like smallpox would treat them as something that should not get into other people’s hands. Part of the concern that was raised is that you shouldn’t necessarily have one single person, by themselves, carrying a box that contains smallpox virus.
Luisa Rodriguez: And that’s because it’s one of the few pathogens that a single person could use as a bioweapon, basically?
Alison Young: Correct.
Luisa Rodriguez: Yeah. Another one that felt pretty egregious to me was the colour-coding mixup with the Ebola virus. Do you remember that case?
Alison Young: I do remember that case, because I was busy cooking dinner on Christmas Eve for my family when I was alerted that the CDC had had yet another incident. This was also in 2014. There was a researcher at CDC who was basically being put on a 21-day fever watch, because they had been preparing specimens of Ebola virus, and they had used a colour-coding system for the vials — some of which were being left live and some of which were being killed. Again, it’s this inactivation of these specimens that is a risk factor if it’s not done right. They had prepared these specimens and shipped some of them, labelled as killed, to another lab, where a worker wasn’t using PPE that was appropriate for live Ebola virus. And it turned out that they had mixed up the colour coding on them.
In the end, they got lucky yet again. It turned out, after much more investigation, there wasn’t live virus even in the ones they thought were live virus. But again, it’s an example: it’s not because they were good, it’s because they were lucky.
Luisa Rodriguez: I mean, is it unsettling that they thought they were shipping some live and some inactivated samples of Ebola and they didn’t even get that right? That all of them ended up being inactivated? It just seems like even though that ended up being for the best, it doesn’t seem great that they weren’t able to nail that goal.
Alison Young: It goes to the whole issue of human error. There is the potential for human error whenever human beings are involved, which is why it matters to look at processes and how you create foolproof processes. It’s part of the reason why, in other safety areas, having good data on where near-misses occur, how often, and under what circumstances allows safety experts to study them, disseminate lessons learned, and help create better universal protocols to reduce the chances of these kinds of issues. And one of the things safety experts have said over and over as I’ve interviewed them over the years is that the lack of reporting has created a lack of good data, and a lack of uniformity in what kinds of safety processes are validated and known to actually prevent accidents.
Luisa Rodriguez: Yeah, right. Part of me is somewhat reassured by my impression that many or maybe even most of these accidents didn’t end up with people becoming infected. To what extent should we consider the fact that these didn’t cause outbreaks as a sign that the overall system works?
Alison Young: The good news is that most of these accidents don’t seem to cause infections. So that’s great, but it’s also a matter of luck in many cases. So sometimes it’s that the systems are working. Sometimes, clearly, these accidents show the systems have not worked, and that precautions haven’t been taken, and mechanical systems have failed.
The concern is that rare events can still cause catastrophic accidents. That is the concern about these kinds of labs and the lack of lab safety oversight. The US Government Accountability Office, which is the nonpartisan research arm of Congress, has been warning for more than a decade that a rare accident can cause a potential pandemic if it’s the right kind of situation with the right kind of virus. That is something that all of us should be concerned about.
Luisa Rodriguez: Yeah. It sounds like your take is that primarily we are getting lucky. That there are potentially dangerous accidents happening, and it might only be a matter of time before the right pathogen at the right time with the right kind of system breakdown does have more catastrophic consequences, including at somewhere as prestigious and well respected as the CDC.
Alison Young: That is the concern that has been raised by experts going back over time: the more of these experiments and the more labs there are, the more opportunities there are for a rare event to occur — for the right pathogen to be involved and infect somebody in one of these labs, or be released in some way from these labs. And what I chronicle in Pandora’s Gamble is that there have been these previous outbreaks that have been associated with various kinds of lab accidents. So this is not a theoretical thing that can happen: it has happened in the past.
Dugway live anthrax accidents [00:32:08]
Luisa Rodriguez: So those are some examples of lab accidents that happened while you were investigating the CDC. I want to dive into the details of a few of the lab leaks and accidents that I found particularly disturbing while reading your book. To start with one of these examples: the Dugway Life Sciences Test Facility case. Can you talk through what happened there?
Alison Young: Sure. So there is an army lab in Utah, near Salt Lake City, that has spent a very long time producing large quantities of anthrax specimens for use by other researchers. These specimens are essentially killed or inactivated — the reason being that if you’re creating a test for anthrax, or if you want to create protective equipment, you don’t need the live anthrax to work with; you just need something that has some of the characteristics, and it’s much safer to use that way.
But one of the things about working with these killed specimens is that the scientists who receive them don’t wear protective equipment in the same way they would if they were working with live anthrax. So this lab at Dugway, it was discovered that for essentially 10 years, they had been shipping anthrax labelled as killed, when it was in fact alive and capable of causing infections.
Luisa Rodriguez: So that’s insane. And it’s insane on the face of it, but it doesn’t get any better when you learn about the details. Do you want to explain how it was possible for live anthrax to get sent all over the country? And actually, I think to several other countries.
Alison Young: Yeah, it was sent all over the world; it was something like 200 labs. And there were so many levels at which there were failures.
First of all, it came out as I was reporting on this that regulators eight years earlier, before all of this was discovered, knew that Dugway had a problem with inactivating its anthrax. There was an incident with one specimen that was sent to another government lab and there was a live anthrax spore in it. When the CDC came to Dugway to investigate, Dugway initially pointed fingers and said, “It wasn’t our spore; it was their spore. Somehow this other lab must have done something wrong.”
But then it further came out that with the process Dugway was using to kill the anthrax, which was a chemical method, they did this very critical sterility test just to make sure that everything was dead. So after they supposedly killed the anthrax, they put some of it in vials and tried to see if it would grow. Nothing is supposed to grow in these various test tubes, but one of them actually started growing. That should have been a sign that the whole batch was suspect, but instead what the scientists did is they just threw out the one test tube that grew, stamped the rest of the batch as dead, and went on with shipping it.
And that’s bad enough, but years later, when it all came to light that their other method of killing anthrax spores (which was radiation) also had similar problems and similar failures in the quality control meant to make sure it was truly dead, the CDC, which had known eight years earlier there was a problem, said that’s not the same thing, and they started splitting hairs. But it was an opportunity for the agency to dig deeper and see that there were larger systemic issues. This could have been stopped very early in the process, but the regulatory agency just didn’t go deep enough.
Luisa Rodriguez: With this thing with the vial having things grow in it just being thrown out, and then the rest of the batch being stamped as fine, I remember thinking, Is that common practice? Or is that the scientists being very low integrity? Or is that just like a policy that’s OK and allowed?
Alison Young: Everything in my research indicated this was not at all how it was supposed to play out. But one of the problems with the oversight and lack of transparency of this research is that there’s no way of knowing how often things like this happen, because there is just so little transparency when accidents occur.
Luisa Rodriguez: Yeah, right. Another thing that was really horrifying to me about this was that it wasn’t a regulator or Dugway that realised that they were sending around live anthrax samples. Can you explain how it was discovered?
Alison Young: The way that this was discovered is that one of the labs that had purchased some of these supposedly killed anthrax spores actually did what apparently hundreds of other labs didn’t do: they ran their own tests. They didn’t just blindly trust the death certificate that came with the anthrax spores. They ran their own tests, and sure enough, the anthrax grew. And because they ran that test, that ultimately led to the scrutiny that exposed this huge international incident of a US biodefence lab shipping live anthrax all over the world.
Luisa Rodriguez: Yeah. It just blew my mind that it was just kind of luck that that biotech company decided to test whether the anthrax was live or not. They didn’t have to. Clearly no one else did. It wasn’t standard practice. And so this could have gone on much longer.
The fact that no one seems to have been hurt or killed because of the live anthrax going around for a decade, is that comforting? This was hundreds of samples, and so maybe we should take comfort in knowing that there weren’t terrible consequences?
Alison Young: I would say we got lucky. That’s one of the things that also is a theme in the book: Over and over, labs are lucky. It’s not because they’re practising good biosafety. It’s because they’ve gotten lucky.
I think one of the other things that is important to note about the Dugway incident is that it wasn’t just that they had these unsafe practices in killing specimens. When this incident finally came out, there was intense scrutiny of the lab and its wider safety practices, and the kinds of things that the Army wrote up in its final report were about floors being contaminated with anthrax and just generalised unsafe practices within the labs. And those had never been detected — again, not by the institution and not by the regulators. And there also were other concerns about the lack of training of the person who was responsible for biosafety at the facility.
Luisa Rodriguez: How do you make sense of that? Is that just negligence? In some sense, this scientist, or maybe a couple of scientists or employees, are responsible. But in some sense, someone neglected to do the training. Someone neglected to follow up on these concerns. I think there were cases where senior scientists at Dugway, working in BSL-3 labs there, raised concerns to senior leadership, and they were ignored. How does that happen?
Alison Young: One of the challenges is the idea of establishing safety culture within organisations. Part of my book goes way back into the history of biological safety, and I spent a lot of time reading the papers of a man by the name of Dr Arnold Wedum, who is considered the father of modern biosafety. And part of the reason the book goes into depth about Arnold Wedum’s findings is that I think many of his concerns about the lack of safety culture in microbiology, and the difficulty in getting certain scientists to accept the importance of following safety protocols (the resistance to safety culture he saw way back in the 1950s), are some of the same kinds of things that play out today in these incidents.
Arnold Wedum talked quite a lot about this idea of being a martyr to science. Obviously, the people who went into microbiology over time are people who are very dedicated to the study of science, to trying to improve the lives of people around the planet.
One of the things that’s important to remember is that microbiology is a relatively new science. It’s a young science compared to chemistry and radiological sciences, and Dr Wedum said that those scientists seem to be much more open to the scrutiny of their practices than those working in microbiology labs — who, for much of the history of microbiology, because there were not ways to keep them safe, were often catching their experiments. Dr Wedum also talked about how — again, this is back many years ago — some of these scientists took great pride in how many times they had become infected, because they were doing this for the greater good.
Luisa Rodriguez: I remember finding it striking in the book, reading about these cases where scientists, working before a bunch of better safety practices, would basically brag, as you said — like, “I’ve gotten TB four times already” — and it was almost a battle scar that they wore with pride.
Maybe the takeaway there is that this field comes from an initial foundation where getting these diseases was the norm and even kind of a good thing. It’s like a badge of honour. So when you try to layer all these safety practices on top, they’re resistant, because they’re used to this; they don’t regard it as a terrible thing. And that’s part of what’s made making safety a norm a much harder problem. Does that sound right?
Alison Young: Some of that, I think, is very much the case. Also, there’s just not a culture of tracking these kinds of infections. There never has been a culture of that. To this day, there are no universal tracking systems for these kinds of illnesses or accidents in labs.
I think part of the challenge as well is that nobody likes having to do things that make it harder to do your job. And one of the realities of the kinds of safety procedures and equipment that are required, depending on the pathogen, is that they can make doing your work slower and more cumbersome. It can be more expensive. There may be limited access to certain kinds of equipment. All of those kinds of things — at least over time, in what Dr Wedum wrote about — created a culture where there were questions about whether any of it was necessary.
And that’s where that idea of the “martyr to science” culture comes from. So that was back in the ’50s, ’60s, and ’70s, when Dr Wedum was really writing about those kinds of things. Here we are in 2023: What is the culture inside individual labs? It’s hard to say, but you can see in incident after incident that there are individuals and institutions that are not paying the attention to safety that they should be.
Soviet anthrax leak [00:44:42]
Luisa Rodriguez: Moving on, another thing I didn’t fully understand before reading your book was that lab leaks can have huge impacts not just through infecting a couple of people in a lab who then spread it; they can also affect many people at once if there’s an accident that causes environmental contamination. It somehow just hadn’t occurred to me that this was possible. But it is possible, and it’s happened many times, disturbingly.
There are several examples of this that you cover in the book. So Tulane apparently spread a bunch of sludge that might have contained anthrax over a compost field on its property. Over a two-year period, the USDA National Animal Disease Center had three releases of wastewater that was potentially contaminated with select agent pathogens. But the example that most disturbed me happened in the USSR in 1979. Can you talk us through what happened?
Alison Young: In this particular incident, there was an accident at a lab that was believed to actually be a bioweapons facility by US intelligence, but a lab nonetheless, that was working with large quantities of anthrax. It appears that it spewed a giant plume of anthrax spores over a town. And people downwind were sickened, animals were killed, about 60 people in that case died. Initially, the authorities sought to claim that there was no airborne anthrax — that this was ultimately a result of anthrax food poisoning, possibly from black market meat or some sort of contaminated cattle feed or agricultural feed. And that was sort of where it was.
Over time, because it was such a huge and deadly outbreak, there was intense scientific community interest. And eventually, there was a group of scientists who invited officials from these Soviet communities to come to the United States and give a presentation at the US National Academy of Sciences. There, they produced all kinds of slides and charts and told compelling stories of racing up into the mountains and how they were there to help save these people, and they showed all kinds of information that really was making the case that this was a foodborne anthrax outbreak. Coming out of that meeting, there are news clippings in The Washington Post and The New York Times and elsewhere where prominent US scientists say they’ve been incredibly transparent and they’ve made quite the case — it looks like this really was gastrointestinal anthrax, and not some sort of an airborne release.
Then it took many more years, until 1992, when then Russian President Boris Yeltsin came out and made this very surprising statement in a Russian newspaper that in fact that outbreak was the result of a military lab accident.
Luisa Rodriguez: So this case absolutely shocks me. One, it’s just horrific: 60 people died. Two, there was this extremely successful coverup by the Soviets, in particular because they were violating the Biological Weapons Convention and wanted to hide that. And then three, just bizarrely, Boris Yeltsin later admitted, unprompted, that this was caused by military bioweapons research.
But I wanted to talk about what happened after all of that, which was this joint effort by American and Russian scientists to find out exactly what happened. I just found this extremely moving. Can you explain what they did?
Alison Young: Yeah, it’s fascinating. Here were these Russian scientists who, at the time all of this occurred, were incredibly brave and basically hid away evidence to keep the KGB from taking it away. So they hid away their notes. They had samples from the people who died, and they kept them in jars — but they put them out in the open, almost hiding them in plain sight, so that they wouldn’t be confiscated. They had kept these for all of these years, and so, as the political situation changed in Russia, it became possible for them to actually disclose that they had this information.
And they did some remarkable investigations, where they even went and looked at other records that weren’t destroyed, such as who got compensated. They went to graveyards and looked at the death records. And ultimately, even some of the main US scientists who were the biggest proponents of the idea that this was not some sort of a bioweapons lab accident, and who absolutely believed the initial cover story that this was a meat problem, came around — some of them assisting with the Russian scientists’ research showing that this was a huge anthrax plume, and there was plenty of documentation for it.
And I think the thing that is so instructive is it took 15 years to get to that point from when the accident happened, and all of the years of coverup, and all of the years of many international scientists believing the cover story, to ultimately getting to the truth.
Luisa Rodriguez: Yeah. A point you made in your book was that we might end up seeing something similar with COVID-19. It’s very much still up for debate whether COVID-19 was the result of a lab leak or had some more natural origin. I’ve followed it a bit, and have felt that it seems totally possible to me it was a lab leak, but also that there’s just not good enough evidence right now to be confident either way. And you point out that in a very different but kind of similar case, it took 15 years, but we did eventually get a conclusive story of what happened. And maybe we’ll get that with COVID. Maybe not, but maybe.
A couple of other things that these scientists did that I remember from your book was they did autopsies on pets. So there were pets that were in the fallout zone that died and they were able to basically dig up those pets, do autopsies, and find that the kind of anthrax that killed those animals wasn’t the kind that you would have expected had it been the story that the Soviets were painting.
They did kind of a geographic analysis of where these cases were, and it didn’t really make any sense that it was a foodborne illness, because the people who got sick and died basically fell within this oval — the pattern you’d expect to see if there was an airborne plume that settled on a certain area, and not the kind of thing you would see if there were random households across a city buying contaminated food.
I basically found that really fascinating, and again, just pretty heartwarming, especially this case of these Russian pathologists who literally saved patients’ preserved organs in case it was ever going to be possible for them to share what they thought really happened.
Alison Young: One other theme that’s worth noting in this case is that there also was this rallying of US and other international scientists to support the story of the Soviet scientists. And I think that’s worth noting, in the sense that it has been the subject of some level of criticism and debate in the current search for the origin of COVID-19: this idea that there have been a number of prominent scientists who have been very vocal that their colleagues in China say this was of natural origin, even though those colleagues have not cooperated with international investigations, and that because they’re saying it, it should be taken as truth.
Luisa Rodriguez: Yeah, that makes sense.
The 1977 influenza pandemic [00:53:43]
Luisa Rodriguez: Let’s talk about another example. Can you describe what happened during the 1977 influenza pandemic?
Alison Young: So in 1977, during the flu season, there were people who started falling ill with flu with an unusual pattern. Most of the time when we have the flu season, some of the people who are at greatest risk of influenza are the elderly. But this particular strain of flu seemed to be mostly sickening younger people — people who were 30 and younger and children — and that was pretty unusual. Then when the scientists took a much closer look at the genetics of this particular virus, they found that it looked like it had been frozen in time from a strain that circulated in 1950.
So even in the year that followed, there were scientific papers flagging the concern that somehow this thing really looked like it had been preserved for more than two decades. And they weren’t calling it a lab accident initially, but then over time, there were additional papers and more prominent scientists who were writing that it appeared it had come from a lab accident.
Ultimately, it’s unclear exactly what occurred, but there is some indication that it may have been part of a vaccine trial that went wrong with a weakened strain of an influenza virus. So while there’s no definitive proof, in the world of lab accidents causing outbreaks, there actually is pretty good consensus that this one was associated with some form of a research- or vaccine-related incident.
Luisa Rodriguez: Is there just lots of variation in the DNA of influenza viruses year on year? I didn’t realise that you could pinpoint exactly whether a strain was from this year or 20 years ago.
Alison Young: There is a certain amount of expectation that, over time, there will be changes. When they looked at this one, it did not have the changes that would have been expected.
Luisa Rodriguez: Cool. So how long did it take until consensus was reached that this was not a naturally arising influenza strain?
Alison Young: From almost the very beginning, there were credible scientists expressing those concerns. But over time, it has very much been the sense that it was either a lab accident or a vaccine research trial that went wrong. And there is debate out there. There are those — especially now in the more political climate over the questions of the lab leak hypothesis and whether research in this realm should have more regulation — who have tried to slice and dice whether or not, if it was a vaccine trial, that counts in some way as a lab accident.
But the reality is that pathogens really don’t care what human beings label things as. And if research is involved, it really shows the continuum of actions that are occurring, from when vaccines are prepared to the kinds of strains that are used, and whether they are weakened or not properly weakened before use: when those go wrong, they are all research-related accidents. So this case is an important cautionary tale for how a research-related accident can potentially sweep the globe.
Luisa Rodriguez: Right. What exactly do we think went wrong?
Alison Young: Nobody knows exactly how this particular strain came to be circulating again back in 1977, but there was some correspondence with a very prominent influenza researcher in China who indicated that it was their feeling that this was due to a vaccine trial that had just simply gone wrong. But exactly what aspect of the vaccine trial it was that went wrong, that information has never come out.
Luisa Rodriguez: OK, so there was this trial, it was meant to be kept contained somehow, and it wasn’t. And it sounds like some people quibble and say this wasn’t a lab leak or a lab accident. But to your point, with all of this research into dangerous pathogens, we should be asking questions about when it’s appropriate, when it’s worth it, and what the risks are, and making sure to mitigate those risks — and this just feels like a case where that was probably not done sufficiently.
Alison Young: Yeah. Again, the idea of research and this biosafety being a continuum of activities — everything from going out and collecting specimens in the wild; to how you house those specimens once you get them; to how you potentially inactivate those specimens; to whether you’re creating a vaccine, and the biosafety used when you’re working with viruses for that vaccine; to then if you’re launching a vaccine and you’re going to challenge it in some way with some sort of a pathogen, how you handle that and what are the protocols that are involved — all of that is a huge continuum of research, all of which has important biosafety measures that need to be taken.
Luisa Rodriguez: Yeah, right.
The last death from smallpox [00:59:27]
Luisa Rodriguez: OK, so another example I wanted to talk through: the last death from smallpox, which I found to be just a really heartbreaking story. For background, smallpox is one of the deadliest pathogens humans have encountered: it has plagued humanity for over 3,000 years, and killed something like 300 to 500 million people in the 20th century alone. And in 1977, after years of advocacy and medical intervention, it was eradicated, which was a huge achievement. I think it’s the only human disease that’s ever been eradicated. But a year after it was eradicated, that monumental achievement was jeopardised. Can you explain what happened?
Alison Young: Yeah, this was a really awful case. There was a medical photographer at Birmingham University Medical School in England, a woman named Janet Parker, who became ill with pretty nonspecific symptoms: she had headaches and muscle pains, and eventually, over time, she developed a rash. It wasn’t obvious at first that she had smallpox, and this wasn’t a disease that doctors were on the lookout for, because it had been eradicated. And not only that, Janet Parker had been vaccinated against smallpox several years earlier. So it took about two weeks before she was eventually hospitalised and doctors and health authorities realised that she had smallpox.
And it was a pretty shocking thing. It’s concerning to note that she was, for this extended period of time, out there, sick among her family, her friends, medical workers, and others who she was coming into contact with. She ended up dying from the smallpox infection. Her mother contracted smallpox but ultimately recovered. Her dad ended up dying, although they said it was from a heart attack. And more than 300 people ended up being quarantined, though nobody else was infected. But this case had numerous warning signs before it happened.
Luisa Rodriguez: Yeah, I found reading about those warning signs particularly horrifying, because they very much should have raised alarms and caused people to change their behaviour, but they didn’t. Can you talk about what some of those warning signs were?
Alison Young: In the months before Janet Parker became infected, the World Health Organization had sent a team to this lab, and they had identified some concerning safety problems. But the scientist who ran the lab really was quite dismissive of what had occurred. He essentially took the tack that his lab had been operating in this fashion for a number of years and there had been no problems in the past, and so clearly any sort of issues were pretty minimal. But the really tragic irony in all of this is that the very time they were debating back and forth in correspondence over whether this lab had serious safety issues is the time period when Janet Parker became infected with smallpox.
Luisa Rodriguez: Yeah, it’s just awful. I mean, to imagine having someone tell you that your lab is unsafe while you’re working with smallpox, and then to push back and say, “No, it’s actually fine.” And while you’re doing that pushback and resisting some kind of reform, to have someone infected with smallpox because of you, and die — it’s just horrific, and heartbreaking. And I guess it did have a huge impact on this scientist Bedson. Do you want to say what happened to him?
Alison Young: He talked about the misplaced trust, the trust that was misplaced in him. I think one of the themes in this smallpox case is the tremendous trust that institutions are placing in individual scientists and their expertise. The reality is that these scientists are human beings: they make mistakes, they can be careless, they can make assumptions, and they can also sometimes be a bit arrogant about their safety practices. And in this particular case, after Janet Parker became infected, this particular scientist took his own life.
Luisa Rodriguez: Yeah. Which is a horrible fact, and also highlights how much he deeply cared. I mean, he devoted many years to advocating for the eradication of smallpox, so clearly it was one of his most fundamental goals to protect people from this disease, and yet he was primarily responsible for the final death caused by smallpox.
Going back to some of the warning signs and how egregious they were, some things that stuck out to me were: there was air that was meant to be run through a HEPA filter that wasn’t; there were PhD students who were meant to be trained on working with smallpox who reported not receiving any training; and there was no disinfection of containers of smallpox that were stored with other organisms in the lab’s freezer. I think those are the big ones. Am I missing any?
Alison Young: Yeah, those very much are. The airflow issue is a problem that comes up over time. One of the things I think that’s important for listeners to understand is that one of the main ways that labs try to keep these pathogens from infecting people outside the lab is to control the airflow in the labs: they operate under what’s supposed to be negative air pressure. And this lab had problems with airflow, allowing particulates to float out into areas where people would not have been thinking they were at risk of becoming infected.
Luisa Rodriguez: Yeah. What is the best guess for exactly how Janet Parker was infected?
Alison Young: It is unclear, and still a mystery and a matter of dispute. Over time, the various investigations have speculated that perhaps it went up through some of the air ducts, or that maybe she visited the lab. There are a variety of areas of speculation, but it ultimately was never solved.
Luisa Rodriguez: Right. And maybe one of the most disturbing things about this case is that we don’t know, because there are several possibilities, because there were several egregious instances of key policies being ignored.
Alison Young: Yes. And the other thing that I think is disturbing about this case — and unfortunately, it is a theme that plays throughout the cases in my book — is that you had these poor safety practices, bad supervision, and issues that were especially concerning with this kind of risky research, and all of the institutions that we in the public are trusting to catch these things missed them. The university missed them. The various oversight bodies missed them. The people working in the labs didn’t call it to the attention of others. So that is one of the themes that’s very concerning in this area.
Luisa Rodriguez: As you’ve already said, one lesson from this case is the fact that these labs, and the institutions that govern those labs, are made up of humans. And humans, even biologists and virologists who care a lot about preventing these diseases, make mistakes. I think I had it in my head that biologists working with smallpox would somehow be immune to those mistakes, because they’d understand the stakes. And they’re just not, because they are humans.
Are there any other lessons that stand out to you from this case?
Alison Young: There are a couple of other lessons for me. One of the things that I think it illustrates is that when people become infected from a lab accident, most of the time there is no recognised accident. So that’s important to process, because if you know an accident has occurred, you can go and quarantine yourself, you can take precautions, you can isolate yourself. But if you are somebody in a lab building and you encounter an invisible pathogen and you don’t know it — you don’t know that a sloppy practice has led to the spread of something that you can pick up — you now have the potential, if it’s the right kind of pathogen, to start an outbreak. So I think that’s one of the really important lessons of this.
Luisa Rodriguez: Yeah, that’s a great point. I think I would have naively imagined that lab accidents happen when someone has a test tube with a virus in it and it smashes on the table and spills, and maybe a person touches it, and then you know that we should quarantine this person. But a theme over and over again in the book is that these accidents are happening totally invisibly, and so we actually have no idea when to quarantine people if we’re not doing things like regular testing for the different diseases they’re working with, temperature checks, and other measures. And that was terrifying to me, because some of these diseases are super infectious — and to the extent that a disease is caught invisibly, without anyone’s knowledge, it’s much easier for me to imagine how that person then goes on to infect loads of people.
How common are lab leaks? [01:09:05]
Luisa Rodriguez: So those are a few of the examples that really stood out to me from your book. I wanted to zoom out next, and talk about what we know about why these lab accidents happen, how common they are, and what we should be learning from them. To start, what do we know about how common these lab leaks are?
Alison Young: I wish I had a really good answer to that question. The problem is that there is no good data on this. There is no universal collection of data on how often accidents happen. There’s no universal mandatory reporting on accidents, and there also is no universal mandatory reporting on laboratory-associated infections. One of the most basic things that is needed in this area is data. You can’t change what you can’t count. So we don’t have a good answer there.
I can give you some general information about what is known. For example, in the United States, there is one area of mandatory regulation of labs that work with certain kinds of pathogens called select agents. These are essentially particularly dangerous pathogens like Ebola, anthrax, and certain agricultural pathogens. There are about 70 to 100 incidents reported a year to the Federal Select Agent Program, among about 200 labs that are working with these pathogens, and more than 800 lab workers required medical assessment or treatment in those federal select agent labs between 2015 and 2021.
So that gives you an idea. It’s also been documented in some of the studies over time that lab accidents do tend to be underreported. But those are the ones that are at least reported to the Federal Select Agent Program.
Luisa Rodriguez: So already we’ve got hundreds. And that’s an undercount; that’s with the culture of underreporting that exists.
Alison Young: And with only a subset of pathogens. So then the question becomes: How often do people get infected? Infections are believed to be relatively rare, but again, we don’t know. And one of the things that’s important to remember is that not all infections of lab workers are created equal. Obviously nobody wants a lab worker to get sickened, but a salmonella infection is less concerning than, say, an engineered avian influenza infection. So everything’s different. Also, not all infectious agents are capable of human-to-human spread. So that’s the other thing.
But here’s what we know in that arena: I interviewed a woman named Karen Byers. She is a biosafety expert in Boston, and she has spent much of her career scouring the scientific literature looking for publicly reported laboratory-associated infections. What her data shows is that between 1979 and 2015, there were about 2,230 lab-associated infections and 41 deaths. But again, those are only the ones that have been publicly reported in that time period.
Luisa Rodriguez: Even ignoring the fact that that’s only what’s been reported, that’s shocking to me. I wish I’d taken a guess before reading your book. I just would never have guessed that 41 people have died because of lab accidents and lab leaks. It feels like surely if that were the case, there would be enormous incentive and pressure to reform these institutions, such that we wouldn’t see this nearly as often. And as far as I know, there have been some cases where there have been pushes to improve things, but overall, this field is still just unreasonably prone to leaks and accidents. How do you explain that? Who’s dropping the ball here?
Alison Young: There has been tremendous resistance by the biological research community to the collection and public reporting of data on laboratory incidents, accidents, and infections. There is currently legislation being considered by the US Congress that would create a reporting system for laboratory accidents. But as written, the bill would not only make this reporting voluntary, it would also exempt the information collected about those incidents from the federal Freedom of Information Act. So essentially, the public would not have a right to find out about them.
Luisa Rodriguez: What’s the rationale there?
Alison Young: Well, I can tell you what those who are proponents of the secrecy of these kinds of incidents say.
Sometimes they’ll say it’s a national security issue: that if we release information about incidents and accidents and the kinds of pathogens that are being studied at individual labs, terrorists or those with nefarious intent could somehow do harm to the lab.
Now, it’s worth noting that these same labs that take that position also routinely post on their own websites promotional materials about the experiments they’re conducting and the kinds of pathogens they’re working with. They publish their research detailing all of that in scientific journals that are available on the internet for anyone to see. So when it’s positive information about what’s going on in these labs, they have no problem with it being public. But if it’s an incident or an accident, then it somehow becomes dangerous to their operations.
The other thing they say about why the reporting of incidents should be secret is that it will make it less likely that scientists and labs will actually report incidents when they occur. And there is some basis for that: fear of negative or punitive consequences — within organisations, or to your personal reputation — can make people reluctant to report.
But it’s notable that there are other high-risk scientific fields where incident and accident reporting is public. A notable one is for facilities that are working with radiological and nuclear materials. So if you go to the Nuclear Regulatory Commission website in the United States, you can look up right now and find university labs that have had mishaps with radiological materials. You find the names, you find the dates, and it’s kept very current: it’s usually within days or weeks of the incident occurring. But there appear to be special considerations given to the biological research community; the case is made that it’s somehow more dangerous for this information to be public.
Luisa Rodriguez: Yeah. I love this Nuclear Regulatory Commission example, because some part of me can imagine being convinced that it would be too hard to create a culture where people are willing to admit very publicly to these kinds of accidents and mistakes. But clearly it’s been done and it’s possible. What do you think is the difference between these fields? Why haven’t we been able to achieve that in the biological sciences?
Alison Young: It’s a really good question. I have asked that of a number of places: What is it that makes biological research somehow more sensitive to these kinds of issues than maybe some other kind of science or field? And I haven’t gotten good answers.
The closest thing I’ve gotten is that the regulations put in place for the nuclear industry, and for research involving those kinds of materials, were put in place at a different time: it was done earlier, and there was more public attention on it. And also, the kinds of research being done have changed, from what biological research may have been doing decades ago to now — where you have a small but growing number of labs that are manipulating these pathogens in ways that have the potential to make them more dangerous than what is found already in nature.
Luisa Rodriguez: Yeah, that’s a great point. Let’s dive into that.
Improving the regulation of dangerous biological research [01:18:36]
Luisa Rodriguez: So the thing that you’re getting at here is what’s often called gain-of-function research. Do you want to say what gain-of-function research is real quick?
Alison Young: Well, gain-of-function research is one of these things we talked about earlier, where everyone slices and dices words and labels and names. So gain-of-function research can mean a lot of things depending on who you are talking to. I think that for the vast majority of reasonable people, what the term is getting at is when a lab takes an organism and makes it somehow more dangerous than what is found in nature: making it perhaps more transmissible, more deadly, more capable of causing disease, perhaps able to move from infecting one species to another.
Luisa Rodriguez: So the idea here is that we’re now doing that kind of work, when we weren’t decades ago, and the institutions and regulations aren’t exactly keeping up with the increasing danger of working with these pathogens. When I think about changing culture in large fields like this, it sounds really hard to me. Does it feel tractable to you?
Alison Young: I think many people are just now, because of COVID-19, becoming interested in questions about how safe research labs around the world really are.
This is a topic I’ve now been covering for 15 years, and it’s important to know that going back at least 10 years, the US Government Accountability Office started issuing reports raising concern that as more of these kinds of biological research facilities are built, doing more experiments with riskier pathogens, there is an increase in the aggregate risk of a catastrophic accident. So I’ve been covering hearings in Congress going back over time, and back then, it wasn’t just one political party or another that was interested in this: it was a bipartisan concern.
And as I wrote Pandora’s Gamble, it was a huge reminder to go back and read through some of the transcripts of hearings that I’d sat in on as a reporter, and to see both Democrats and Republicans asking really important questions about the policy issues of how we deal with the safety of these labs. There was a recognition of the importance of conducting biological research. I mean, we all need this — I don’t want it to get lost in any of this that the world has benefited greatly from the COVID-19 vaccines and from all kinds of work that these labs do. But we also need that work to be done safely. And how many labs do we actually need?
And Congress was holding hearings and looking at this stuff closely. There were pushes in the 2014–2015 timeframe — when I was writing about a bunch of accidents at the Centers for Disease Control and Prevention and at Dugway, as we’ve discussed — and there were even more hearings raising questions about whether there needed to be a single federal entity overseeing lab safety. And then it went nowhere. And that has played out over and over, over the years.
Part of it is that none of the organisations that operate labs want more regulation: nobody wants more scrutiny, nobody wants more red tape. And the federal agencies that Congress and the public rely on to advise on what we need to do in these arenas all have potential conflicts of interest. Take the National Institutes of Health: it’s one of the largest funders of biomedical research in the world. It conducts its own research, and it is often funding the research at the very labs that are having the accidents that are of concern. You have the Centers for Disease Control and Prevention: it is one of the two primary regulators of the limited subset of these labs that are actually subject to any regulation on safety. And the CDC’s own labs have had their own series of safety problems.
So it is something that every few years, at least in my coverage of it, you see interest in Congress and then it dies back down again. And now with COVID-19, obviously this is back in Congress and being discussed again, but the whole political climate in Washington has become so toxic that that is now adding a new layer to the whole debate.
Luisa Rodriguez: Right. I guess a lot of the examples we’ve talked about today happened decades ago. Do you have a take on how much better things have gotten?
Alison Young: It’s difficult to know in any sort of analytical sense, in part because there’s so little data on it. Obviously, technologies have improved over the decades, and various countermeasures have improved. So scientists working in labs are now often vaccinated against the organisms they’re working with, where possible. There are antibiotics and various treatments that can be taken after a known incident occurs, and that also will prevent people from becoming infected.
But there is the challenge that there are far more of these labs than there ever were before. So even when accidents are rare events, the more of these labs you have — and the more experiments being done with potential pandemic pathogens, which are the ones of greatest concern — the more that nonzero aggregate risk grows over time.
Luisa Rodriguez: Right, OK. So if you have even very small chances of these things happening, just the sheer number of labs that are working on these things going up means you’ll probably have more accidents?
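To make the arithmetic behind this exchange concrete — a minimal illustrative sketch with made-up numbers, not figures from Alison or the book — suppose each lab independently has some small probability p of a serious accident in a given year. Then the chance of at least one accident somewhere is 1 − (1 − p)^n, which climbs quickly as the number of labs n grows:

```python
# Illustrative only: hypothetical numbers, not estimates discussed in the episode.
def prob_at_least_one_accident(p: float, n: int) -> float:
    """Chance of at least one accident in a year across n labs, assuming independent labs."""
    return 1 - (1 - p) ** n

# With an assumed 0.1% annual accident chance per lab:
for n in (10, 100, 1000):
    print(f"{n:>5} labs -> {prob_at_least_one_accident(0.001, n):.1%}")
# ~1.0% at 10 labs, ~9.5% at 100 labs, ~63.2% at 1,000 labs
```

The true per-lab risk, and whether labs’ risks are really independent, are unknown — which is exactly the data gap Alison describes.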
Alison Young: It’s possible. One of the things that just is so frustrating in this arena is that nobody is even tracking how many of these labs there are. One of the biggest surprises for me when I started covering this is that the US Government Accountability Office, which is the nonpartisan investigative arm of Congress, produced reports going back more than a decade ago that said even the US government doesn’t know how many biosafety level 3 labs there are.
Part of the issue here is that it is such a fragmented area. If you are a privately funded lab, and you’re not taking government money and you are not working with a select agent pathogen, the government may not really know that you exist as a lab. They may know piecemeal — like, you might have to have workers’ compensation, or you might have to have some OSHA things, or you might have to have a wastewater permit. But you don’t have a lab permit, and so there’s no chronicling of where all these labs are.
So one of the things we did when I was a reporter on USA Today‘s national investigative team is we set out to find out how many biosafety level 3 labs we could even identify. And it was incredibly difficult. We identified a couple hundred of these labs across the country, but what it took to do that was literally googling “biosafety level 3 lab” so we could find where places advertised it. Or we looked at government grant records where they mentioned that they were using a biosafety level 3 lab or a BSL-3 lab. Or we looked at LinkedIn, where people promoted the fact that they’d worked in these labs. So this was cobbled together from an incredible number of records — something you would think the government would know.
And that’s just in the United States. I have a Google alert that is set up for BSL-3 and BSL-4 labs, so I see the press releases that go out when various countries or various universities are announcing that they’re building a BSL-3 or a BSL-4 lab. But there is no one place that policymakers or the public can go to see where these labs are, or how many there are.
Luisa Rodriguez: How can we not be tracking those?
Alison Young: There just is no mechanism. There’s a case that has gotten some recent attention out in California: a biotechnology lab in Reedley, California, which was discovered literally by a code enforcement officer in this small city. They had 1,000 mice, they had -80° freezers out there, they had all sorts of biological materials. And ultimately — I’ve been working on some reporting in this area — what the local officials have said is that the only way they were able to address this lab was to cobble together local code enforcement and other piecemeal regulations, because it was privately funded, it didn’t receive any government grant money, and it apparently wasn’t working with any select agent pathogens. There was no lab authority to go to to address the biohazards of the facility.
And this issue has come up again and again over the years, but it’s not one that policymakers have so far addressed. There has been a lot of talk, and it has been known for a long time that there are gaping holes in the oversight because of the fragmented nature of how we look at these biolabs.
There’s one other aspect of the proposed legislation that is worth pointing out: it does include a provision that asks for a biosecurity board in the US government to evaluate the effectiveness of the current Federal Select Agent Program in overseeing biorisks in this country. And it asks for proposals to, in its words, “harmonize” the various fragmented pieces — whether it’s the NIH and its NIH Guidelines; the Select Agent Program; or the recommendations in something called the BMBL, basically the biosafety manual, whose safety practices are recommendations (but not regulations). But what’s interesting is that, as written, it sounds like harmonising while leaving in place the fragmented system of multiple agencies being responsible for this kind of work.
Luisa Rodriguez: Interesting. I see. So right now, multiple agencies are responsible for regulating. What are they? Is it the CDC and the USDA and maybe some others?
Alison Young: There’s limited regulation, and that is only through the Federal Select Agent Program. So that’s a certain couple dozen pathogens that have the potential to be used for nefarious intent, whether it is by bioterrorists or by bad actors. Those actually are subject to what the public knows as “regulations”: there are inspections, and there are certain things that they must do or they’re potentially subject to fines or some sort of other regulatory actions.
Then there’s a whole series of recommendations that are entirely voluntary: they’re best practices, and you as a lab use your best judgement as to whether or not your experiment has the need for various safety equipment and practices.
And then, if you receive federal funding, there may be certain requirements as a condition of your funding that you do certain things. And that can be from the Department of Defense or the Department of Homeland Security or FDA, or whoever is funding your research. And then, if you happen to be doing some sort of experiments with a genetically altered organism, and you receive federal funding — so you have to also receive federal funding — then you are required to report incidents to the National Institutes of Health Office of Science Policy. If you have a worker safety issue, then just like any other business, you might have to report something to OSHA, the Occupational Safety and Health Administration.
But it’s all this sort of fragmented nature, with only a limited amount of it being regulatory.
Luisa Rodriguez: OK, so part of the problem is that there are maybe something like a dozen different kinds of institutions that somewhat oversee these labs to some extent, and on different issues.
Alison Young: Exactly.
Luisa Rodriguez: So no single entity is keeping track of how many accidents there are, why they’re happening, which labs are following best practices, and which aren’t. How do we develop a more high-level view of which labs are doing safe work and which aren’t, to make sure that there’s accountability and that best practices are being followed? And what would it look like to have the system be less fragmented?
Alison Young: Those who feel that the current fragmented system is problematic have proposed for many years that there should be a single federal agency that oversees safety of biological research. The model for this is what is already done by the Nuclear Regulatory Commission.
It would do two things. One is to create a single point of expertise in the federal government that looks at the safety of biological research. It would also potentially be an independent entity: one that does not engage in research itself, and one that does not fund the research it is overseeing. That is one of the areas that critics say is problematic currently in the United States: the entities that have various aspects of oversight — whether it’s regulatory, through guidance, or through their funding mechanisms — have, they say, a potential conflict of interest because they have skin in the game.
Luisa Rodriguez: Right. So the institutions that are regulating the labs doing the work are also institutions that are funding the work, and their interests are so intermingled that you might think that they’re not able to appropriately do the kinds of regulating that they’d need to do.
Alison Young: Correct. And part of the challenge here also is, I think the public would be surprised to learn that among the entities that oversee this research, particularly in the Federal Select Agent Program, the lead federal agency is the Centers for Disease Control and Prevention. The CDC is also a major research lab operator, and has had, in the past, one of the worst regulatory histories within the Federal Select Agent Program. However, you wouldn’t know that, because all of the regulatory records are kept secret. And it was only really through some very diligent reporting we did when I was at USA Today that we unearthed the fact that the CDC has secretly had some of its labs suspended from the Program.
Potential solutions [01:34:55]
Luisa Rodriguez: Let’s talk about some potential solutions. You’ve already talked about how we might be able to make these systems less fragmented, so that we’ve got more reliable oversight and accountability for these labs. Is there another policy idea that you’d be particularly excited to see implemented to reduce these risks?
Alison Young: When I’ve interviewed various experts about what they say are the most important areas to focus on in making labs safer, they focus on a couple of things. One is the idea of having a single focal point for authority and oversight. The other thing they talk about is transparency. Transparency is seen as an important mechanism to ensure that oversight authorities are actually functioning properly, and that it also creates incentives for labs and funders to ensure safety. Also, a number of experts have pointed out that the current unsolved questions over the origins of COVID-19 point to some important gaps in the various international treaties and mechanisms for investigating the potential of a laboratory accident leading to some sort of an outbreak.
Luisa Rodriguez: One thing that’s sticking out to me is that a lot of the solutions you’re most interested in have to do with regulation and oversight and the culture in these labs, not with the science. It’s kind of interesting that it seems like maybe we’ve actually got the science to make all of this much safer, and it’s a matter of humans understanding what’s at stake — humans being willing to follow protocols, to report things when they go wrong, to hold each other accountable, and to know that there are consequences when they do things that are inappropriately unsafe.
Alison Young: One of the other things that I think is important to add in addition to the regulatory areas: There are biosafety professionals who have expressed concern that there has not been enough emphasis over time or funding for the study of biosafety, the study of what works and what doesn’t work. Are we doing things that maybe are taking money and energy, but don’t add that much to the safety, that maybe could be discontinued? Are there other things that we are not doing in these labs that would make them safer? And they need funding to be able to do that, but they also need data — they need to know about the accidents that are happening and the near-misses. So that is another area where there is an opportunity for improvement.
Luisa Rodriguez: That reminds me of another topic you covered in your book, which is the history of biosafety work, and how it was kind of born from one person at the US Army’s Fort Detrick noticing that there were way more lab accidents and lab leaks than he felt was reasonable or OK. So he started tracking all of the leaks, accidents, and infections that were happening, and basically tried to develop a science of how to reduce these leaks. That meant things like noticing which types of pathogens were most likely to get people sick — he found out it was respiratory ones — and that’s what led him to do this negative airflow thing, where you suck the air out of a room to reduce the chances that someone breathes in a pathogen.
Alison Young: In the period after World War II, under Arnold Wedum — who is considered the father of modern biosafety — they did actually do a lot of applied biosafety experimentation. They published a lot of journal articles, doing everything from literally using high-speed photography to look at the spread of droplets from various activities, to taking measurements. They did a lot of testing and banking of blood from the workers. And they did a lot of that kind of applied biosafety research in that era, which led to the modern-day concepts used today.
One of the problems, modern biosafety experts say, is that a lot of the work that Arnold Wedum and his team did all those decades ago has not been built upon. It has not been moved forward, because there hasn’t been the commitment to it or the funding for it. Organisations like the National Institutes of Health spend the bulk of the money on funding the research or building the buildings, but they’re not funding applied biosafety research.
Luisa Rodriguez: Got it. So it sounds like Wedum did lay this very solid foundation, but there’s not been the additional kind of work that you would have expected over the last several decades. To say, “We want to do more. We want to figure out what’s best.”
Alison Young: Correct.
The investigative work Alison’s most proud of [01:40:34]
Luisa Rodriguez: OK, let’s move on to our last question. So you’ve investigated and exposed so many issues: lead levels in water, sports supplements, episiotomies, cruel animal treatment, and of course lab leaks, among other things. Is there a case of your work having an impact on the world that you’re particularly proud of?
Alison Young: I got into journalism because I wanted to do work that would make a difference and help lead to positive change. I am incredibly grateful that I’ve had the opportunity to do that through a variety of the stories that you mentioned. The biolabs research is something that I’m incredibly proud of.
I will say that one of the things that has led to change, and that I’m also very proud of, is a story I did not long after I got out of college. It involved an old, forgotten lead smelter in an area just outside of Dallas, in West Dallas — a site that had been declared cleaned up. The neighbourhood’s soil contamination was supposed to have been dealt with, and the EPA had said there was no problem. The local environmental officials and city officials said there was no problem.
And one day, a minister in West Dallas called me up and said there’s lead in West Dallas. My editors told me, “Oh, this guy doesn’t know what he’s talking about. All of the federal agencies said it’s clean.” And he called me again, and I went out one day, and the bottom line is there were battery chips and slag heaps — and ultimately environmental investigations found that this neighbourhood was still contaminated, and it eventually became a Superfund site.
And I’m proud of it, and I tell that story in this context — and I talk about it often with my Missouri School of Journalism students — because it taught me very early in my career that it’s so important for journalists to look at the facts, and look at them in detail, and not just assume something is one way because somebody with a PhD, or from a respected federal agency, tells you so. You have to independently look at those facts and determine as best you can what is actually the truth.
I would say that that is something that has been a guiding light for me throughout my career. And it’s difficult. It is very difficult to be a journalist or a scientist or anyone else who is reporting on information that maybe runs counter to the official narrative — but that’s something that is so important that we do, and it’s our job as journalists to get to the truth.
Luisa Rodriguez: Yeah, I did have this reaction while reading your book, that you just have to be disagreeable. Or not even disagreeable, but you have to open yourself up to people being really angry with you for exposing things that make them look really bad. And it just seems like a very brave thing to go into.
So thank you for doing that, and thank you for coming on the show. We’ll leave it there for now. My guest today has been Alison Young.
Alison Young: Thank you so much for having me.
Luisa’s outro [01:44:05]
Luisa Rodriguez: It’s been an exciting time for AI safety, between the recent executive order in the United States and the UK Government’s creation of the new AI Safety Institute.
I wanted to let you know about an opportunity to join the community of people working on these problems and make a career transition into AI safety.
The Astra Fellowship is a programme that pairs fellows with experienced advisors to collaborate on a two- to three-month AI safety research project.
The list of advisors includes past podcast guests Richard Ngo from OpenAI, Ajeya Cotra and Tom Davidson from Open Philanthropy, and Robert Long from the Center for AI Safety.
There are around 10 other advisors, including ones from organisations such as Redwood Research, whose work was presented last week at the UK government’s AI safety summit, and ARC Evals, whose work was highlighted in a statement by Barack Obama last week.
Astra Fellows will be part of a cohort of researchers working out of the Constellation offices in Berkeley, California, allowing them to exchange ideas with the AI safety researchers who work there.
The programme will take place between January and April 2024, and the deadline to apply is fast approaching: Friday, November 17 — so apply soon.
If you’re interested in applying, search for “Astra Fellowship”; we’ll also put a link in the blog post associated with this episode.
All right, The 80,000 Hours Podcast is produced and edited by Keiran Harris.
The audio engineering team is led by Ben Cordell, with mastering and technical editing by Milo McGuire and Simon Monsour.
Additional content editing by myself and Katy Moore, who also puts together full transcripts and an extensive collection of links to learn more — those are available on our site.
Thanks for joining, talk to you again soon.