
Self-Experimentation in the Time of COVID-19

Scientists are taking their own vaccines, an ethically murky practice that has a long and sometimes celebrated history in medicine.

Long before there were rumors of COVID parties (decline your invitation), there were actual “filth parties,” and the guest list was exclusive. Joseph Goldberger, an infectious disease expert in the US Public Health Service, was tasked in 1914 with determining the cause of pellagra, a skin disease known for its four Ds—dermatitis, diarrhea, dementia, and death.

Many physicians at the time believed pellagra stemmed from an unknown germ, but Goldberger felt strongly, correctly, that it was the result of a nutritional deficiency. To prove it, he and his wife Mary held small gatherings during which they and a few brave volunteers injected themselves with the blood of pellagra victims and ate the feces and urine of patients in pill form, what Mary called “the most nauseating diabolical concoctions,” according to a 2014 book detailing Joseph’s work. Goldberger repeated this spectacle multiple times in cities throughout the country to press his case. He died of cancer in 1929, but before then his work earned him four Nobel Prize nominations.

History is spattered with sometimes gruesome examples of medical self-experimentation, some of which have netted notoriety and reward, and the COVID-19 pandemic is no different. Across the world, researchers are offering up their own bodies to science in pursuit of a vaccine to prevent infection with SARS-CoV-2, a virus that has so far killed more than 700,000 people and sickened more than 18.5 million worldwide.

“In the history of medicine, [self-experimentation] has certainly been a well-recognized tradition,” says Susan Lederer, a professor of the history of medicine and bioethics at the University of Wisconsin. “I would argue . . . it was almost required. The fact that you would risk it on your own body, or on your own children, was a sign of your good faith.”

In one of the more famous examples, Jonas Salk, a virologist at the University of Pittsburgh, first tested his polio vaccine on himself and his children in 1952 before giving it to strangers. Marina Voroshilova and Mikhail Chumakov, a married pair of Russian polio experts, likewise self-administered a potential vaccine in 1959 before giving their three sons sugar cubes laced with weakened poliovirus. After a contentious debate over how to design and administer the vaccine, the consensus settled on an oral vaccine using live poliovirus. The disease was declared eradicated in the Western Hemisphere by the World Health Organization in 1994.

Yellow fever vaccine development also inspired great self-sacrifice. A research team led by US Army physician Walter Reed arrived in Cuba at the end of the 19th century during the Spanish-American War to study the virus, which killed 13 soldiers for every one killed in battle. To establish that the disease was transmitted through mosquitoes, several of Reed’s colleagues intentionally exposed themselves to mosquitoes that had previously fed on victims of yellow fever. Many of them got sick, and one man, Jesse Lazear, subsequently died. This early work prompted several successful mosquito-eradication programs that greatly reduced the number of cases, but it wasn’t until 1930 that Max Theiler, a virologist with the Rockefeller Foundation, began developing a vaccine that he first tested on himself. For his discovery, he was awarded a Nobel Prize in 1951.

Now, in the time of COVID-19, researchers in the United States have again begun sharing their experiences of testing their own vaccines. MIT Technology Review reports that Preston Estep, a cofounder of the citizen science initiative Rapid Deployment Vaccine Collaborative (Radvac), developed a nasal coronavirus vaccine and joined at least 20 other researchers, including Harvard Medical School geneticist George Church, in self-administering it. Last month, they shared details of their vaccine for others to copy, and have since lost track of how many people have used it.

Beyond the US, Chinese news media widely covered statements made in February by Huang Jinhai, an immunologist at Tianjin University, who claimed he had taken four doses of a vaccine developed in his lab even before it had been tested in animals. In late July, the head of the Chinese Center for Disease Control and Prevention, Gao Fu, said in a webinar that he too had been injected with an experimental vaccine, adding, “I hope it will work.”

Similarly, the director of the Moscow-based Gamaleya Research Institute, Alexander Gintsburg, made headlines when he claimed to have tested a new COVID-19 vaccine on himself ahead of the start of human clinical trials. Days before, Kremlin spokesman Dmitry Peskov had called self-experimenting scientists “fanatics in what they do in the best sense of the word.”

Nowadays, it isn’t only expert virologists who have access to the materials needed to create new vaccines. So-called “biohackers,” some of them scientists who have left academia to form independent groups, apply a DIY attitude toward manipulating the human body. Mild biohacking might be as simple as monitoring one’s sleep or exercise, but its more extreme forms can involve implanting computer chips under the skin or injecting oneself with CRISPR DNA, as Josiah Zayner did in 2017.

Zayner and another biohacker, Justin Atkin, have publicly hinted at or shared plans to inject themselves with DIY coronavirus vaccines and document their experiences on social media. Several videos shared as part of an ongoing investigation by MIT Technology Review reporter Antonio Regalado suggest that as many as 10 people may have already begun self-administering vaccines, although this cannot be confirmed.

Because they mix the vaccine themselves and administer it only to themselves, groups such as Radvac and biohackers like Zayner and Atkin have so far avoided the need for regulatory approval. But if subjecting oneself to an untested vaccine seems ethically or legally murky, that’s because it is.

There is no mention of self-experimentation in the Declaration of Helsinki—a set of ethical principles put in place by the World Medical Association in 1964 to govern human experimentation—or in the Nuremberg Code, a separate set of research ethics set down after atrocities committed during World War II. In a 2012 study detailing the history of self-experimentation in medicine, cardiologist and medical historian Allen Weisse claimed to have written to the US Food and Drug Administration (FDA), the National Institutes of Health (NIH), and the Institute of Medicine about their policies and received no replies.

The practice isn’t explicitly forbidden—it’s hard to argue against success—but it certainly isn’t encouraged. All research carried out on human subjects in the US must be approved by an institutional review board under the 1974 National Research Act, and most agencies also abide by the “Common Rule” requiring informed consent from participants and providing extra protection for at-risk groups such as prisoners, pregnant women, children, and fetuses. Otherwise, researchers can put themselves forward as candidates for treatments just as anyone else might.

The moral and ethical views surrounding self-experimentation therefore seem to be largely dictated by the medical community itself, and the practice has seemingly been falling out of favor, at least among Americans.

Lederer says that people now “look askance” at scientists who subject themselves and their families to unregulated treatments. Weisse’s study identified 465 examples of self-experimentation in medical research in the 19th and 20th centuries, but only 82 of those instances took place between 1950 and 1990. “The trend in recent years toward collaborative studies, often on a massive scale, makes self-experimentation by a single individual, tucked away in his laboratory, seem almost quaint, a relic of the past,” Weisse explains in his study.

Still, the last Nobel Prize awarded for work involving self-experimentation was only five years ago, when Tu Youyou was honored for developing an anti-malaria drug that she first tested on herself. Lederer says it’s likely that self-experimentation more often goes unreported nowadays, even as it still happens.

Amid the frenzy to develop a COVID-19 vaccine and the notoriety gained by biohacking, Lederer points to history as a sobering counterpoint. After testing his polio vaccine on himself, Salk launched a massive trial involving more than 200,000 children. In what became known as the Cutter incident, one of the companies charged with manufacturing the vaccine failed to fully inactivate the virus, resulting in 40,000 cases of polio and 10 deaths.

Self-experimentation is likely to remain a murky, fringe endeavor. Reflecting on more than 200 years of medical advancement, Weisse shared this parting thought: “My own conclusion is that, despite some unwise decisions in the past to indulge in this activity, many self-experiments have proved invaluable to the medical community and to the patients we are seeking to help. Therefore, rather than scorn such intrepid colleagues in their search for truth, I am inclined to salute them.”

Trust in medical science and in vaccines in the United States is low. Only 50 percent of Americans have said they would take a COVID-19 vaccine if it were developed. Self-experimentation has long been a way of reassuring the public, and indeed, Gao told the Associated Press he had offered himself up to instill public confidence in vaccines, especially at a time when social media spreads misinformation seemingly faster than the virus itself.

Source: https://www.the-scientist.com/news-opinion/self-experimentation-in-the-time-of-covid-19-67805
