February 1975: After a series of back-and-forths, of carefully worded letters written to scientific journals, and occasionally hyperbolic public discussion about the appropriate path for the future of DNA research, a handful of scientists had recommended a voluntary and temporary moratorium on certain types of experiments until a consensus could be reached. The scientists agreed that a conference should be convened, and Berg was appointed its leader. He chose to hold it at Asilomar, not far from the Stanford campus. Roughly 150 scientists from around the world flew to California and spent several days engaged in a spirited and often contentious debate. One writer would later call it “the Woodstock of molecular biology.”
How much risk was too much? Luminaries like James Watson insisted at Asilomar that the fears were overblown, and that restricting science would only make things worse. Some, Berg recalls, accused others of engaging in precarious research rather than casting a critical eye on themselves. But after several days of discussions, a breakthrough came when legal experts were asked to testify about the consequences of an experiment gone wrong. “They brought up the idea of, ‘What if you’re doing experiments that are identified as potentially dangerous, and something bad happens?’” Berg says. “‘Then Congress would come in and impose. And you guys will be sued.’ So everyone began to feel we had to do something.”
Berg and several others stayed up all night on the last evening of the conference in order to draft a statement. In it, they agreed to lift the moratorium on certain experiments, but suggested strict guidelines on how those experiments should be conducted. Fail to follow those guidelines, and you’d lose the money granted to you by organizations like the National Institutes of Health.
“In short order, these recommendations became the basis for rules adopted around the world,” wrote Alex Capron, one of the legal experts who testified at Asilomar, in a 2015 New York Times op-ed. “‘Asilomar’ came to be shorthand for the social responsibility of science.”
And that, Berg says, is a level of impact he could never have anticipated.
“It’s interesting that Asilomar has now become a paradigm which is almost implanted in the consciousness of research,” he says. “We felt like a bunch of amateurs at the time. We bungled things. But we built some sort of edifice.”
Berg never envisioned, early in his career, that he’d become so enmeshed in the public policy surrounding science itself. But in the years before and after Asilomar, he became a prominent voice, consulting with experts in other fields as they sought ways to confront their own ethical issues without blocking the path to advancement. In the mid-2000s, he advocated strongly against a ban on embryonic stem-cell research. And in early 2019, he signed on to a letter that advocated a global moratorium on genetically engineered humans in the wake of a Chinese scientist’s use of what’s known as CRISPR technology to modify the genes of a pair of twin embryos.
There are, of course, tremendous possibilities inherent to CRISPR technology itself, particularly for curing disease. The problem, Berg says, is that the Chinese scientist took it a step further: because he edited genes in the "germline" of those children, the modifications will be passed on to the next generation, rather than being limited to a single person. And that step introduced the kinds of risks that feel, to Berg, as if they're still far beyond modern science's understanding. What are the potential cascading effects of altering such genes over the course of generations? What are the unintended impacts? Could it introduce new types of disease, or create new and frightening complications?
“We don’t really know,” Berg says. “Let’s not be arrogant about this and say, ‘We think that this is a good thing to do.’ Clinical trials and things like that should be part of this. But we should be very cautious about proceeding.”
This is the challenge in creating a code of scientific ethics: If it’s too restrictive, it can impede progress itself. Nearly any technology or scientific advancement in human history carried the potential to be perverted for dark purposes, Berg says, as those two words—TOO MUCH—linger in the background. But often, the benefits of progress outweigh the harm. It’s a matter of balancing research and responsibility.
“And I think,” Berg says, “we’re just going to have to live with that.”
This feature appeared in the March/April 2020 issue of the Penn Stater. Paul Berg died Feb. 14, 2023. He was 96.
Michael Weinreb is the author of four books, including Season of Saturdays: A History of College Football in 14 Games.