Bio-Terror Scapegoats: Africa, Agriculture (Food & Animals), Airports & Air Travel, Al Qaeda, Bio Labs, Bio-Terrorism Is Easy, Bio-Terrorists (Bio-Hackers), Black Market, Bugs & Insects, Censorship / Lack Thereof, Domestic Terrorists, Exotic Animals (Zoonosis), Government Ineptitude, Mail-Order DNA, Mexico, Missile Shield Failure, Mutation, Natural Disaster, No Clinical Trials (Vaccines), and The Monkeys.
Date: October 11, 2002
Abstract: In 1863, when Abraham Lincoln was waging war against renegade Southern states, the U.S. Congress recruited the nation's top scientists to help. They formed a private body, the National Academy of Sciences, that has advised presidents and Congress ever since, through two global wars and the threat of nuclear annihilation. Now, as the United States finds itself again in battle, the academy has broken with its independent tradition and allowed the federal government to assume unusual oversight power.
Earlier this year, an academy committee was finishing a report on agricultural bioterrorism that found the nation markedly ill-prepared for countering an attack against livestock and crops.
The report identified broad weaknesses, especially within the U.S. Department of Agriculture, which had paid for the study. Mindful of the threat of bioterrorism, the report's authors had taken pains not to divulge any information they felt would compromise the nation. More than a dozen scientific reviewers, including specialists from the U.S. Army and the Federal Bureau of Investigation, raised no security concerns about the study. The White House Office of Homeland Security, in its review, also determined that the report contained no classified data.
Nonetheless, the office and the Agriculture Department asked the academy to withhold the report from the public indefinitely, to keep potentially dangerous information away from enemies of the United States.
Academy officials say they struggled to balance security concerns against the need to advise the nation about a serious threat. The academy's leaders eventually made the extraordinary decision to cut out a substantial section of the document. The incomplete report, including an explanation of the self-censorship, was published last month on paper, but it is noticeably missing from the academy's Web site, which makes other reports available for free downloads. The academy is providing Congress, the administration, and select others with access to the report's missing parts, which detail scenarios for agricultural attacks.
The excisions deeply disappointed R. James Cook and some other members of the committee that had written the report. "We in the scientific community depend on openness to do our work, and we depend on being able to evaluate everybody's results," says Mr. Cook, a professor of plant pathology at Washington State University. "That's how we wrote that [report]. The strategy we seem to be up against from the standpoint of the Office of Homeland Security, and even of a lot of the legislation that's already been passed, is a strategy of security and secrecy and protection. So the two don't match."
The conflict reaches far beyond the marble halls of the academy, potentially spilling into every laboratory in the United States, especially those in the medical and biological sciences that investigate diseases. Congress and the executive branch are considering plans to restrict scientific communication in order to prevent the spread of information that could be used in terrorist attacks, such as the anthrax letters of last year. But some of the proposals have sent a chill through academe.
"We are concerned that there may be, in the future, a gutting of some of our publications," says Mr. Cook.
"This could change the very definition of science," says Ronald M. Atlas, president of the American Society for Microbiology and a professor of biology and dean of the graduate school at the University of Louisville. "It can really alter the way we communicate as scientists and who has access to information."
But national-security experts warn that the current system of openness in science could lead to dire consequences. Some papers already published "can be used for nasty, evil, illicit purposes by criminals or terrorists," says Raymond A. Zilinskas, a microbiologist and specialist in biological weapons at the Monterey Institute of International Studies. "Some of the more technical papers could be used by national bioweapons programs, by, for example, Iraq or North Korea."
A Tug of War
Mr. Atlas says he recognized the coming crisis within minutes of learning about the anthrax attacks last October. His association includes members who study anthrax, smallpox, Ebola, and other highly infectious and deadly diseases.
Meeting a few days later with other leading microbiologists, Mr. Atlas posed two questions: "What do we need to do within the scientific community to ensure that we're not providing the information that these terrorists would be using? And at the same time, how do we provide legitimate information to the research community so that we find the next vaccine and cure?"
The tug of war has grown stronger since then. The introduction to the academy report on agricultural terrorism says it appears "in the midst of a vigorous national debate" over the issue of so-called sensitive information -- a vast gray zone of material that is not classified but might, in theory, provide aid to our enemies.
The debate reached the national stage this past summer, following a series of events:
1. In late May, the Proceedings of the National Academy of Sciences published a study by scientists at the University of Pennsylvania that provided details about how smallpox uses a protein to evade the human immune system. An editorial in the journal noted that there had been calls not to publish such observations because of fears that terrorists might use the information to cook up new bioweapons.
2. In June, the Massachusetts Institute of Technology released a report endorsing openness as one of the fundamental values of the university. Among other findings, the report stated that faculty members were facing increasing restrictions on access to scientific information.
3. In July, Science published a paper in which scientists at the State University of New York at Stony Brook described making poliovirus from mail-order DNA. The publication of that study spurred Rep. Dave Weldon, a Florida Republican, to introduce a resolution criticizing Science for publishing "a blueprint that could conceivably enable terrorists to inexpensively create human pathogens for release on the people of the United States." The resolution, which is still in committee, calls on the executive branch to review policies on the publication of federally financed research.
4. In August, officials from the White House Office of Management and Budget met with scientists and lobbyists to discuss new restrictions on what kind of information could be published by scientists employed directly by the federal government.
The issue of sensitive information has academic scientists and government officials fumbling like newlyweds in an arranged marriage, unsure of how to deal with each other but certain that they must. The tension has produced little agreement and big fears about the dangers that come from leaning too far in the direction of either openness or restrictions.
A case study of those concerns might well focus on a paper published in February 2001 in the Journal of Virology. In that study, Australian researchers tried to use a relative of smallpox, called mousepox, to render mice infertile. The strategy was to insert a gene for a molecule called interleukin-4 into mousepox, hoping it would stimulate the mouse's immune system to block reproduction and thereby keep pest populations in check. But the pathogen ended up being the ultimate contraceptive, killing the mice that normally were resistant to mousepox. Even those mice that had been vaccinated against mousepox died of the engineered virus.
To some, the interleukin-4 paper provided an instruction booklet for enterprising terrorists. "There are people who were horrified," says Mr. Atlas, whose organization publishes the journal. "The paper told of the potential that you could manipulate smallpox, if you had access to smallpox, and you could create a more deadly strain that could circumvent our current vaccine." When viewed that way, he says, "that's dangerous information."
After the anthrax attacks, the criticism of Mr. Atlas's group intensified. In meetings with other scientists and with government officials, he heard the study denounced as "the greatest mistake we ever published."
Mr. Atlas was not president of the society at the time of the paper's publication, and he played no role in accepting it. Nonetheless, he defends the report, saying that it raised important points the public needed to know. "That paper said to me that you can't rely on vaccination as your only line of defense against smallpox." When a panel of the National Academy of Sciences met to discuss future security measures, Mr. Atlas used the interleukin-4 paper to push for more research on antiviral drugs that could work even on people already infected with smallpox or other viruses. What's more, the paper placed a renewed importance on quarantine procedures, he says.
Samuel Kaplan, head of the society's publications board, calls the paper an example of excellent science. "If we had to do it all over again, we would proceed to publish it, notwithstanding the flak that it has taken," says Mr. Kaplan, chairman of the department of microbiology and molecular genetics at the medical school of the University of Texas Health Science Center at Houston.
Changing the Rules
The controversy did, however, spur the organization to change its review policy at the start of 2002. The editors of its 11 journals and its book division now look out for sensitive information as they examine manuscripts. If a question comes up, the editors of a particular journal consult and can, if necessary, bring the issue before the society's entire publications board. The members of the board can then decide whether to send the paper on for external review or to tell its author of their concerns.
So far, the society's journal editors have flagged more than 50 papers dealing with such topics as the anthrax bacterium, the botulism toxin, and the Ebola virus. But only two papers have triggered greater concern among the board members, says Mr. Kaplan. Even in those cases, however, the society has proceeded with the review process. The editors have not yet deemed any paper unpublishable, he says.
After adopting the new policy, journal editors at the microbiology society encountered an unanticipated problem: Several authors decided on their own to withhold important pieces of information from their papers because of concerns about giving away too many details to terrorists, they told the editors. But that meant the papers lacked sufficient information for other scientists to replicate the experiments -- procedures that form the foundation of modern science.
The society has always required authors to provide such information, but it had never explicitly said so in its publication policy, says Mr. Kaplan. So the group revised the policy once more this past summer.
Some authors of the problem papers have agreed to supply the missing information, while others have declined. In one instance, an incomplete paper slipped through and was published; the journal now plans to publish the missing data.
As events were heating up over the summer, Mr. Atlas found himself making many trips to Washington to discuss the issue of how to handle sensitive information. In most meetings, participants mentioned the interleukin-4 paper, the smallpox-protein report, or the poliovirus study, and debated their risks and benefits.
Given the consternation and the possibility of new federal restrictions on publishing, Mr. Atlas called for backup. He asked the National Academy of Sciences to organize a meeting of scientific publishers to discuss the issue. The academy agreed, but it realized it would need the input of security experts to put together comprehensive recommendations. So it decided to collaborate with the nonprofit Center for Strategic and International Studies to organize the meeting, tentatively scheduled for December.
But that center has run into its own publication problems of late. Earlier this year, the group wrote a report on the nation's response to last year's anthrax attacks and submitted it to the Defense Threat Reduction Agency, which had paid for the study. To produce the report, which recommended ways to improve the response to any future attacks, the center used only publicly available information. But the Defense Department shelved the document, saying it was not to be released.
The center's staff has argued to reverse that decision. "This is really bad policy," says the report's author, David Heyman.
A Slippery Slope
In public, the White House has discussed restricting publications of federal scientists only. But the Defense Department confirms that it is considering whether to require all researchers financed by the department to submit papers for review prior to publication. Other agencies are also considering changes to their review policies. More restrictions could emerge if academic scientists do not adopt voluntary measures, which could come in several forms, says Mr. Atlas.
In one scenario, publishers would manage access to information, keeping sensitive material out of the open literature and allowing only certain people to view it -- the path chosen by the National Academy of Sciences with the recent report. In another plan, authors would submit papers to a security agency that would decide whether to scrub out information, classify an entire article, or let it go through unchanged.
Such policies, however, could create major problems, says D. Allan Bromley, a professor of the sciences at Yale University who served as science adviser to President George H.W. Bush. "I'm not at all happy with federal involvement."
The category of sensitive but unclassified, he says, "is a rather slippery slope because the question then becomes, Who decides whether it's sensitive or not? Bureaucrats can cause considerable damage if they start trying to make that decision rather than the scientists themselves."
He cites the Department of Energy and its national laboratories, which struggled through their own version of the current debate in the late 1990s. The crisis struck after the labs suffered several high-profile security lapses, including the possible loss of nuclear secrets to China.
The department clamped down on the labs, in part by restricting access to sensitive but unclassified information. Mr. Bromley served on a commission that examined the effect of those policies and found serious problems. The definition of "sensitive but unclassified," for example, remained fuzzy and differed from place to place, so scientists and security officers had a hard time developing clear standards. Morale plummeted among scientists, as did their productivity.
Mr. Bromley worries about applying such an approach to universities. "That would have a chilling impact on the training of students -- undergraduate, graduate, postdoc -- and on the conduct of U.S. research generally."
He suggests instead that individual scientists and journals watch out for dangerous information, much as they did in the early 1940s, when physicists agreed to keep a lid on nuclear-fission and microwave research.
Controlled Free Speech
The current debate mirrors one that 20 years ago engulfed mathematicians and computer scientists who did research in cryptography, the making and breaking of codes. During the mid-to-late-1970s, cryptographers were developing significant new algorithms, and they attracted unwanted attention from the National Security Agency.
At first, the agency adopted an ironhanded approach. It pressured a researcher to pull a paper from a conference, and it threatened to seek restrictions on publications if the cryptographers did not voluntarily censor themselves, says Susan Landau, a senior staff engineer at Sun Microsystems and a co-author of Privacy on the Line: The Politics of Wiretapping and Encryption (MIT Press, 1998). The American Council on Education intervened and recommended that cryptographers voluntarily submit their papers to the agency for review.
The result now is that some researchers submit and others do not, says Ms. Landau. Some people have altered their papers at the agency's request, she says, and at least one person she knows has withheld a paper at its suggestion. But the agency did help lift a restriction that the U.S. Army had unnecessarily placed on the work of one cryptographer, she says.
"Some people will say it's an uneasy peace" between the cryptographers and the agency, she adds. "Others say it's a good working relationship. There is not the kind of tension as there was 20 years ago. There seems to be a mutual respect."
Administration officials have asked biologists why they can't live with the same restrictions that cryptographers and nuclear scientists have accepted. An obvious reason is the difference in numbers. The main journal in cryptography, the Journal of Cryptology, publishes fewer than 20 papers a year, and researchers present about 125 papers at conferences annually. The American Society for Microbiology, however, publishes about 6,000 papers a year in its journals.
Even more important, says Mr. Atlas, are the consequences of suppressing information. "These are public-health issues," he says. "If we fail to communicate information vital to public health, then people die."
The definition of sensitive information could be quite broad, because all such data have good and bad sides, health researchers say, especially as they start designing drugs aimed at particular spots on the genome. "Every target we have for drug therapy is also a potential target for manipulation by a terrorist," says Mr. Atlas. "That's the Catch-22 of all this."
But others argue that the gray zone of truly sensitive information is actually quite narrow. "We're talking about a very small subset of papers that would not be completely published," says the Monterey Institute's Mr. Zilinskas, who recently organized a conference on the issue that brought together scientists, Defense Department officials, and journal editors.
And not all academic scientists are displeased at the prospect of tighter restrictions. Harley W. Moon, a professor of veterinary pathology at Iowa State University and the chairman of the National Academy panel that wrote the recent agriculture report, says he understands the academy's decision to leave out information that his committee had deemed safe. "I think it was the right process," he says.
Some researchers even denounce the journals' policies of openness. "That's fine in theory, but we live in a new world right now, a world with the threat of bioterrorism," says Richard F. Meyer, director of a laboratory at the Centers for Disease Control and Prevention that designs tests to quickly determine the presence of bioweapons such as the anthrax bacterium.
Mr. Meyer ran afoul of the publication policies when he and colleagues submitted a paper on a smallpox-detection test to the Journal of Clinical Microbiology, published by the American Society for Microbiology. The journal initially refused to publish the report because the researchers did not list the specific DNA sequences they had used to identify the virus. "It makes absolutely no sense to me to give away the shop by giving out all the information that would allow someone with the knowledge and sophistication to create a genetically altered organism that perhaps could bypass our detection strategies," says Mr. Meyer, who notes that he is not speaking for the CDC.
His group finally put the information in the article, because it was an experimental technique and was not being used by the government to detect smallpox. In cases where the stakes were higher, however, Mr. Meyer has withheld the information. "The way things are going now," he says, "unless there's a change in attitude from the journals, it's really going to prevent people in this area of work -- the bioterrorism arena -- from going forward and publishing anything." That would cut off needed information for public-health officials, and could limit its value to prosecutors bringing alleged terrorists to court, he says.
Another scientist at a national laboratory encountered problems with the same journal when editors insisted that his team provide complete DNA sequences. "For reasons that are obvious to us at least, this was not possible," he says. "Letting potential terrorists know exactly what portions of a genome we are using for detection signatures is not worth whatever benefit is gained by having a publication."
Security concerns were not the only roadblock in this case, admits the scientist, asking not to be identified. The research group withheld information on one of the organisms in the paper in order to protect the intellectual-property rights of the university that runs the laboratory. The team plans to rewrite its paper using organisms not encumbered by security or commercial restrictions and then resubmit it to the journal.
Still, the scientist blames the journals for having unrealistic policies. "It appears that the journals themselves have been caught ill-prepared to deal with these kinds of issues that arise when research science -- and scientists -- get thrust onto the front lines of a war against terrorism."
The Wrong Kind of People
Most scientific leaders, however, support openness and fear that the restrictions will chill more than just publications. The federal government has removed volumes of data from the World Wide Web, and it is prohibiting certain scientists from having access to even unclassified information.
Federal regulations bar the export of hardware, software, and information related to military technology, but the rules exempt basic and applied scientific research at universities. Now, there is growing concern that the government will begin to apply the same strict regulations to university research in sensitive areas, according to MIT's recent report "In the Public Interest," which examined the issue of access to scientific information. Such rules could forbid scientists to discuss research with foreign colleagues and students.
Indeed, the MIT report says that "the designation 'No Foreign Nationals' is often placed on scientific and technical material, and access to such materials and meetings discussing them is restricted. Clearly, such restrictions are not compatible with the educational environment at MIT."
As new laws and regulations emerge, the university will comply with them, says Sheila E. Widnall, an MIT professor of aeronautics and astronautics and head of the commission that wrote the report. "But we always reserve the right not to do research in an area that is heavily impacted by rules, regulations, and legislation."
"It's a very slippery slope, once one turns the coin over and decides that scientific openness is a threat," adds Ms. Widnall, who served as secretary of the Air Force under President Bill Clinton. "Then there really is no limit to what can be regulated and restricted. And it's a failure to understand the conditions under which science advances and the benefits that flow to our society, both in terms of actual results and in terms of the people who are educated."
Because foreign graduate students play important roles in many top research programs, restricting their activities would force major changes on American universities and would slow scientific progress here. What's more, international students bring back American ideals of openness to their home countries when they return, says Ms. Widnall.
In the wake of last year's airplane and anthrax attacks, the push to restrict information has picked up speed, and scientists must make their voices heard to avert draconian policies, says Margaret A. Hamburg, a bioweapons specialist and vice president for biological programs at Nuclear Threat Initiative, a group working to reduce the threat of nuclear, biological, and chemical weapons.
At the same time, however, she says that some scientists are unaware of the threats that might arise from their work. "There needs to be a deepening appreciation about just how powerful the tools of science are in the brave new world we live in," says Dr. Hamburg, who was New York City's health commissioner at the time of the 1993 World Trade Center bombing and who later served as an assistant secretary of the U.S. Department of Health and Human Services.
The CDC's Mr. Meyer is even blunter: "American scientists are virtually naive to certain things."
Some scientists do discount the possibility that their work might pose a threat, despite a chorus of criticism. Ariella M. Rosengard, an assistant professor of pathology and laboratory medicine at Penn, caused a stir this past summer, when she published her paper on creating a smallpox protein.
Dr. Rosengard remains unrepentant, however, saying that scientists must disseminate their work in order to cure diseases. "We need to galvanize the scientific community to develop safer vaccines and therapies, not to make it so difficult that scientists say there are so many restrictions that I'm going to study something else. Because then the terrorists really do win."
PAPERS UNDER FIRE
Three recent scientific papers have drawn criticism, and support, in the debate over whether scientists should publish information that might help bioterrorists.
By Ronald J. Jackson and colleagues at Australia's Commonwealth Scientific and Industrial Research Organisation and Australian National University
Journal of Virology, February 2001
In trying to develop a mouse contraceptive to control pest populations, the researchers inserted a gene for an immune-system molecule called interleukin-4 into the mousepox virus. Instead of rendering mice infertile, the engineered virus was far more deadly than the natural strain, killing even mice that had been vaccinated against mousepox.
The technique described in the paper could be used to make a more powerful smallpox that could kill people vaccinated against the virus. "That paper shouldn't have been published," says a biodefense researcher at a national laboratory. "You don't want to publish how to make an organism more virulent."
By publishing, the Australian scientists alerted the world to the possibility of much more deadly diseases. "The best protection against any misuse of this technique was to issue a worldwide warning," says the director of the research center that performed the work. "We also want researchers to use this knowledge to help design better vaccines."
By Ariella M. Rosengard and co-workers at the University of Pennsylvania
Proceedings of the National Academy of Sciences, June 25, 2002 (online edition, May 28)
The team took a protein from a relative of smallpox and altered it to form a smallpox protein. In test-tube studies, the researchers studied how the protein turned off human immune molecules.
All scientists contacted by The Chronicle said that the paper should have been published, but some noted that bioweaponeers could put the smallpox protein studied by Dr. Rosengard into a relatively innocuous virus, rendering it more deadly.
"The potential for good in doing this kind of research greatly outweighs the bad," says Dr. Rosengard, who emphasizes that she studied just one protein. "One protein does not make a virus. They have thousands of proteins, and they have several hundred for evading the human immune response."
By Eckard Wimmer and researchers at the State University of New York at Stony Brook
Science, August 9, 2002 (online edition, July 11)
The scientists used the genetic sequence of poliovirus to order pieces of DNA from a company. By patching the pieces together and putting the complete DNA chain into a soup of cellular molecules, the team created poliovirus particles capable of paralyzing and killing mice.
A resolution was introduced in the U.S. House of Representatives criticizing Science for publishing a potential blueprint for terrorists. Although poliovirus would not make a good weapon of mass destruction, Raymond A. Zilinskas, a bioweapons specialist at the Monterey Institute of International Studies, says that the paper could help a motivated nation to assemble other small viruses that would be suitable as biological weapons.
The poliovirus created by the team was 1,000 to 10,000 times as weak as the natural form and may offer clues on how to make new vaccines for poliovirus and related viruses, says Mr. Wimmer. What's more, the funds for the research came from the Defense Department, which would not have financed unclassified work with dangerous applications, he says (UCLA, 2002).
Title: U.S. May Classify Some Data On Disease Due To
Date: January 10, 2003
Abstract: With germs looming as potential weapons of mass destruction, biomedical scientists can expect some federally funded projects to become classified under national-security law, a White House science aide warned.
Such restrictions, historically more common to nuclear research, now apply equally to disease research, said John Marburger, director of the president's Office of Science and Technology Policy, addressing a meeting on science and security at the National Academy of Sciences in Washington.
Under the law, researchers applying for federal grants would be notified upon approval of funding if their projects are classified, meaning the results can't be shared or published.
Balancing national security with the freedom to share research results will be tough, Mr. Marburger said. Of greatest concern is how to handle new research on deadly bacteria and viruses. Much basic research has therapeutic applications leading to new vaccines or drugs, but can also be diverted into germ weaponry.
Ron Atlas, head of the American Society for Microbiology, issued a plea against scientific censorship. Still, he and others have been haunted since the Sept. 11, 2001, attacks by potential misuse of medical discoveries.
"Every time we move toward a cure or identify a gene, we give information to terrorists," Dr. Atlas said at a biosecurity meeting in November in Las Vegas. "I don't know where we'll end up," he said. "Everything tears at our fundamental values."
Still, putting disease data in terrorist hands is unthinkable, said Gary Fleisher, a professor at Harvard Medical School. "I don't think anyone would say during World War II that we should have published information on the atomic bomb so the Nazis could use it. We are at war. [Terrorists] will use it to kill Americans."
Unusual scrutiny of people in biomedical circles already is taking place. Dr. Atlas said he gave a list of the microbiology society's 42,000 members to the Federal Bureau of Investigation during its anthrax probe, but wouldn't supply other information without a subpoena. At Biosecurity 2002, the November meeting hosted by Harvard in Las Vegas, sponsors sought a Federal Bureau of Investigation agent's help in "surveillance" of the list of attendees, said Miles Shore, a professor at Harvard Medical School.
Screening was done to allay fears that "someone with evil intent would sign up for the meeting" to hijack data for weapons or to attack the meeting itself (UCLA, 2003).
Title: Journals To Censor Bioterrorism Data
Date: March 14, 2003
Source: Down to Earth
Abstract: In response to worries about bioterrorism, prominent journals have decided to take security issues into account while reviewing research papers. Details of published studies that might help terrorists make biological weapons would also be deleted. The 'Statement on Scientific Publication and Security' was endorsed by journals like Science, Nature and The Lancet.
At present, research is reviewed for accuracy. The journals are now amending this process to include the assessment of the security implications. The journals would establish their own expert panels to review the papers. Editorial boards would work with the authors to make specific changes and "tone down the research papers".
Experts say that the restraint is not enough, as there is nothing to stop scientists from posting their research on the internet. Ronald Atlas, president of the American Society of Microbiology, emphasises that the process does not give governments a formal role in vetting research. The statement was partly aimed at staving off calls to make the US government a prime scientific censor board (Down to Earth, 2003).
Title: Will Bioterror Fears Spawn Science Censorship?
Date: April 25, 2007
Abstract: Since September 11th, people have been increasingly worried about the misuse of legitimate scientific research to create dangerous weapons or to bypass security measures. Now a federal advisory board is about to recommend new guidelines to limit publication of life-sciences research that could be misused by terrorists. I think it's treading on dangerous ground.
Last Thursday, a draft of the rules was formally adopted by the National Science Advisory Board for Biosecurity, or NSABB, at a meeting in Bethesda, Maryland. The draft proposes voluntary compliance by scientists, universities and journals, but leaves open the possibility of federal legislation to turn the guidelines into law. Indeed, it almost invites that result by supporting application of the NSABB recommendations to researchers that do not receive federal funds -- a result that can only be achieved through regulation.
As a lawyer for computer security researchers, I find it impossible to regard this prospect with anything but dread. For example, the proposal broadly defines "dual use research of concern" as any "research that, based on current understanding, can be reasonably anticipated to provide knowledge, products, or technologies that could be directly misapplied by others to pose a threat to public health and safety, agriculture, plants, animals, the environment, or materiel."
That's a perfectly reasonable description of an article or paper worth a closer look before publication. But if this language becomes a statute that prohibits publication under some circumstances, the author risks criminal prosecution if law enforcement disagrees with a scientist, university or peer-review publication's decision that the research should be published.
And, legally, I'd find it extremely difficult to advise the author with any certainty whether publishing the research is lawful or not. Whose "current understanding" applies? What does "reasonably anticipated" mean? When is research "directly" misapplied, or merely indirectly used? How much of a risk "poses a threat"?
The NSABB draft also sets out a procedure to follow once a scientist has identified research of concern. Instead of outright suppression in every case, the proposal suggests a risk/benefit analysis, which can result in a variety of options for communicating the research to the public.
This seems flexible and case-specific, which again, is great in a guideline, but terrible when you are trying to advise a client how to avoid the risk of jail. We know that reasonable scientists can and do disagree about these things. What do prosecutors, judges and juries think?
Rejecting new regulation doesn't mean we have to be subject to the whims of bioterrorists. Voluntary self-regulation, ethical training, peer review and additional practices currently followed by recombinant DNA researchers, microbiologists and other scientists all have a track record of success. And smart federal laws can control access to pathogens -- and prohibit dangerous practices -- while steering clear of restricting scientific publications.
Until recently, U.S. policy has been to allow the publication of information, with only narrow controls on classified information. Then, in 2002, the president signed the Homeland Security Act, which requires federal agencies to create procedures to protect "sensitive but unclassified" knowledge. The statute is unclear about whether these procedures should take the form of voluntary guidelines, or regulations with the force of law, and whether they'll apply outside of federal agencies. But the NSABB report appears to be part of the effort to craft such procedures.
The scientists on the board have good reasons for wanting to be involved in crafting the guidelines. They want to stop terrorists, and they take the dangers from dual-use research seriously. They also want to protect the scientific process, and they believe correctly that if regulation is going to happen, it would be much, much better if scientists were involved.
One such scientist is NSABB board member David A. Relman, M.D., associate professor of medicine, microbiology and immunology at Stanford University School of Medicine. He told me about a 2004 addition to federal law which criminalizes possession of the smallpox virus. Unfortunately, the statute defines the pathogen as any virus that contains 85 percent or greater sequence similarity to smallpox, effectively outlawing a whole range of pox viruses, including the smallpox vaccine. The maximum penalty for violating the law is a fine of $2 million and 25 years in prison.
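To see how broad an 85 percent threshold is in practice, here is a minimal sketch. The sequences and the naive percent-identity measure below are hypothetical stand-ins (the statute does not specify how similarity is to be computed); the point is only that a variant differing at a tenth of its positions would still fall under the definition:

```python
# Toy illustration of an 85% sequence-similarity threshold.
# The reference sequence and the percent-identity measure are hypothetical;
# the statute does not specify this exact computation.

def percent_identity(a: str, b: str) -> float:
    """Naive percent identity over two pre-aligned, equal-length sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be pre-aligned to equal length")
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 100.0 * matches / len(a)

def covered_by_statute(candidate: str, reference: str, threshold: float = 85.0) -> bool:
    """True if the candidate meets or exceeds the similarity threshold."""
    return percent_identity(candidate, reference) >= threshold

# Build a variant differing from a 100-base reference at exactly 10 positions.
SWAP = {"A": "G", "G": "A", "C": "T", "T": "C"}
reference = "ATGGCTAGCTAGGCTAACGT" * 5  # hypothetical 100-base reference
variant = reference[:90] + "".join(SWAP[b] for b in reference[90:])

assert percent_identity(variant, reference) == 90.0
assert covered_by_statute(variant, reference)  # 90% >= 85% -> covered
```

On this toy measure, any strain sharing 85 of every 100 aligned bases with the reference is swept in, which is how a definition meant to capture smallpox can also capture related pox viruses such as the vaccine strain.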
Dr. Relman views his role on the NSABB as helping the government avoid a similar kind of mistake in the future. He and his colleagues are doing us a service by participating, but they have to be extremely careful that their work is not used to legitimize regulation. Any guidelines should be crystal clear that they are good only as that -- guidelines.
If the NSABB is not careful, its well-balanced recommendations may become a precursor for abandoning voluntary self-regulation in favor of federal regulation of scientists. Once we have regulations, we will also have penalties for non-compliance. At that point, the only question left will be how much scientific self-determination remains (Wired, 2007).
Title: US: Don't Publish Lab-Bred Bird Flu Recipe
Date: December 20, 2011
Abstract: The U.S. government asked scientists Tuesday not to reveal all the details of how to make a version of the deadly bird flu that they created in labs in the U.S. and Europe.
The lab-bred virus, being kept under high security, appears to spread more easily among mammals. That's fueled worry that publishing a blueprint could aid terrorists in creating a biological weapon, the National Institutes of Health said.
But the NIH said it was important for the overall findings to be published in scientific journals, because they suggest it may be easier than previously thought for bird flu to mutate on its own and become a greater threat.
"It's very important research," NIH science policy director Dr. Amy Patterson told The Associated Press. "As this virus evolves in nature, we want to be able to rapidly detect . . . mutations that may indicate that the virus is getting closer to a form that could cross species lines more readily."
Bird flu, known formally as H5N1 avian influenza, occasionally infects people who have close contact with infected poultry, particularly in parts of Asia. It is highly deadly when it does infect people because it's different from typical human flu bugs. The concern is that one day it may begin spreading easily between people.
The NIH paid for two research projects, at the Erasmus University Medical Center in the Netherlands and at the University of Wisconsin, to better understand what might fuel the virus' ability to spread. The NIH said researchers genetically engineered bird flu that could spread easily among ferrets — animals whose response to influenza is similar to humans.
So the government's biosecurity advisers — the National Science Advisory Board for Biosecurity — reviewed the research as it was submitted to two scientific journals, Science and Nature. Following the board's recommendation, the Department of Health and Human Services asked the researchers and journal editors not to publish the full genetic information that could enable someone to copy the work.
Patterson said publishing the general findings, however, could help scientists better monitor bird flu's natural evolution and spur further research into new treatments. The government will set up a way for scientists who are pursuing such work to be given the unpublished genetic details, she said.
Patterson said researchers were making changes in their scientific reports.
But in a statement, Science editor-in-chief Dr. Bruce Alberts said his journal "has concerns about withholding potentially important public health information from responsible influenza researchers" and was evaluating how best to proceed.
Nature's editor-in-chief, Dr. Philip Campbell, called the recommendations unprecedented.
"It is essential for public health that the full details of any scientific analysis of flu viruses be available to researchers," he said in a statement. The journal is discussing how "appropriate access to the scientific methods and data could be enabled" (MSNBC, 2011).
Title: Bird Flu Bioterror Risk Seen Increased By Censorship
Date: December 21, 2011
Abstract: Any number of laboratories worldwide could engineer bird flu viruses into bioterror weapons capable of causing a human pandemic, and U.S. government efforts to censor research might only increase the risk that rogue elements may give it a try.
Experts say an unprecedented request by the U.S. National Science Advisory Board for Biosecurity (NSABB) for two leading scientific journals to withhold details of research into H5N1 bird flu is unlikely to block anyone already intent on evil.
Yet ironically, the fact that the potential for H5N1 to be deliberately engineered into a highly pathogenic form has become headline news might put fresh thoughts into the wrong minds.
"Anything like this has the potential to trigger ideas in some maverick," said Peter Openshaw, director of the centre for respiratory infection at Britain's Imperial College.
"There are many crazy people out there, and there are also people who are fixed on some idea at the extreme end of the political norm. Both groups have the potential to cause harm."
H5N1 bird flu is extremely deadly in people who are directly exposed to it from infected birds.
Since the virus was first detected in 1997, about 600 people have contracted it and more than half of them have died. But so far it has not mutated into a form that can pass easily from person to person.
Scientists around the world have been working for many years trying to figure out which mutations would give H5N1 the ability to spread easily from one person to another, while at the same time maintaining its deadly properties.
The U.S. National Institutes of Health funded two research teams, one in The Netherlands and one in Wisconsin in the United States, to carry out research into how the virus could become more transmissible in humans.
The aim was to gain early insight on how to contain the public health threat if such a mutation occurred naturally, but now the NSABB says it wants publication of the studies censored to stop the information falling into the wrong hands.
"It's very important work that has shown that with relative ease it is possible to mutate H5N1 into a mammal-to-mammal transmissible virus," said Openshaw.
Wendy Barclay, an expert in flu virology at Imperial College, said stopping the Science and Nature journals from publishing full scientific data from the work would do little more than set an uncomfortable precedent.
"The exact mutations that made this transformation possible were not particularly novel or unexpected so anyone with a reasonable knowledge of influenza virology could probably guess at them if they so wished," she said.
"I'm not convinced that withholding scientific know-how will prevent the highly unlikely scenario of misuse of information, but I am worried that it may stunt our progress towards the improved control of this infectious disease."
Scientists agreed on the need to maintain high levels of safety and security around labs working with dangerous viruses, and be cautious about deliberately doing things that enhance their pathogenicity and disease potential.
On the other hand, seeking to impose unprecedented levels of regulation on laboratories and scientists who proceed with extreme caution anyway is unlikely to have a positive impact.
"Whatever regulations are put in place in sensible, well-run labs in the developed world, we have no way of regulating what goes on in facilities in China or Korea, or possibly in India and Pakistan," said Openshaw.
He pointed out, however, that as a weapon, a mutated H5N1 virus would be pretty unsophisticated and virtually impossible to target at any one group or population.
In contrast, the anthrax powder attacks in the United States in 2001, which prompted the formation of the NSABB, could be highly targeted: the perpetrators sent the powder in letters to particular groups and individuals.
"(With H5N1), you'd really have to have a grudge against the whole of humanity," Openshaw said.
"These would be very indiscriminate bioweapons that couldn't be controlled. They wouldn't be selective. So it would be a very bizarre decision for someone to release an agent like this, because it would cull humanity" (Reuters, 2011).
Title: Bioterrorism Fears Spark Censorship Call
Date: December 22, 2011
Abstract: The US government has asked the publishers of Nature and Science magazines to censor details on a laboratory-made version of the deadly bird flu virus for fear that the information could be used as a biological weapon.
Scientists seeking to fight future pandemics have created a variety of the H5N1 virus that is so dangerous that the US National Science Advisory Board for Biosecurity has for the first time asked two science journals to hold back on publishing details of research.
The editors at Science are taking the request of the advisory board seriously, said the journal's editor-in-chief, Bruce Alberts. They are trying to balance the need for other researchers to have detailed information against possible threats, he said in a statement.
''Science editors will be evaluating how best to proceed,'' Mr Alberts said.
Responses are contingent on the government developing a plan so that withheld information can be provided to researchers who request it ''as part of their legitimate efforts to improve public health and safety'', he said.
In the experiments, university scientists in the Netherlands and Wisconsin created a version of the H5N1 influenza virus that is highly lethal and easily transmissible between ferrets, the lab animals that most closely mirror human beings in flu research.
Members of the National Science Advisory Board for Biosecurity, which was created after the anthrax bioterrorism attacks of 2001, worried that such a hazardous strain might be intentionally or accidentally released into the world if directions for making it were generally known.
The board advises the US Department of Health and Human Services, which agreed with the non-binding recommendation.
After weeks of reviewing papers describing the research, the board said it had recommended that the experiments' ''general conclusions'' be published but not ''details that could enable replication of the experiments by those who would seek to do harm''.
''Censorship is considered the ultimate sin of original research. However, we also have an imperative to keep certain research out of the hands of individuals who could use it for nefarious purposes,'' said Michael Osterholm, a member of the board who is also the director of the Centre for Infectious Disease Research and Policy at the University of Minnesota. ''It is not unexpected that these two things would clash in this very special situation.''
The board cannot stop publication. Its advice went to the Department of Health and Human Services, whose leaders asked the authors of the papers and the journals reviewing them - Science, published in Washington, and Nature, published in London - to comply.
The journals' responses to the request were initially very chilly.
Reuters reported that the journals objected to the request, although later reports suggested they were willing to go along with it under certain conditions.
Dutch researchers said they ''are currently working on a new manuscript that complies with the recommendation''. The scientists at the University of Wisconsin could not be reached.
About 600 people, mostly in south-east Asia, have become ill from the H5N1 virus since 1997. About 60 per cent have died. The virus is rarely passed from person to person; infection is far more likely to occur through close contact with sick birds.
Because of its extreme virulence, H5N1 has been the flu strain most feared as the source of a possible influenza pandemic.
What it lacked were the genetic changes permitting easy transmission by coughing, sneezing and touch. The new research has produced those changes for the first time, at least in ferrets.
Exactly how the key new mutations occurred is unclear, although it seems in part to be the product of chance. Influenza viruses are constantly changing in small ways, which is one of the reasons vaccines against them have to be reformulated every few years. Infecting ferrets enough times with the virus may have been sufficient to allow mutations favouring easy transmissibility to emerge by chance and then be ''saved'' by natural selection (SMH, 2011).
Title: Should Medical Journals Print Info That Could Help Bioterrorists?
Date: December 27, 2011
Abstract: Bird flu is deadly, but it generally does not spread easily from human to human. Now, scientists in Wisconsin and the Netherlands have created a strain of bird flu that can spread through the air — a virus that could kill millions if terrorists managed to create a batch and weaponize it. This raises a thorny question: Should medical journals be allowed to print the details of how the virus is made?
A government advisory board has urged two scientific journals to omit some of the specifics about the virus — the first time it has issued such a request. Supporters insist that the board’s request is a much-needed precaution that could save millions of lives. But critics say that the government is engaging in censorship and interfering with academic freedom.
It is a classic clash of liberty versus security. The question is such a difficult one because whichever course the government takes carries risks and costs. Which option — blocking publication or allowing it — is the lesser of two evils?
It is not hard to see why the government is seeking to keep details of the virus out of print. The H5N1 bird-flu virus rarely infects humans. But when it does cross the species barrier, the mortality rate can be as high as 60%. If terrorists were able to use the new research to make a contagious strain of the virus, the result could be a real-world version of the movie Contagion. That is: worldwide panic and mass deaths.
The government is trying to avoid this by urging scientific journals to describe the virus only in general terms and keep out the sort of details that could be used to replicate it. The National Science Advisory Board for Biosecurity, which was created after the deadly anthrax attacks of 2001, asked the journals Science and Nature to be selective when they published articles on the highly contagious strain of H5N1.
So what’s the problem? Critics say the government is engaging in censorship by telling the media what it should and should not write about. It sets a terrible precedent, they argue, for the government to set itself up as a national-security censor. The next time, they say, the government will try to prevent the publication of information that is far less dangerous than contagious bird flu.
Press-freedom watchdogs have a point: the government often trots out national security to try to intimidate the press into not doing its job. A few years back, the New York Times was about to expose the NSA spying program, in which the government was intercepting emails and phone calls without getting court orders. President George W. Bush called the paper’s top brass down to the White House and warned them that exposing the program would compromise national security. The Times went ahead and published — and we are all still here.
The skeptics raise another important concern: the long tradition of scientific openness. Research science works by having experiments reported publicly, so other scientists can test the findings — and build on them with their own research. This tradition breaks down when the government puts a shroud of secrecy on some research.
The editor of Science has suggested that his journal might agree to withhold the information the advisory board is worried about — provided that the government creates a system that would allow legitimate scientists to access the full results.
That sounds like the right answer. We should be wary of government attempts to stop the media from publishing information. But in extreme cases, it may be necessary — and weaponizable highly contagious bird flu could be just such a case.
What factors should we be looking for in considering whether the government should try to stop publication? First, the threat of harm should be real and it should be truly extraordinary. That is a test the contagious strain of H5N1 seems to meet. Second, it should be clear that the government has no ulterior motives — that it is acting to protect the nation, not to advance a political agenda.
That can be a tough thing to evaluate — governments that use national-security arguments for political goals are quick to deny that they are doing so. The best check on this sort of politicization is making sure that anyone who feels pressure from the government not to publish or speak is able to challenge the policy in court. Judges are in the best position to balance risks of serious harm against the infringement on speech — and to determine whether the government is crossing any First Amendment lines.
Those who oppose the Scientific Advisory Board’s decision are right that we must be wary whenever the government tries to suppress speech. As Supreme Court Justice Potter Stewart said, censorship is “the hallmark of an authoritarian regime.” But the board’s defenders are right that ultimately the government has a duty to protect the public from the most serious threats. They can cite Supreme Court Justice Robert Jackson, who noted that the Constitution is not a suicide pact (TIME, 2011).
Title: Can Scientific Censorship Stop Bioterrorism?
Date: January 31, 2012
Abstract: Today the U.S. National Science Advisory Board for Biosecurity recommended that the journals Nature and Science restrict publication about controversial new research relevant to the transmission of avian flu between humans. The fear: Would-be bioterrorists are combing the pages of the journals for tips on how to wreak havoc.
The H5N1 avian flu virus has killed 60 percent of the 600 or so people known to have come down with it since it was first identified in 1997. For comparison, seasonal flu in the United States kills about 0.003 percent of those who catch it. So far the H5N1 virus has not become easily transmissible between humans. But recently two research teams, one in the Netherlands and another in Wisconsin, reported that they had succeeded in transforming the virus into versions that are transmissible via respiratory droplets through the air between mammals. In the normal course of scientific research, the teams approached the journals Science and Nature about publishing their results. Publication is the way that scientists get credit for their achievements and enable fellow researchers to benefit from and build upon their work.
Reports of this research, however, provoked worries that publishing the recipe for making the bird flu virus transmissible could enable bioterrorists to unleash a devastating global epidemic that could kill billions of people. The editorial page editors at The New York Times are so frightened at the prospect that they have called on the researchers to destroy their new strains of the virus. Consequently, concerned journal editors and peer reviewers sought the advice of the U.S. National Science Advisory Board for Biosecurity (NSABB). In December, the NSABB recommended that the journals withhold research details to impede would-be bioterrorists.
In January, the two research teams agreed to a two-month moratorium on further research on their modified flu viruses. In addition, the World Health Organization is convening a meeting of prominent influenza researchers to discuss what should be done. Today, the NSABB is publishing its recommendation to restrict communication of these scientific results in Nature and Science.
A research moratorium is not new to the life sciences. Back in 1974, several prominent biologists concerned about the “potential biohazards” posed by then-new gene-splicing techniques published in leading scientific journals a call for a moratorium on certain kinds of experiments. A year later, a group of 140 scientists along with a few lawyers and journalists convened at Asilomar in California where they proposed a scheme for containing gene-spliced experimental organisms in laboratories. This scheme evolved into laboratory regulations under the auspices of the National Institutes of Health. The NSABB cites this history, arguing, “We believe that this is another Asilomar-type moment for public health and infectious-disease research that urgently needs our attention.” That’s about right, but not necessarily in a good way.
The positive spin on history is that the 1974 research moratorium and the 1975 Asilomar meeting calmed public fears and enabled the new biotech research to proceed. Some participants now disagree, arguing that the fact that researchers had called for a moratorium instead inflamed the public. “I knew the [Asilomar] letter would give rise to a sort of fire-storm of ill-informed brave new world stuff,” said Asilomar participant and former New York Times science reporter Victor McElheny in 2009.
In fact, The New York Times in 1976 helped fan the flames of “brave new world stuff” by publishing an article, “New Strains of Life—or Death,” in which Cornell University biochemist Liebe Cavalieri warned that gene-splicing could lead to accidental outbreaks of infectious cancer. “In the case of recombinant DNA, it is an all or none situation—only one accident is needed to endanger the future of mankind,” warned Cavalieri. Forty years after the first gene-splicing experiments by biologists Paul Berg, Herbert Boyer, and Stanley Cohen, unregulated molecular biology experiments are common in high school biology classes and humanity is not yet afflicted with lab-made infectious cancers.
The NSABB research censorship recommendations provoke reflection on two general issues. First, governments, and especially defense bureaucracies, are addicted to secrecy. Knowledge is power and government bureaucracies are in the business of accumulating and hoarding power. This is the opposite of science, which thrives in an atmosphere of transparency. While on very rare occasions there may be reasons to temporarily withhold scientific findings from the public, the default must always be openness.
The second issue is just how plausible is it that bioterrorists or hostile governments are eager to brew up and release a pandemic strain of deadly flu? The would-be bioterrorists would have no way to prevent it from infecting themselves, their families, friends, fellow citizens, and co-religionists. It’s possible that unleashing a pandemic might appeal to some kind of millenarian death cult, but your average terrorist and dictator are unlikely to conclude that a flu epidemic is a good idea. Bioterrorism using infectious agents is likely self-deterring.
On the other hand, even as the NSABB recommends secrecy and restriction, it acknowledges “that there are clear benefits to be realized for the public good in alerting humanity of this potential threat and in pursuing those aspects of this work that will allow greater preparedness and the potential development of novel strategies leading to future disease control.” First, avian flu is percolating out in nature, and there is every possibility that it will eventually mutate into a strain that infects people. The new research may have given public health officials a jumpstart on what to look for as they monitor changes in natural avian flu strains.
Second, researchers have been working on various treatments aimed at ameliorating or preventing avian flu among humans. These new air-transmissible strains could be used to see how effective current treatments may be and to guide the development of new treatments and vaccines.
Consider an earlier case of bioterrorism jitters provoked by publishing research on how to resurrect the Spanish flu. In 2005, researchers published the details of the viral strain that killed perhaps 50 million people in 1918. At the time some warned that the 1918 flu was “perhaps the most effective bioweapons agent now known.” However, as a result of the publication of that research we now know that that bioweapon fear was overblown.
As one of the lead researchers on the Spanish flu genome project, Peter Palese, recently pointed out, after publication lots of new researchers focused on the virus and happily discovered that it responds to seasonal vaccines and anti-flu drugs like Tamiflu and Symmetrel. “Had we not reconstructed the virus and shared our results with the community, we would still be in fear that a nefarious scientist would recreate the Spanish flu and release it on an unprotected world,” writes Palese. “We now know such a worst-case scenario is no longer possible.”
On January 25th, one of the lead avian flu researchers, Yoshihiro Kawaoka from the University of Wisconsin-Madison, argued in Nature that the research on transmissible avian flu must continue in order to protect people. Now that researchers know that avian flu can be transformed into transmissible strains, monitoring could facilitate eradication efforts and other countermeasures should such changes be detected in natural strains.
“The redaction of our manuscript, intended to contain risk, will make it harder for legitimate scientists to get this information while failing to provide a barrier to those who would do harm,” asserts Kawaoka. Spanish flu researcher Palese concurs, “The more danger a pathogen poses, the more important it is to study it (under appropriate containment conditions), and to share the results with the scientific community. Slowing down the scientific enterprise will not 'protect' the public—it only makes us more vulnerable.” Both are right.
The best defense against bioterrorism is the open and international scientific enterprise itself, not government recommended (and perhaps one day enforced) secrecy (Reason, 2012).
Title: No Way Of Stopping Leak Of Deadly New Flu, Says Terror Chief
Date: February 8, 2012
Abstract: The bioterrorism expert responsible for censoring scientific research which could lead to the creation of a devastating pandemic has admitted the information "is going to get out" eventually.
Professor Paul Keim, chairman of the US National Science Advisory Board for Biosecurity, controversially recommended that researchers be stopped from publishing the precise mutations needed to transform the H5N1 strain of bird flu virus into a human-transmissible version.
In an exclusive interview with The Independent, he argued it had been necessary to limit the release of the scientific details because of fears that terrorists may use the information to create their own H5N1 virus that could be spread easily between people.
Professor Keim said that it was necessary to slow down the release of scientific information because it was clear that the world is not yet prepared for a strain of highly lethal H5N1 influenza that can be transmitted by coughs and sneezes.
“We recognised that, in the long term certainly, the information is going to get out, and maybe even in the mid term. But if we can restrict it in the short term and motivate governments to start getting busy in terms of building up the flu-defence infrastructure, then we’ve succeeded at a certain level,” he said.
“If we can slow down the release of the specific information that would enable somebody to reconstruct this virus and do something nefarious, even for a while, then that was a good thing.”
Withholding key details of the mutations needed to make an airborne strain of H5N1 would give governments time to prepare for and prevent a possible pandemic, he added.
“The infrastructure to stop a pandemic in this area is not there. We just don’t have the capabilities. The very first time we knew that the swine flu virus [coming out of Mexico] was there, it was already in 18 countries. I’m not confident at all that we have the surveillance capability to spot an emerging virus in time to stop it,” he said.
“And even if we did spot it early on, I don’t think we have sufficient vaccines. The vaccines aren’t good enough, and the drugs are not good enough to stop this emerging and being a pandemic.”
Although H5N1 spreads rapidly between birds, it has so far affected only about 600 people worldwide who have had direct contact with infected poultry. However, two teams of researchers have shown independently that it only requires five mutations for H5N1 to become an airborne pathogen for laboratory ferrets, the standard animal model for human influenza.
Professor Keim said that the biosecurity board was asked by the US Government to review the two independent studies because they had already been submitted to the journals Science and Nature. The board had to make a recommendation on whether any or all of the information should be published.
Scientists involved in showing how the H5N1 birdflu virus can be transmitted in the air between ferrets have criticised the biosecurity board’s decision to part-censor their research on the grounds that it would hinder the development of new vaccines and drugs.
However, Professor Keim dismissed the criticism as disingenuous. “The argument that we need this information to make better vaccines and better drugs does not ring true,” he said. “There are lots of ways to make drugs against this virus. The very drugs they were using against this virus were the very same ones used against other flu viruses. The drug-invention problem has nothing to do with having this virus to hand,” he added.
Professor Keim revealed that although he is personally in favour of the research that led to the creation of airborne strains of H5N1, some other members of the board were not convinced. “I’m personally in favour of this research but that opinion is not universal on the board. Some people on the board wanted to stop this research and destroy the virus,” he said.
“I don’t think we need this virus to prove that Tamiflu works against it. And we know that the H5 antigen is not a great antigen for vaccines, we don’t need the virus to tell us that. But there are some experiments that can only be done with the live virus and I’m in favour of keeping the virus for those type of experiments” (Independent, 2012).
Title: No Consensus Reached On Keeping Potentially Dangerous Studies From The Public
Date: April 6, 2012
Source: Bio Prep Watch
Abstract: Scientists at a two-day meeting recently held in London achieved little consensus concerning whether some potentially dangerous studies should be kept from the public for security reasons.
Bruce Alberts, the editor of the journal Science, told an audience at the Royal Society that it could take years before an international understanding could be reached on whether or not it is appropriate to publish censored versions of scientific papers, according to the Washington Post.
“My fear is that now this crisis is over, nobody will work on this,” Alberts said, the Washington Post reports.
The London meeting was called after the journals Science and Nature agreed to redact portions of two independent studies on H5N1 avian influenza in response to a request by the U.S. government. Both journals recently received a go-ahead to print revised versions of the studies.
The issue touches on the very nature of modern scientific research, its openness, funding, cybersecurity and the regulation of human behavior.
The papers in question described the successful efforts to create a strain of H5N1 that is transmissible between human beings through the air.
The U.S. National Science Advisory Board for Biosecurity, a committee that advises the U.S. government on issues relating to federally funded research, made the request. Both studies received money from the U.S. government, according to the Washington Post.
The NSABB recently altered its decision after learning more specific information about why the studies were conducted and what their potential impact could be on further H5N1 research (Bio Prep Watch, 2012).
Title: Senate Committee Calls For More Oversight On Risky Biological Research
Date: April 27, 2012
Abstract: Senators raised concerns about oversight issues on government funded biological research at a Thursday hearing of the Senate Committee on Homeland Security and Governmental Affairs.
“When the American people pay for scientific research intended for the common good, they have a right to expect that their money will not be used to facilitate terrorism,” Senator Susan Collins (R-Maine) said, referring to the recent controversy surrounding NIH funded research to create a highly contagious airborne form of the H5N1 bird flu virus. “These are not hypothetical threats.”
Democrats and Republicans both stressed the importance of enacting more oversight for any sort of dual use research.
“We need to put in place better systems to track this kind of research at each experimental stage rather than waiting until it's ready for publication to make decisions about what can or can’t be revealed,” Senator Joseph Lieberman (I-Conn.) said.
Dr. Anthony Fauci, the director of the National Institute of Allergy and Infectious Disease, and Dr. Paul Keim, the chair of the National Science Advisory Board for Biosecurity, agreed with the senators’ remarks, but also warned against the possibility of regulatory oversight stifling future research.
“It is critical that we establish policy that intensely monitors high potential dual use research of concern from cradle to grave in order to protect us from misuse, but also to free low-potential DURC research from onerous regulations,” Keim said (BioPrepWatch, 2012).
Title: Censoring Data On Influenza Could Increase Bioterrorism Threat
Date: April 9, 2012
Source: Bio Prep Watch
Abstract: The attempt to censor science by redacting scientific research may cause the very bioterrorism problems it is trying to prevent, a leading cyber-security specialist has revealed.
Bruce Schneier, the chief security technology officer for the London-based telecommunications firm BT, spoke before a meeting of flu and security experts last week at the Royal Society in London. He warned the assembled experts that the redaction could lead to additional bioterrorism problems, New Scientist reports.
The meeting came in the wake of a decision by the U.S. National Science Advisory Board for Biosecurity to allow publication of two scientific papers reporting on an H5N1 flu strain that spreads among mammals. The board had previously called for details to be omitted from the papers so that bioterrorists would not be able to construct the viruses themselves. The board changed its mind, but the U.S. government published a policy regulating such research in March.
Schneier said that computer hackers are not likely to search the internet looking for random files related to science to hack into.
“If no one knows about it, it’s safe,” Schneier said, according to New Scientist. “If you announce that you have sensitive information by putting out a redacted paper, then if someone wants to know, they will. Any computer can be hacked.”
Schneier said that he was talking about both scientific papers being hacked, along with experimental notes and data kept electronically in laboratories (Bio Prep Watch, 2012).
Title: Science Journal Could Give Recipe For Deadly Avian Flu Virus
Date: May 14, 2012
Abstract: A science journal is poised to publish a study that some experts believe could give a recipe to bioterrorists.
The study is from an experiment by a Dutch scientist who engineered the avian flu virus to make it more deadly to mammals by making it spread through the air.
That experiment was funded by the U.S. government, and it has sparked a passionate debate among scientists. Part of that debate is over where this research could lead, and whether it is worth it.
The National Institutes of Health and some scientists say it is worth it. They say it could ultimately protect mankind by trying to anticipate how the virus could mutate to one that causes a pandemic -- like the one in the film "Contagion."
Dr. Anthony Fauci heads the NIH agency that funds infectious diseases research. It funded the controversial Dutch experiment.
"We need as scientists and health officials to stay one step ahead of the virus as it mutates and changes its capability," Fauci told CNN Radio recently. "To anticipate that would be important to determine whether the countermeasures we have available, such as antivirals and vaccines, would actually be effective against such a virus that changed in such a way."
But a number of scientists are stepping forward to say it is not worth it -- and that this research could actually bring us closer to that nightmare.
How? By making a lethal virus that spreads like seasonal flu.
"We are playing with fire," says Dr. Thomas Inglesby and his colleagues at the Center for Biosecurity at the University of Pittsburgh Medical Center.
The journal Science is now reviewing the manuscript by Dutch scientist Ron Fouchier, a virologist at the Erasmus Medical Center in the Netherlands.
In December, the National Science Advisory Board for Biosecurity warned against publishing Fouchier's study and a similar study from Wisconsin. The Wisconsin study was based on a similar experiment but used a less lethal strain of the virus.
In March, that same advisory board looked at revised versions and said the Wisconsin study was safe to publish. But some on the panel broke ranks on publishing Fouchier's work. Twelve said yes; six said no.
Michael Osterholm, an infectious diseases expert at the University of Minnesota, was one of the six "no" votes on the board. In a letter to NIH after the vote, Osterholm described the studies as "nearly a complete cookbook" for those who would do harm.
The journal Nature just published the Wisconsin study. The journal Science is expected to publish Ron Fouchier's study within weeks.
Here's what you need to know about the avian flu research:
Is the Engineered Avian Flu Virus as Easily Spread Between People as Between Animals?
It's not certain. But evidence shows it's likely to spread the same way between people as it does between the ferrets that Fouchier used in his experiment.
Why did the Government Fund this Research if it's So Risky?
They wanted to know why avian flu spreads so fast among birds but not among people. People only catch bird flu if they're in close contact with infected birds.
Here, the government funded two studies, one led by Fouchier and the other by Wisconsin flu researcher Yoshi Kawaoka. Both used genetic engineering to explore which mutations might turn an avian flu into one that could spread easily between people.
The NIH says these experiments show that it's possible for the bird flu virus to evolve to a highly transmissible killer virus like the one in "Contagion."
"These studies raised the red flag," said Robert Webster, a virologist and flu researcher at St. Jude Children's Research Hospital. "The cat's out of the bag."
Well, Now What?
These experiments lay a path to a whole new area of genetic engineering in flu research.
The government and supporters of the controversial experiments say more research will lead to a better understanding of the genetic mutations that could lead to a viral pandemic.
But other scientists say this is the wrong road to take.
Sir Richard Roberts, a molecular biologist who's won the Nobel Prize, spoke out at a recent National Academies workshop on the bird flu experiments.
"Someone is trying to make the most dangerous virus we can think of," Roberts said. "I don't understand how one can justify that, unless there is no other way of getting the data that you're interested in.
"And the way you get data is surveillance, and to see what is going on in nature, and to respond to it accordingly. And you go out of your way to find a universal vaccine. I would much sooner see money spent on that than on creating the most dangerous virus imaginable. I find it indefensible."
Roger Brent, a biologist at the Fred Hutchinson Cancer Research Center in Seattle, said he believes these experiments create more danger than benefit.
Brent told CNN that in order to be valuable -- that is, to reliably show the ways that bird flu could evolve to infect humans -- these experiments would require more experiments that could generate recipes for more, and different, man-made viruses -- all of them dangerous.
"Scientists must ask: Do we really want to do these experiments?" Brent said. "If we're generating knowledge that we feel dodgy about, do we really want to generate 20 or 100 additional (engineered viruses) that create something that most people would believe to be bad?"
Is the Government going to Fund more of this Research?
Possibly. The controversy over the Fouchier experiment led to a temporary "voluntary moratorium" by flu researchers on genetic engineering.
It also prompted the U.S. government to begin crafting a policy on how to deal with "dual use" research like this that can lead to harm, as well as good.
At a recent hearing on the bird flu virus research, Sen. Joe Lieberman, I-Connecticut, asked Fauci whether he thought there were any experiments that should not be done.
Yes, Fauci replied, but he said he thought that would be rare.
Supporters of the Fouchier experiment say the results make the case for more support and funding.
At the National Academies workshop, one journalist said he had talked to a number of scientists who questioned the value of these experiments and where they could lead.
Flu researcher Robert Webster replied by saying the experiments brought bird flu back into the research conversation.
"Concern for bird flu had dropped. Really, H5N1 had disappeared from the radar screen. This shows it can occur. So we have to maintain pandemic preparedness" (CNN, 2012).