(an exploration of mankind's future in seven parts: clones, nanotech, cryogenics and hibernation, cyborgs, hive minds, evolution or devolution, plagues and death)
by Peter Jekel
"Death is frightening, and so is Eternal Life."
In 2003, it was declared complete: the Human Genome Project was one of the most ambitious projects in the history of science. Science fiction author Robert Sawyer, in his novel Frameshift, explored some of the implications of the project, whose goal was to map all of the human genes (once estimated at 100,000, now put closer to 20,000 to 25,000) hidden within the 23 pairs of chromosomes found in every human cell except the gametes. While this goal was indeed astounding, its potential applications are even more far-reaching. With the possible exception of computing, biotechnology has the potential to impact human society more than any other scientific endeavor in history.
Other books that explore our biological future include Aldous Huxley's novel Brave New World, which foresaw some of the implications of biotechnology as early as 1932 and may soon see its concepts move from the realm of science fiction to the realm of science fact. More recently, Jack Dann and Gardner Dozois edited Genometry, a collection of eleven tales by leading science fiction authors on both the promises and perils of our biological future.
We as a species have the potential to go in one of two directions: one leading to the possible panacea of eternal or much-extended life, the other far more dire, the extinction of the human race as we know it. Perhaps the best and safest course of action, if we are to benefit fully from biotechnology, lies somewhere between the two.
Clones

One way to extend human life expectancy comes from a number of biotechnologies, many of which are in use today. Start by imagining being able to make an exact duplicate of yourself: a clone. When one body wears out, a duplicate stands ready to take over, thereby extending your life. Richard Morgan's novel Altered Carbon tells of an individual who need never truly die, his consciousness rotated through clone after clone as each body fails.
Cloning is simply the production of cells or organisms that are genetically identical to the original. This sounds simple in principle, but the technology is truly groundbreaking and has never been used to produce a human being, at least not yet. The possibilities and repercussions of cloning tread into territory that only speculative fiction authors can imagine.
Cloning might sound like science fiction, but gardeners perform a form of biological manipulation, in effect, cloning, every time they take a cutting from a plant and replant it to grow somewhere else. That cutting, if it takes root, will grow into a plant that is an exact genetic duplicate of the original plant. In fact, the word “clone” comes from the Greek word “klon”, which translates as twig, referring to the process whereby a new plant can be created from a twig. Nature also performs cloning daily. In the world of single cells, when cells divide to reproduce, the two resultant cells are duplicates of the original.
Then there is the cloning that occurs in the laboratory. In one method currently employed, a developing embryo is removed from the uterus of a female animal and cut in two; each half is then placed into a surrogate mother for further development. Thus far, the technique has been successfully performed on horses, sheep, cattle and pigs.
There is also the nuclear transplantation method that has been used on animals such as frogs and mice. In this method the nuclei are removed from the embryonic cells of one individual and transferred to the unfertilized eggs of another. Since all of the transplanted nuclei come from the same animal with the same genetic makeup, the resulting individuals are classified as clones.
In 1997, cloning technology took a leap forward: scientists became able to generate clones not just from embryonic cells but from adult cells. At the Roslin Institute in Edinburgh, Scotland, Ian Wilmut's team cloned a mammal from an adult cell by coaxing the cell back into an embryonic state. The first mammal cloned with this technique was a sheep named Dolly. Dolly may have suffered from premature aging, since her genes were copied from an older sheep; she died in 2003. Other teams of biologists have since improved on lessons learned from the Dolly experiment.
Even before Dolly's debut, there were authors who looked at the concept of cloning and envisioned all sorts of possibilities. Two stories in particular looked at the repercussions of cloning somebody famous. James BeauSeigneur's Christ Clone Trilogy deals with the cloning of Jesus Christ from cells found on the Shroud of Turin. On the opposite end of the spectrum, Ira Levin's The Boys from Brazil is about a post-World War II Nazi plot to clone Adolf Hitler.
Other authors have looked at the ethics of cloning. Kate Wilhelm’s novel, Where Late the Sweet Birds Sang, depicts a world of enormous environmental change and disease, which causes the collapse of civilization. The survivors find themselves infertile only to discover that cloning may be the solution to their reproductive dilemma.
We may find in the future that some species are unclonable. There has been serious discussion of cloning endangered or even extinct animals in order to protect or resurrect them, but this may not always be an option: some animals appear truly resistant to cloning due to some unknown genetic barrier. Then again, some scientists think that while present cloning techniques may not work for certain species, methods yet to be discovered may do the trick.
There is also some speculation that cloning could be used to create not only cells but entire body parts, or even entire bodies genetically identical to the original: a ready-made organ bank, so to speak.
A technology that was strictly science fiction about 25 years ago has entered the realm of science fact: genetic recombination. The technology is not limited to humans but has been extended to our food products and even manufactured goods. Rice, for example, carries a gene that protects the plant from leaf blight; by inserting that gene into other crops such as corn and wheat, the same protection is conferred. In another case, genetically altered bacteria gave plants such as strawberries protection against light frost. And in yet another application, the British biotechnology firm Zeneca has had bacteria produce biodegradable plastics in vats.
However, it is the promise of a longer, healthier life that could make genetic recombination the leading healthcare science. While some scientists work on cures for diseases, others are trying to stem diseases before they even start. For example, some are now working on transforming mosquitoes and other disease vectors into non-infectious agents; in other words, their bodies would become unsuitable hosts for the pathogens behind diseases such as malaria and yellow fever. Conversely, the same process could be applied to beneficial insects such as honeybees to make them resistant to disease.
There are also over five thousand recognized human genetic diseases, and we continue to find more. We may even find that some infectious diseases only take hold if we are genetically predisposed to acquiring the illness. Until recently, methods of dealing with genetic disease were macroscopic in nature, addressing the effects, or symptoms, rather than the cause. Genetic recombination may even hold a key to treating many cancers.
In simple terms, gene therapy involves the replacement of a flawed gene with a normal one. Normal genes are delivered to patients in one of two ways. In the more common method, cells are removed from selected tissues and exposed to the gene transfer agent, and the altered cells are then re-injected into the patient.
The other method uses nature's most efficient gene delivery mechanism, the virus. A virus's mission in life is to reproduce, and it does so by invading living cells and hijacking the cell's machinery to copy its genetic package. Of course, when using viruses as the gene transfer tool, scientists must ensure that the virulent genes are removed; otherwise, they would be transmitting a viral disease.
Alternatives to viruses are being developed, since viruses carry the inherent risk of transmitting a viral illness. A group at Stanford University School of Medicine uses what is known as a jumping gene, or transposon, a piece of DNA that moves about within the genome, to transport genes to desired sites; they have used one to carry a gene that promotes the clotting of blood. Other exciting research is looking at the construction of artificial chromosomes, which promise the efficiency of viruses without the potential risks. It is now possible to insert genetic material from one organism into a host of an entirely different species, even placing animal genes into plants and vice versa, a practice known as transgenics. (What would a vegan make of that when planning his diet?) Human genes have been successfully transferred to bacteria, which then produce such medically important chemicals as insulin, interferon and human growth hormone.
Perhaps the first fictional tale to look at the repercussions of genetic manipulation gone wrong was H. G. Wells' The Island of Dr. Moreau. Another, Falling Free by Lois McMaster Bujold, is about the bioengineering of humans able to work in the weightless conditions of space stations. Hot Sky at Midnight, by Robert Silverberg, tells of a world in which bioengineers attempt to create human beings able to survive on a heavily polluted Earth.
Now comes the truly speculative portion of genetic recombination technology. Is it possible to engineer a human being that could fly, perhaps by adding the genes responsible for wing formation from a bat or a bird? There is far more involved than simply adding a few genes to the human genetic complement: wings alone are not enough to allow humans, let alone bats and birds, to fly. Identifying all of the genes necessary to create the structures responsible for flight is a task that may only be accomplished in the distant future, which is not to say it will never happen. After all, years ago, who would have thought that we could create bacteria that produce life-sustaining insulin for diabetes patients?
Overall, genetic recombination can only make one marvel at what Nature and science have been able to accomplish.
Nanotechnology

Genetic engineering is only one way that we may some day alter the human species. The relatively new field of nanotechnology began as a musing by Nobel laureate physicist Richard Feynman in his 1959 lecture "There's Plenty of Room at the Bottom." Feynman considered how machines might be made ever smaller while remaining consistent with the laws of quantum physics; indeed, he speculated that machines could be as small as molecules themselves. Nanotechnology is defined as the manipulation of matter at the atomic and molecular level.
Eric Drexler, with his book Engines of Creation, picked up Feynman's gauntlet and ran with it. He followed that book with a more rigorous scientific treatment of nanotechnology, Nanosystems, written in part to address criticisms of the field by some scientists. The ultimate goal of nanotechnology is programmable matter: materials whose properties can be easily, reversibly and externally controlled.
Though controversial when first proposed, nanotechnology has the potential to make radical changes in our biological lives. For example, at the Foresight Conference on Molecular Nanotechnology held in Bethesda, Maryland in November 2000, a team proposed creating microscopic robots powered by bacteria. Such robots could travel the human body, delivering drugs to specific target sites or scraping clear the interior of a clogged artery. Instead of bacterial workhorses, another nanotech company wants to develop tiny submarines propelled by altering the magnetic fields surrounding them.
Raymond Kurzweil, a futurist and transhumanist, wrote The Singularity Is Near, which proposes that medical nanorobotics could completely remedy the effects of aging by 2030. Science fiction has also speculated on the use of nanotechnology in medicine. Brian Stableford's Inherit the Earth looks at a world ruled by nanotechnology that renders humans immune to disease and eventually to aging itself. As if a warning were warranted, Greg Bear wrote "Blood Music," a cautionary tale in which a biological nanotechnology used for a medical intervention goes horribly wrong; he later expanded the story into a novel.
Cryogenics and Hibernation
Another way to extend human life is through a technique popular in science fiction but not yet effective with current technology: freezing an organism to be revived later. The science appears sound, since lowering temperature to a certain level suspends metabolic processes, and without the chemical reactions of life an individual's life can theoretically be extended. The problem at present is that freezing disrupts cell membranes: water, unlike most other substances, expands when it freezes, and so far that has made freezing without cellular damage impossible. The practice is known as cryonics, an offshoot of the broader low-temperature science of cryogenics.
For cryonicists, storing the body at low temperatures after death may provide transportation into a future in which advanced medical technologies allow resuscitation and repair. They speculate that cryogenic temperatures will minimize changes in biological tissue for many years, giving the medical community ample time to cure disease, rejuvenate the aged and repair any damage caused by the freezing process itself. So far, unfortunately, cryonics remains in the realm of science fiction, since no mammal has been successfully preserved and brought back to life. The future, however, may yet show us another reality.
Many early science fiction writers speculated on human cryopreservation. One of the earliest examples is Lydia Maria Child's 1845 short story "Hilda Silfverling, A Fantasy." Jack London, better known for his tales of the wild, made his debut with a story of human cryopreservation, "A Thousand Deaths." H. P. Lovecraft's "Cool Air" appeared in 1928, and Edgar Rice Burroughs wrote "The Resurrection of Jimber-Jaw," in which the hero is involuntarily preserved. A breakthrough came in 1931, when Neil R. Jones wrote "The Jameson Satellite," a story in which the subject is deliberately preserved after death. It was this tale that reportedly gave Robert Ettinger, the father of cryonics, the idea of its potential. He went on to found the Cryonics Institute, a member-owned non-profit that provides cryopreservation services, in 1976.
Science has recently embarked on the investigation of placing animals in suspended animation, a precursor to actual cryonics. Suspended animation involves lowering body temperature to a level where metabolic processes slow significantly. One trial was conducted in 2006 at Massachusetts General Hospital in Boston, where scientists placed pigs in suspended animation: the pigs were anaesthetized, major blood loss was induced, and the lost blood was replaced with a chilled saline solution. Once body temperature reached ten degrees Celsius, the damaged blood vessels were repaired and the blood was returned. In 2005, a laboratory in the United States successfully induced a state of suspended-animation-like hypothermia in mice. Similar experiments have also succeeded with nematode and zebrafish embryos.
There are many other research projects currently investigating how to achieve "induced hibernation" in humans. The ability would be useful for a number of reasons, such as saving the lives of seriously ill or injured people by putting them temporarily into hibernation until treatment can be given. It could also allow for extended space voyages without the aging of the ship's occupants.
Science fiction appears to be fascinated by the prospect of cryonics. Perhaps the most detailed tale of cryonics is James Halperin's The First Immortal. Other novels that deal with the science include The Door into Summer by Robert Heinlein, The Age of the Pussyfoot by Frederik Pohl, Ubik by Philip K. Dick, Tech-Heaven by Linda Nagata, and Tomorrow and Tomorrow by the late Charles Sheffield. Gregory Benford, under the pseudonym Sterling Blake, wrote Chiller, a thriller about cryonics.
More commonly, science fiction authors use cryonics to accommodate the long voyages of interplanetary and even interstellar travel. Vernor Vinge, in his award-winning novel A Fire Upon the Deep, depicts a hero resuscitated by a superintelligence thousands of years after an accident in space. Stanislaw Lem uses cryonics to allow his characters to travel the depths of space in his novel The Invincible. Larry Niven's classic short story "Wait It Out" is about the emergency cryopreservation of men marooned on the frozen surface of Pluto.
Cyborgs

Some scientists see the biological future of humans not in organics but, somewhat ironically, in machines. The idea came early to speculative fiction: as early as 1839, Edgar Allan Poe wrote the short story "The Man That Was Used Up," in which the narrator meets the famous war hero John Smith, who, we discover, must first be assembled piece by piece. In essence, Poe was describing what would become known as a cyborg.
The term "cyborg" comes from a 1960 article in Astronautics by Manfred Clynes and Nathan Kline. Their idea grew out of a discussion of the need for an intimate relationship between human and machine as the new frontier of space exploration opened up. A cyborg is essentially a man-machine system in which the control mechanisms of the human portion are modified by drugs or other regulatory devices so that the being can live in an environment different from the normal one. Current prosthetic devices, artificial heart valves, cochlear implants, pacemakers and the like can be seen as early examples of cyborg technology. Tissue has been engineered with carbon nanotubes, plant and fungal cells have been used for artificial tissue engineering, and direct brain implants have even been used to treat non-congenital blindness.
It was in the late 1990s that Philip Kennedy, a physician-scientist, created what has been called the world's first cyborg, Johnny Ray. Ray was a Vietnam veteran who had suffered a severe stroke; hoping to regain some of his old life, he agreed to be the guinea pig in a bold new experiment. Kennedy implanted a neurotrophic electrode in Ray's brain so that he might regain some ability to act on the world. The surgery was successful, but Ray died in 2002.
One of the first tales of a cybernetic being who slowly evolves into a human (a reverse cyborg of sorts) is Isaac Asimov's The Bicentennial Man, about a robot who modifies himself with organic components. His explorations lead to breakthroughs in human medicine via artificial organs and prosthetics. By the end of the story, the robot, now named Andrew, is impossible to distinguish from humans with advanced prosthetics, apart from his positronic brain. Asimov explored the idea of cyborgs further in the short story "Segregationist."
Martin Caidin wrote the novel Cyborg, about a severely injured astronaut rebuilt with state-of-the-art technology; it became the basis of the television series The Six Million Dollar Man and its spinoff The Bionic Woman. Frederik Pohl, in his novel Man Plus, took an able-bodied man and surgically altered him to function unprotected on the harsh surface of Mars.
Taking the cyborg to its extreme, we eliminate the physical body altogether and copy or transfer a conscious mind from a biological brain to a non-biological computer system or computational device. What would happen to humanity if such uploaded humans became interconnected, much like today's Internet? Would we ever be human again? Could we ever revert?
As early as 1879, Edward Page Mitchell wrote "The Ablest Man in the World," about a computer inserted into a man's head, turning him into a genius. Anne McCaffrey, in her book The Ship Who Sang, wrote of a brainship: the brain of a human whose body could not develop normally, encased in and mentally connected to a spacecraft. In 1944, C. L. Moore wrote "No Woman Born," a short story about a famous dancer almost completely destroyed by fire; to save her, doctors place her surviving brain into a faceless but beautiful mechanical body.
Humans With Hive Minds
Some science fiction writers have envisioned a hivelike grouping of human beings, each person acting as a single cell in a greater organism, not unlike our social insects. Ants, bees and wasps act not as individual entities but in service of a greater goal: the survival of the nest or hive. The social insects are successful, but is this what we would like to see in our future? Is it a leap forward in our evolution, or a horror in which we lose our humanity once and for all? One truly horrific example of a hivelike mind intent on the destruction of the human race is found in John Wyndham's The Midwich Cuckoos. Stephen Baxter, too, wrote of a highly evolved hive-minded human species in his Destiny's Children series. Other evolved hive humans can be found in Michael Swanwick's Vacuum Flowers, in Alastair Reynolds's Revelation Space series, and among the Phoners of Stephen King's Cell. On the opposite end of the spectrum, Arthur C. Clarke saw a hivelike human entity as the ultimate goal of humanity in his classic Childhood's End, in which a benevolent alien race assists humanity in its ultimate evolutionary step.
Other authors have given hivelike societal structures to the aliens they create. One of the first to explore this mentality was Theodore Sturgeon in his novel The Cosmic Rape. In Robert Heinlein's novel Starship Troopers, the Bugs are modeled on the social hierarchy we see in social insects. Other hivelike alien beings include the Boaty-Bits of the Saga of Cuckoo by two early masters of science fiction, Frederik Pohl and Jack Williamson; the Formics (from the Latin formica, for ant) of Orson Scott Card's classic Ender's Game series; the universe-conquering Hive Mind of John Cramer's Einstein's Bridge; the Swarm of Bruce Sterling's short story "Swarm" and the nanoswarms of Michael Crichton's novel Prey; and the very eerie and alien Squeem of Stephen Baxter's Xeelee Sequence.
In Vernor Vinge's A Fire Upon the Deep, there is another interesting hivelike being: each lone individual is actually a pack of roughly four to seven members, together the equivalent of a human adult. In larger numbers, the packs become confused and unintelligent.
Evolution or Devolution?
What if we don't do anything? What if we just let Nature take its impersonal course? No genetic engineering, no genetic recombination, nothing. Will humans still change? Are we still evolving? In H. G. Wells' tale The Time Machine, our intrepid inventor-explorer finds himself in a distant future where the human race has evolved into two distinct species: the Morlocks, who live underground and venture to the surface to feed on their distant humanoid cousins, the gentle, surface-dwelling Eloi. Stephen Baxter's Evolution gives us a story stretching from the remote past of Earth to the equally distant future, tracing the human lineage through time. For many species, extinction is the end result; there is no reason to assume that humans will escape this fate.
Devolution, also called dysgenics or degenerate or backward evolution, is the notion that a species can change into a more primitive form over time. It is not without precedent in nature. Evolution is often assumed to work always in the direction of greater complexity over time, but that assumption turns out to be mistaken. The Victorian zoologist Ray Lankester, exploring evolution by natural selection as promoted by Darwin, pointed to the sea squirts, an oceanic group of invertebrates, as an example of degenerate evolution.
Anton Dohrn, an early German Darwinist, regarded certain jawless fish, the lampreys, as evolutionary degenerates, arguing that their jawlessness was not an ancestral feature but the product of an evolutionary adaptation to a life of parasitism. Another radical idea of degenerate evolution came from the Lamarckian evolutionist Ernest MacBride, a student of Dohrn. (Lamarckism is a theory of evolution in which changes in animals result from needs imposed by the environment; for example, the giraffe's neck grew longer because giraffes fed on leaves at the tops of trees.) MacBride claimed that all invertebrates are actually degenerate vertebrates, on the grounds that crawling on the sea floor would be less stimulating than swimming in open water.
Are we as a civilization contributing to a degeneration?
Wars, for example, are fought by the young, healthy and able-bodied, potentially removing vibrant genes from the pool when soldiers are killed before having children. Even a supposedly proactive practice such as designing babies, which could remove genetic illnesses from the human population, may have a detrimental effect on the human race. It allows parents to design their ideal baby by removing any perceived genetic flaws, basing their decisions on emotion rather than on the non-emotive selections of Mother Nature. By taking Mother Nature out of the equation, we may be mortgaging our genetic future, discarding potentially beneficial genes from the human race.
One of the first tales that looked at this dysgenic future was Cyril Kornbluth’s 1951 short story "The Marching Morons," which describes a world where people are becoming increasingly stupid.
Plagues and Death
Another threat to our possible biological future comes from the world of microbiology. There is every possibility that a plague, whether human-made or created by Nature herself, could cause the ultimate demise of humankind.
Most recently, in 2009, the world lived in fear of an influenza pandemic. When the new strain, called H1N1, killed young children with apparently no underlying health conditions, the world prepared for the worst, and experts predicted it would spread with alarming rapidity. Many people were indeed infected, but relatively few died: the virus proved highly communicable but, fortunately, of very low virulence.
If the influenza virus of 2009 had been of the same variety as the 1918 strain, which killed over fifty million people, the fear would have been well founded. Imagine, in this day and age of almost instantaneous movement from one continent to another, how devastating such a strain could be. Stephen King explored the impact of a truly virulent strain of influenza in his novel The Stand, in which the disease destroys civilization, leaving only an intrepid band of survivors to tell their tale.
Influenza is only one of many diseases awaiting its turn to devastate humankind again; history shows it is far from alone in the world of virulent microbes. In 430 BC, an epidemic believed by many historians to have been typhoid fever killed a quarter of the Athenian army and over a quarter of the population of Athens. Its only saving grace was that it proved so virulent that it burned itself out before it could spread further.
Smallpox, since eradicated from the planet, caused an outbreak between 165 and 180 CE that killed up to five thousand people per day. From 541 to 750 CE, an outbreak of bubonic plague killed up to ten thousand people per day in Egypt; spreading beyond Egypt's borders, it went on to kill perhaps a quarter to half of the population of the affected regions between 550 and 700 CE. Bubonic plague, also known as the Black Death, struck Europe again in the 14th century, and the series of epidemics that recurred into the 18th century killed, by some estimates, well over a hundred million people. The Great Plague of London of 1665-66 was the last to strike England, killing a fifth of the city's population.
As Europeans spread around the world in an age of expansionism from the late 1500s to the 1600s, they took with them diseases that, though common in Europe, were unknown to the natives of the lands they explored. It has been estimated that up to 95% of the native population of the New World was killed by the introduced illnesses.
More than likely it will not be these old diseases that devastate humankind, but more opportunistic pathogens that do not yet raise alarm bells. Normally these bacteria cause infections after surgery or injury, and they have in the past been controlled with antibiotics. With our overuse of those antibiotics, however, we may have let the genie out of the bottle. Today many common bacteria, such as Staphylococcus aureus, Serratia marcescens and Enterococcus, have developed such a high level of resistance that we may soon be unable to control the resulting illnesses. We probably have only ourselves to blame for this new plague: antibiotics have been overused through over-prescription, through inappropriate self-medication, and even as growth promoters to fatten livestock before slaughter.
How many times have people failed to finish a full course of prescribed antibiotics, or shared them with a friend or relative who seemed to have the same symptoms? All of these misuses increase the selective pressure toward antibiotic resistance in bacteria.
An old scourge may be making a comeback thanks to inappropriate antibiotic use. Up to a third of the world's population is thought to carry latent Mycobacterium tuberculosis, with new infections occurring at a rate of about one per second. Considering that we once thought the disease under control, it is sobering to find that new, more virulent and resistant strains of tuberculosis are evolving.
Other diseases periodically raise their heads just to keep us on our toes. Acquired Immune Deficiency Syndrome was virtually unknown before 1982 but, though now under some control, is still spreading at an alarming rate. Severe Acute Respiratory Syndrome (SARS) appeared in 2003 and spread to many regions of the world before disappearing, thanks to an effective response by public health officials. It could return in a new, more contagious form; where and when, we may never know until it happens.
Perhaps the first author to imagine a plague-ridden world was Giovanni Boccaccio in his 1353 masterpiece The Decameron. Mary Shelley, more famous for her novel Frankenstein, wrote The Last Man in 1826, which follows a group of individuals moving through Europe as a plague decimates the population. In 1912, Jack London wrote The Scarlet Plague, set in the world of 2073, sixty years after the devastating impact of a plague called the Red Death. George R. Stewart's Earth Abides, written in 1949, is about a man living in a world ravaged by disease; a small community coalesces around the hero as he struggles to start a new civilization. The Last Canadian by William Heine and The Last Town on Earth by Thomas Mullen likewise depict worlds devastated by plague.
However, the defining book of the genre would probably be Michael Crichton’s The Andromeda Strain, about an alien microbe brought back to Earth by a space mission.
Other authors have treated the currently popular zombie more as a plague afflicting humankind than as any supernatural raising of the dead, a tradition that began in 1954 with Richard Matheson's I Am Legend. Max Brooks's immensely popular World War Z is a tale of the ultimate conflict between humans and zombies; here as well, the zombies are victims of an infection rather than anything supernatural.
Nature may be one source of devastating diseases, but current science also allows for microbes to be manipulated into doing our bidding. History is full of examples of biological warfare—defined as the intentional use of biological toxins and infectious agents to kill or incapacitate humans, animals or plants.
The Assyrians, as early as the 6th century BC, poisoned enemy wells with toxic fungi. Later the Mongols, during their sieges of medieval European cities, not only fouled the wells of their intended conquests with diseased animal carcasses but also launched their plague-stricken dead over city walls. Some historians believe this practice may have helped introduce the plague into Europe.
In their conquest of North America, the British too resorted to germ warfare, in a subtler way: during Pontiac's War, between 1763 and 1766, there were tales of the British giving smallpox-contaminated blankets to the Lenape Native American tribe. Bioweapons may also have been used to devastating effect in more recent wars. Imperial Japan in World War II ruthlessly developed and tested bioweapons, conducting human experiments on thousands of Chinese and employing bioweapons in its war strategy, with some claims of up to 560,000 dead attributed to those weapons.
Imperial Germany in World War I also experimented with creating bioweapons out of anthrax and glanders, with limited impact.
The potential of such weapons prompted the 1925 Geneva Protocol, which banned the use of bioweapons, though there are tales that the Soviet Union continued its bioweapons research program in spite of the ban. In 1972, the Geneva Protocol was strengthened by the Biological Weapons Convention, which banned not only the use of bioweapons but also their production, storage and transportation.
The United States and the United Kingdom openly destroyed their stockpiles, and some 170 countries signed the treaty, including the Soviet Union; but some nations are still feared to maintain active programs.
The concept of biological terrorism is not strictly confined to speculative fiction; it also holds a firm place in the literary thriller. Many best-selling thriller authors have written at least one or two novels with elements of bioterrorism.
Michael Palmer’s A Heartbeat Away, Robin Cook’s Vector, James Rollins’s The Judas Strain, Brad Thor’s Blowback, Clive Cussler’s Plague Ship and Black Wind, Tom Clancy’s Executive Orders and Rainbow Six, Robert Ludlum’s The Moscow Vector, W.E.B. Griffin’s The Outlaws, Raymond Khoury’s The Sanctuary, Nelson DeMille’s Plum Island and Ken Follett’s Whiteout are just a few of the many best-selling thrillers that deal with the topic. Code Orange by Caroline Cooney is about a student who finds a smallpox scab in an envelope tucked into an old medical text, and inadvertently alerts terrorists when he searches the internet for information about the disease.
David Palmer’s Emergence, which won the Compton Crook Award in 1985, tells the story of a human-made plague that destroys half of the world’s population. The Quick and the Dead, by Matthew John Lee, describes the aftermath of an attack on the British Isles with an enhanced smallpox virus.
Speculative fiction authors approach bioterrorism with a more scientific bent. Frank Herbert’s The White Plague, for example, is about a bioengineered disease that targets only women, putting the human race on the path to possible extinction. Margaret Atwood’s 2003 dystopian tale Oryx and Crake is set after a genetically modified virus wipes out nearly the entire population, sparing a small group of humans who have themselves been genetically modified.
All in all, our biological future is highly unpredictable. It may hold the panacea of eternal life (or the closest thing possible to it), or a degraded future in which all of our humanity is lost. The human race still has the power to steer that future as we wish, but only if we work together. If not, our biological future is up for grabs.