The rise and successful implementation of biotechnology has created a myriad of ethical concerns about the role of humans in the environment, how to handle genetic knowledge, whether certain experiments are "ethically wrong," and what constitutes property.
First, the rise of biotechnology has led to the use of genetically modified crops and animals. While the potential benefits of generating genetically modified crops were discussed in the Biotechnology section, such as improved crop yield, cheaper and more nutritious foods, and pharmaceutical plant/animal products, there are many ethical drawbacks.
The argument for most agricultural biotechnology is that poorer people will have better and more nutritious crops. Unfortunately, this comes at the potential expense of the ecosystem. Many genetically modified (GM) crops inadvertently cross-pollinate non-GM crops, creating GM hybrids. As a result, there is no control over which crops are GM and which are not. For farmers who want to avoid GM crops, and thereby respect the wishes of the many consumers who still fear GM foods, this lack of control is a problem.
Golden rice is a GM rice crop that produces beta-carotene, a precursor of vitamin A. Vitamin A deficiency is a major problem in developing countries, causing night blindness, maternal mortality, complications in pregnancy, and difficulty fighting infections. However, many people oppose golden rice because they oppose all GM crops, based on the notion that GM crops reduce the biodiversity of other food crops. Others argue that GM crops, even though they help to feed the hungry and poor, do not help solve other larger issues of poverty. Still others prefer to have peanut butter and jelly. On paper, GM crops sound beneficial, but many worry about introduced genes in GM crops being transferred into non-GM crops.
Human cloning is almost universally opposed; however, many groups accept the use of cloned human organs for transplants. Cloned human organs and genetically modified pig organs have been proposed as solutions to the organ shortage. However, animal rights groups question whether it is ethical to genetically modify animals. Cloned human organs have yet to become a reality, and many of the ethical concerns about human cloning are based on speculation about how this cloning would be performed.
With the rise of DNA sequencing, PCR, and increased medical funding, it has become easier to identify genetic sequences that are linked to various disorders, and therefore easier to determine whether an individual will develop a genetic disorder. However, is it right to tell individuals that they will develop a genetic disorder? Is it wise to diagnose someone with a genetic disorder if the diagnosis could have social and economic ramifications? Discovering you have certain diseases may affect someone hiring you for a job or how friends and family treat you. On one hand, knowledge is power; on the other hand, too much knowledge can be dangerous.
Who should have access to your genetic information? Should health insurers be able to exclude you from coverage because they know your genomic sequence has many genes known to cause diseases? The major concern is that genetic tests cannot accurately predict that an individual will develop a disease, much like algebra tests do not necessarily determine how good you are in algebra (right, Ms. Smith?). Therefore, there is a chance that an individual will test positive for a disease allele but never actually develop the disease. With that in mind, it becomes even more important to think about how we handle genetic test results.
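The gap between testing positive and actually developing a disease can be made concrete with Bayes' theorem. The sensitivity, specificity, and prevalence values below are made-up illustration numbers, not figures for any real genetic test:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability of developing the disease given a positive test."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical test: 99% sensitive and 99% specific, for a disease
# affecting 1 in 1,000 people.
ppv = positive_predictive_value(0.99, 0.99, 0.001)
print(f"{ppv:.1%}")  # prints 9.0%
```

Even with a seemingly excellent test, only about 9% of positives in this sketch would actually develop the disease, because false positives from the healthy majority swamp the rare true positives.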
About 10-15 years ago, it became popular for biotech companies to patent genetic sequences that they performed research on. However, many question whether this is ethical. Can someone own a patent on a genetic sequence? If you have a patented sequence in your own DNA, do you need to pay a royalty to a company? Fortunately, the US Patent and Trademark Office has made patenting DNA much more difficult, requiring that patent seekers prove the genetic fragment is of "specific and substantial utility that is credible." However, some argue that these rules are still too lax, and that patent holders hurt other research with their patents.
Over the last 30 years, biotechnology has completely reshaped the way we live our lives. A discussion of whether this has been a good or a bad thing appears in the Ethics section. Nevertheless, there have been many major advances since the advent of PCR and recombinant DNA techniques.
One of the most significant advances is the sequencing revolution. We can now easily sequence whole genomes upward of 3 billion bases, which gives us almost more information than we know what to do with. Most genome sequencing is done by "shotgun" sequencing chunks of DNA stored in BACs: fragments are sequenced at random positions scattered over large parts of the genome, much like how a shotgun scatters its shot. The National Science Foundation (NSF) has begun a project looking at sequencing multiple different species to recreate the "Tree of Life." This tree should not be confused with Shmoop's horror movie Tree of Death that opens next summer.
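The shotgun idea can be sketched in miniature: take short overlapping reads from a sequence, then greedily merge the pair with the longest overlap until one contig remains. Real assemblers are far more sophisticated (and handle errors, repeats, and billions of reads); the three reads below are made-up toy data:

```python
def overlap(a, b):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(fragments):
    """Repeatedly merge the pair of fragments with the largest overlap."""
    frags = list(fragments)
    while len(frags) > 1:
        n, a, b = max(
            ((overlap(a, b), a, b) for a in frags for b in frags if a is not b),
            key=lambda t: t[0],
        )
        frags.remove(a)
        frags.remove(b)
        frags.append(a + b[n:])  # join, keeping the overlap only once
    return frags[0]

# Three overlapping reads covering the sequence ATGGCCATTGTAACGTGG
reads = ["ATGGCCATT", "CCATTGTAA", "GTAACGTGG"]
print(greedy_assemble(reads))  # prints ATGGCCATTGTAACGTGG
```

The greedy strategy works here because each read overlaps its neighbor unambiguously; repetitive genomes are exactly where this simple approach breaks down, which is one reason assembly remains hard.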
Over 180 different species have had their genomes sequenced. While this research has proven to be interesting, the final usefulness of this sequencing effort has yet to be realized. One practical vision is that, in the near future, personal genomics will be a reality, where each person will have their genome sequenced, and we can easily tailor what types of medicines would be appropriate for an individual based on their genomic information.
The use of biotechnology has completely revolutionized medicine. It is now much easier to generate vaccines than it was previously, and more effective vaccines can be made through biotechnology. DNA technology and sequencing allow us to perform DNA tests to determine paternity, to calculate the probability of developing a genetic disorder, and even to identify a missing person or murderer. However, being a psychic detective has nothing to do with biotechnology. If an individual has a genetic disorder, we can repair it through gene therapy (see the "Spiderman and Other Examples of Recombinant DNA" section).
Plant biotechnology is one of the most discussed uses of biotechnology, with much work focused on improving crop yield by making plants resistant to pesticides, pests, droughts, and other environmental stresses. Other work includes increasing the amount of usable crop a plant will yield, or even improving the taste, texture, or appearance of food. Examples include modifying plants so that they do not spoil as quickly, or so that they make enzymes for food production. With all this biotechnology improving the quality of food, we at Shmoop like to use the good old-fashioned way of making vegetables more palatable: adding a lot of cheese.
One example of genetically modifying organisms to produce enzymes is the production of a "vegetarian" variant of rennet. Rennet is an enzyme used to coagulate cheese, and many vegetarians do not eat cheese made with animal rennet, which is derived from the stomach lining of calves and other young ruminants. Production of "vegetarian" rennet from E. coli is an active industry, as is production of maltogenic amylase, an enzyme that keeps bread fresh longer. These examples show how biotechnology improves food production and expands our opportunities to consume food.
Another intriguing use of biotechnology is the concept of "pharming," where plants and animals produce chemicals that can be used medicinally simply through their consumption. Examples of pharming include plants that make insulin or proteins for certain vaccines, as well as crops with added nutrients. "Golden rice" is an example of genetically modified rice that produces beta-carotene, a vitamin A precursor, and was intended for consumption in developing countries.
Biotechnology applications that have yet to be fully realized, but hold potentially amazing results, include the following: bioremediation, or the process of using bacteria to remove environmental pollutants; biocomputers, or computers that use biological molecules like DNA and proteins to perform computational tasks; and nanobiotechnology, or the nexus of nanotechnology and biotechnology. The much-discussed field of "cloning" whole organisms has had few successes, but the ethical ramifications of such work have drawn such ire that cloning research has progressed slowly. Or, maybe the dislike of clones is due to everyone hating Star Wars Episode II: Attack of the Clones.
It could be argued that the field of biology became the field of DNA research around the time of James Watson and Francis Crick's discovery of the structure of DNA. Since then, much of biology research has focused on learning more about manipulating DNA and the functions of DNA, from protein coding to evolutionary processes. While they, along with Maurice Wilkins, earned the Nobel Prize in 1962, another researcher, Rosalind Franklin, was not awarded the Nobel. Despite the essential nature of her work in the discovery of the crystal structure of DNA, she died 4 years before the award was given, and the Nobel Prize is not given posthumously.
Nevertheless, the crystal structure of DNA stands on the shoulders of many others who laid the foundation for the discovery that DNA is the fundamental molecule of life. While many know of Darwin and Mendel, few know of Hugo de Vries, Carl Correns, and Erich von Tschermak-Seysenegg. Their works together provided the link between Mendel's inheritance and Darwin's evolution, creating a gene model for natural selection. All these giants standing on giants must be obnoxious for the average person trying to watch the show behind them.
While we know that DNA is the molecule of life, how did we first find DNA? Friedrich Miescher was the first to crudely isolate DNA from pus, calling it nuclein. Emil Fischer and Albrecht Kossel were the first to chemically isolate the nucleotide bases in the late 1800s, including isolating guanine from sea bird poop. Finally, Phoebus Levene was the first to show that a DNA monomer is composed of a base, a sugar, and a phosphate group. All that scientific work in poop and pus, and yet few ever remember their names.
Though Thomas Hunt Morgan showed that hereditary information is found on the chromosome in the early 20th century, he was not the first to report it. Theodor Boveri showed that each chromosome houses different genetic information, and he also showed that each part of the chromosome is responsible for inheriting a certain gene. Thomas Hunt Morgan's work was the culmination of Boveri and Walter Sutton's work supporting the chromosome theory of inheritance.
Even after Watson, Crick, Franklin, and Wilkins' publications about the structure of DNA, work was required to show the mechanism of DNA replication. An experiment by Matthew Meselson and Franklin Stahl showed that DNA replication is semi-conservative, meaning that each daughter double helix contains one strand from the parental double helix. And, while Crick followed up on his double helix work to show that codons code for amino acids, it was the work of Marshall Nirenberg, Robert Holley, and Har Gobind Khorana that actually unlocked the genetic code.
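The genetic code that Nirenberg, Holley, and Khorana deciphered is, at heart, a lookup table from three-base codons to amino acids. Here is a minimal sketch using just a handful of entries from the standard code table (a full table has 64 codons):

```python
# A small subset of the standard genetic code, written as DNA codons.
CODON_TABLE = {
    "ATG": "Met", "TTT": "Phe", "GGC": "Gly", "AAA": "Lys",
    "TGG": "Trp", "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Read a coding sequence codon by codon until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return "-".join(protein)

print(translate("ATGTTTGGCTGA"))  # prints Met-Phe-Gly
```

Note that translation starts at ATG (which doubles as the methionine codon) and halts at a stop codon, which signals the ribosome to release the finished protein.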
Therefore, while biology textbooks make it sound like there were only 5 or 6 people who contributed to the discovery of DNA, inheritance, and chromosomes, you now know that there were many. Every major advance in DNA research has come on the shoulders of unknown or forgotten scientists. The work is not finished yet, and many questions remain over the role of epigenetics in evolution. Are DNA and histone modifications inherited from one generation to the next? How is this inheritance regulated? A recent study in rats showed that what can be learned in one generation can affect future generations, but how does this work, and is this applicable to humans? Biology is all about unanswered questions, and while we already know so much about DNA, we still understand little. Nevertheless, if you play your cards right, you too could be a footnote in history, if not a Nobel Prize winner.