A large part of what makes species look and act differently is coded in their genetic material. What genes make species look different? A possibility is to have unique genes that make a fly, a mouse or a human look like a fly, a mouse, or a human, respectively. As it turns out, you have a lot more in common with a fly and a mouse than you think…
Research has shown that genes determining how major body parts develop are almost the same across very different species. A really cool example is a gene involved in eye development, known across species as Pax-6. In humans, this gene is called aniridia (an for short); in mice, it's Small Eye (or Sey); and in the fruit fly Drosophila, it's eyeless (Ey). Despite the differences in name, it is the same "master control" gene for eye development across these widely disparate organisms. In humans, defects in this gene cause a disorder called aniridia (absence of the iris and other eye problems), and in flies it can cause a range of eye mutations, including lack of eyes. The fly, mouse, and other species' versions of this gene are so similar that they are virtually interchangeable: in some cases, over 90% of the sequence is identical! Expression of either the fly or the mouse Pax-6 can make eyes grow anywhere in the fly: on its legs, for example (Halder, Callaerts, & Gehring, 1995).
So you have at least one gene that is almost the same as in a mouse and in a fly, yet your eyes do not look like mouse or fly eyes. That is because Pax-6 is one of many genes in a complex pathway that leads to the development of eyes. How is it possible that such disparate species can be so similar at the genetic level? Millions of years ago, Pax-6 evolved in an ancestor shared by flies, mice, and humans. Research on Pax-6 is part of a body of work suggesting that eyes, despite their enormous diversity, evolved only once in animals (Gehring, 2005; Gehring & Ikeo, 1999).
And the Pax genes aren't the only ones you share with other creatures: did you know you also have one named after Sonic the Hedgehog (Shh), that's involved in regulating body development in embryos? And why Sonic? Because a mutation in the Drosophila version of the gene makes the fruit flies look all prickly, just like little hedgehogs (Arthur, 2011).
Genetics is an essential element in the process of evolution. Many of the differences within and between species are the result of differences in the genes they carry – a consequence of the evolutionary processes directly affecting genes: mutation, recombination, and selection. These processes affect all living organisms, including humans.
You might think that because of advances in fields such as medicine and technology, humans are not evolving any longer. Sure, it's not the same situation as in early human history. Now that the human population is so large, most of us don't have to worry about being preyed upon by wild animals or about deadly diseases such as the bubonic plague ravaging whole towns, so it is possible that selection processes are somewhat relaxed (although it is worth remembering that many of the people alive today are descended from those who survived the Spanish influenza outbreak of 1918). But as long as not everyone reproduces equally, it is likely that our gene pool is changing over time, and thus humans are technically evolving. While frequencies of different alleles might vary, such changes might not bring about particularly drastic changes in how we look or behave.
Could humans evolve into a superhuman species further down the road? It's rather unlikely. First of all, evolutionary processes have no directionality, and no progression towards a perceived "better" stage. Also, while there is plenty of genetic variation in humans, the large population size, coupled with the lack of evidence for strong selection of particular genotypes, suggests the human species is likely to stay about the same for the time being. Some studies, however, have found the signature of selection at particular loci in humans. A fragment of chromosome 17 is commonly "flipped backwards," or inverted. The frequency of this inversion varies across human populations. A study in Iceland, where the inversion is common, found that women carrying this chromosome have about 3.5% more children than women who do not (Stefansson, et al., 2005).
Some people, on the contrary, think we might actually be evolving towards a more "simple" human, rather than a superhuman. Just as evolution might give rise to new traits, it might also lead to a loss of traits. This is the case for cavefish – fish species or populations that inhabit caves and live in complete darkness. As they no longer have any real need for their eyes, they have simply lost them. So could humans potentially lose body parts because we don't need them? Even if technological advances might allow us to never leave the house, or decrease the need to move around, it is rather unlikely that our bodies will change much. Otherwise, we would probably have already lost our appendix! Medicine so far has not found much of a use for this tiny pouch attached to the large intestine. On the contrary, its inflammation, known as appendicitis, is potentially lethal.
We now know so much about the human genome: it is made of approximately 3.2 billion base pairs of DNA, which contain the information for about 25,000 genes. Less than 0.1% of the genome varies across humans, and thus this incredibly small fraction is responsible for many of the differences we observe across races and individuals (US Department of Genome Programs, 2010)! In the not so distant future, it will be possible to have your own genome sequenced, and you will be able to look at many, many pages covered in As, Ts, Cs, and Gs, containing all the genetic information about you - your very own genetic blueprint. But can you learn everything about you from looking at these pages? No, not really. Variation in the series of base pairs making up your genome does determine a lot about you, but because of the complex processes that lead from genotype to phenotype, and because not all characters have a genetic basis, it is impossible to know everything about someone simply from a printout of their genome. For example, we would definitely know your blood type, and whether you might be able to taste the bitterness of Brussels sprouts, and we'd be able to take an educated guess at your height and your hair color. But we wouldn't know what your favorite kind of music is, or what languages you speak just from looking at your DNA.
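To get a feel for the numbers quoted above, here is a back-of-the-envelope sketch: even though less than 0.1% of the genome varies between people, 0.1% of 3.2 billion base pairs is still millions of positions.

```python
# Rough arithmetic using the figures cited in the text.
genome_size = 3_200_000_000   # ~3.2 billion base pairs in the human genome
variable_fraction = 0.001     # less than 0.1% varies across humans

differing_bases = int(genome_size * variable_fraction)
print(f"Positions that may differ between two people: up to ~{differing_bases:,}")
```

So the "incredibly small fraction" still amounts to roughly 3.2 million base pairs, which is why it can account for so much individual variation.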
Let's pick a well-studied gene to see how small changes in sequence might lead to major differences in phenotype, while affecting many processes. Fibroblast growth factor receptor 3, or FGFR3, is involved in bone development and maintenance. A common point mutation at nucleotide 1138 in this gene replaces a G with an A. This mutation, or SNP (single nucleotide polymorphism), changes a codon, originally GGA, into AGA (National Center for Biotechnology Information, 2010). This also changes the corresponding amino acid from a glycine into an arginine, and thus affects the structure of the protein FGFR3 codes for. This modification in structure affects how the protein functions. Because the FGFR3 products behave differently, bone development does not follow the usual trajectory. Individuals born with this mutation have a genetic disorder called achondroplasia: short stature because of shortening of the limbs, amongst other symptoms. Achondroplasia is the most common genetic disorder associated with short stature and occurs in between 1 in 15,000 and 1 in 40,000 births. Most often (approximately 80% of cases) it is a "de novo" mutation – not inherited from the parents, but appearing anew in the offspring. The mutation usually occurs in the father's sperm (National Center for Biotechnology Information, 2010).
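The codon swap described above can be sketched in a few lines of code. The tiny lookup table below is just an excerpt of the standard genetic code, enough to show how a single G-to-A change at the first position of the codon switches the encoded amino acid:

```python
# Minimal excerpt of the standard genetic code, covering only
# the two codons relevant to the FGFR3 example in the text.
CODON_TABLE = {"GGA": "Gly (glycine)", "AGA": "Arg (arginine)"}

normal_codon = "GGA"                    # wild-type codon in FGFR3
mutant_codon = "A" + normal_codon[1:]   # the G→A point mutation

print(normal_codon, "->", CODON_TABLE[normal_codon])  # GGA -> Gly (glycine)
print(mutant_codon, "->", CODON_TABLE[mutant_codon])  # AGA -> Arg (arginine)
```

One changed letter, one changed amino acid – and from there, a changed protein structure, altered function, and a different developmental outcome.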
So one little change, a single letter in the 3.2 billion that constitute the human genome, can affect many levels of organization in an organism; this single nucleotide base difference changes codon identity, protein structure and function, bone development, and overall physical appearance. And yet, because of the many factors involved in the translation of genotype into phenotype, not every variation in the coding sequence results in such a dramatic and predictable phenotypic effect.
Throughout history, genetics have often been at the center of controversy. The desire to improve humankind has been present in many societies and different times. Sir Francis Galton coined the term eugenics in 1883: a formalized field of study aiming to improve hereditary traits in humans. He based this theory on extrapolations from some of the publications of his relative, Charles Darwin. Darwin wrote about how species evolve through the process of natural selection: as resources are limited and some individuals do better than others, those better suited to a particular environment tend to produce more offspring, and thus their genetically determined traits tend to become more prominent in the next generation. Darwin also showed how this process is similar to the artificial selection humans have used to generate varieties of domestic animals; he made a great study of the various fancy pigeon breeds, looking at how breeders picked particular wild rock pigeon individuals bearing unusual traits (such as ruffled necks and particular color patterns) to breed from and develop new varieties. Galton believed similar processes could enhance desirable traits, such as higher intelligence, in humans.
Eugenics inspired many social and political movements around the world during the 19th and 20th centuries. After the Second World War, however, eugenics started losing momentum, and in modern times the term has a negative connotation.
Probably one of the most prominent and deadly adoptions of eugenic principles occurred in Europe during the Nazi era. In the Nazi ideology, the advent of medical care and modern comfort had eliminated the natural processes keeping the human species fit. They believed that "defective" individuals would not have survived under normal conditions and had become an expensive and unnecessary drain on society. This misinterpretation of Darwin's "survival of the fittest" is at the core of one of the darkest episodes of modern history.
Both positive and negative eugenic policies were adopted in Germany and the countries they invaded. "Positive" practices aimed at favoring reproduction of individuals bearing desirable traits, whilst "negative" practices sought to limit the reproduction of, or simply to eliminate altogether, individuals bearing "undesirable" traits. People judged as superior representatives of the master race, the Aryans, were encouraged to reproduce more, while people seen as inferior were sterilized and euthanized by the hundreds of thousands.
In the United States, eugenics was adopted after the Civil War (Allen, et al., 2010). Economic and social instability, coupled with the new challenges of industrialization, brought forth progressivism. With this new general atmosphere of reform and trust in science came the idea of "social engineering" to manage and shape a better society. At the time, eugenicists believed that many social issues, such as criminality, alcoholism, and pauperism, could be explained by the inheritance of defective genetic material, or "germ plasm." As these "affected" individuals were often supported through public welfare and the state, their reproduction brought a large cost to society that could be avoided by mandatory sterilization. The views of eugenicists in the US also lined up with the interests of those alarmed by the rise of the socialist party and the strengthening of labor unions. The blame in this case was placed on immigrants from southern Europe, who were believed to carry defective germ plasm and associated characteristics that should not be allowed to enter the American gene pool.
Unfortunately, eugenic ideology tainted policies as well as American popular culture throughout a large part of the 20th century. Immigration restrictions were adopted to keep out immigrants whose genetics were judged inferior. More than 60,000 Americans were sterilized involuntarily. In fact, sterilization in institutions housing patients with mental problems was not banned in some states until the 1970s. Eugenics also provided new, supposedly "scientific" arguments in favor of racial segregation. People were also influenced by eugenics in their daily lives. At a state fair, you could sign up for "Fitter Families Contests" where you and your family members could be examined and rewarded for "superior" genetic heritage. You could read the latest on eugenics in the Eugenical News published by the Galton Society.
Much of the science conducted by eugenicists was inherently flawed. The central goal of eugenics is an attempt to apply Mendelian laws to human characteristics. To do so, researchers attempted to trace the inheritance of a trait through pedigrees or family trees. The studied traits were expected to fit the simple Mendelian scenario of one locus, two alleles, autosomal or sex-linked, with simple dominant and recessive relationships. But most of the traits eugenicists were interested in were complex in nature, often difficult to quantify (for example, sense of humor and self respect), and in some cases probably did not have much of a genetic basis (pauperism). Thus, their observations and conclusions were often misleading.
Despite the rejection of eugenic science, some of its ideas remain in the public mind, even today. We often hear how something making a particular group of people "inferior" or "superior" has a solely genetic basis. Shows about crime and its prosecution sometimes have episodes where criminality can allegedly be explained by a single gene. Studies looking at IQ have estimated its heritability could be as low as 40%. In this case, someone's IQ is mostly determined by the environment they grow up in. Thus education, early stimulation, diet, or the myriad other factors that have been implicated in IQ development could make a large difference. Other studies estimate IQ heritability to be as high as 80%. In this case, genetic factors greatly influence any variation in IQ, while the environment plays only a small role. These contrasting findings have led to various interpretations of variation in IQ across different races that have fueled racial debates. However, most scientists agree that there are no significant genetic differences in human intelligence at the race level; science provides absolutely NO justification for discrimination. Extrapolations of scientific findings should be interpreted with care to avoid falling into the same patterns that have proved so detrimental in the past.
Unfortunately, not everyone can stomach a three-scoop sundae covered in chocolate fudge and whipped cream – some people get rather discomforting digestive issues we don't want to describe in detail here. This is very often the consequence of lactose intolerance: the inability to break down lactose, a common sugar in milk and milk-derived products. Lactose intolerance in adults is often viewed as a disease, yet this condition is the norm not only among humans, but among all mammals.
The lactase enzyme in intestinal cells allows young mammals to digest milk. After weaning, however, its expression decreases dramatically as lactose is no longer an essential part of a mammal's diet. Humans, on the other hand, are rather unique mammals: some of us like milk with our breakfast cereal and in our coffee even as adults. Most humans are lactose intolerant to some degree in adulthood (in fancy terms, this condition is known as "lactase nonpersistence" or "adult hypolactasia"). As we've already said, this is because the body naturally makes less and less lactase as you get older: milk no longer forms the major part of your diet (unlike when you were a baby), and it's wasteful in terms of resources to keep on making the same amount as when you were dependent on milk for all of your nutrition. But in some people this happens at a very young age, sometimes as early as two or three years old; this type of early lactose intolerance is genetically inherited. It's even possible for newborn babies to be lactose intolerant due to other inherited gene mutations, meaning that there's either little or no lactase, or the lactase produced (in normal quantities) just doesn't work. And, as you can imagine, if there are some forms that don't work at all, then there are also some forms that work even better than average, so some people's lactase is just more effective than others', even if there's only a small amount of it being made. As ever, environment can have an effect, too: some women who are lactose intolerant regain their ability to happily process dairy products while they're pregnant! Some drug treatments, such as chemotherapy and antibiotics, can also cause lactose intolerance, as can some illnesses, like Crohn's disease.
In some Northern European, African, and Middle Eastern populations and their descendants, lactase activity persists into adulthood, making them lactose tolerant. This condition is genetically determined, with lactose tolerance being the dominant trait. The mutations that gave rise to this new phenotype evolved recently in evolutionary terms; research suggests it appeared in Europe only within the last 8,000 years (Burger, et al., 2007). Also, lactose tolerance has evolved in humans more than once: in other words, the same phenotype arose in different populations independently – the mutations that allow Northern Europeans to digest lactose in adulthood are different from those that let East African or Saudi communities enjoy milk (Enattah, et al., 2008; Tishkoff, et al., 2007). Lactose tolerance in all these different groups most likely evolved through natural selection because of the many adaptive benefits of milk consumption. Thus losing the ability to produce lactase as you get older is considered the "wild-type" phenotype, while persistent lactose tolerance, the rare and more recently evolved trait, is the "mutant" phenotype.
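Since the text describes lactose tolerance as a dominant trait, a quick Punnett-square sketch shows what that implies for the children of two tolerant carriers. The allele letters "T" and "t" here are illustrative placeholders, not the real nomenclature for the LCT-region variants:

```python
from itertools import product
from collections import Counter

# Illustrative alleles: "T" = tolerance (dominant), "t" = intolerance (recessive).
parent1 = ("T", "t")   # heterozygous, lactose tolerant
parent2 = ("T", "t")   # heterozygous, lactose tolerant

# Each child gets one allele from each parent; sort so "tT" and "Tt" collapse.
offspring = Counter("".join(sorted(pair)) for pair in product(parent1, parent2))
print(dict(offspring))  # {'TT': 1, 'Tt': 2, 'tt': 1}

# Any genotype with at least one dominant "T" allele is tolerant.
tolerant = sum(n for geno, n in offspring.items() if "T" in geno)
print(f"{tolerant}/4 offspring expected to be lactose tolerant")  # 3/4
```

This is the classic 3:1 Mendelian ratio for a single locus with complete dominance – the simple scenario that, as noted earlier in the eugenics discussion, fits some traits (like this one) but badly misleads when forced onto complex ones.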