
The Human Genome Project Ten Years Later

Scientific American recently reported on what has transpired since the completion of the Human Genome Project ten years ago. When the HGP's draft sequence was announced in 2000, many scientists said it would be the key to understanding disease and to developing cures. Ten years later, however, this has not been the case. The HGP has driven better research and technology, particularly in our ability to sequence genes. It has also shown us that much of what we once considered junk DNA isn't really junk at all. (See here, here, here, and here for past ENV discussions on junk DNA.) But scientists are coming to the sobering conclusion that their models of, and assumptions about, the nature of disease may be mistaken.


Perhaps the key to disease is not found in the common variants in the genetic code. New research suggests that the current view of disease may be too narrow. In a controversial article published in the April 2010 issue of Cell, Jon McClellan and Mary-Claire King suggest "…that complex human disease is in fact a large collection of individually rare, even private, conditions." Their work, along with several other studies, suggests that many factors may contribute to disease. Just because an individual carries the same base-pair mutation as other individuals with the same disease does not mean that this one mutation produces the drastic effects seen in the disease. Conversely, an individual who lacks a particular mutation may still develop the disease. A complex of other factors may be at work, whether environmental or "epigenetic," having to do with how parts of the genome interact.
The idea that common mutations indicate the cause of disease is known as the common variant hypothesis, and it has been the predominant view among geneticists for many years. As the Scientific American report suggests, the assumptions behind the common variant hypothesis rested on evolutionary presumptions:

The belief that common variants would be helpful in understanding disease had a certain evolutionary logic. The rapid and recent population explosion of ancestral humans tens of thousands of years ago “locked” many variants in the human gene pool, Lander says. The bet was that these common variants (“common” usually meaning appearing in at least 5 percent of a given population) would be fairly easy to find and that a relatively small number of them (from several to perhaps dozens) would shape our susceptibility to hypertension, dementias and many other widespread disorders. The disease-related genetic variants and the proteins they encode, as well as the pathways in which they played crucial roles, could then become potential targets for drugs.
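
To make the quoted 5 percent threshold concrete, here is a minimal sketch (not from the article; the variant names and counts are hypothetical) of how a variant would be classified as "common" or "rare" by minor allele frequency:

```python
# Minimal sketch: classifying variants by minor allele frequency (MAF).
# The 5% cutoff follows the definition quoted above; the variant data
# below are hypothetical and purely illustrative.

COMMON_MAF_THRESHOLD = 0.05  # "common" = appears in at least 5% of a population

def minor_allele_frequency(allele_counts):
    """Return the frequency of the less common allele.

    allele_counts: (count of allele A, count of allele B) in the sample.
    """
    total = sum(allele_counts)
    return min(allele_counts) / total

def classify_variant(allele_counts):
    """Label a biallelic variant 'common' or 'rare' by the 5% MAF rule."""
    maf = minor_allele_frequency(allele_counts)
    return "common" if maf >= COMMON_MAF_THRESHOLD else "rare"

# Hypothetical variants: allele counts in 1,000 sampled chromosomes.
variants = {
    "rs_hypothetical_1": (920, 80),  # MAF = 0.08 -> common
    "rs_hypothetical_2": (997, 3),   # MAF = 0.003 -> rare, nearly "private"
}

for name, counts in variants.items():
    print(name, classify_variant(counts))
```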

McClellan and King are not the first to suggest that common variants may lack the biological effects some have attributed to them. Scientific American quotes Kenneth Weiss and Joseph Terwilliger, who pointed out in the 1990s that if a common variant actually did powerful harm, natural selection would have weeded it out of the population. They suggested instead that diseases may arise from many rare disease-promoting variants. This was not the popular notion at the time, however, and their arguments were dismissed.
Now there is a debate between two camps: one says the common variant hypothesis is still valid and simply needs more time for gene sequencing; the other considers it a complete failure. From the first camp, the article quotes an interview with Francis Collins, who points out that scientists have "figured out" how almost 1,000 of those common gene variants "play a role in the risk of disease, and we have used that information already to change our entire view of how to develop new therapeutics for diabetes, for cancer, for heart disease." Counterexamples include work on type 2 diabetes, where geneticists have found 18 common variants associated with the disease; yet these variants account for only a small portion (the article cites 6 percent) of its heritability. Furthermore, they do not account for any of the "causal biology."
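
As a back-of-the-envelope illustration of what "accounting for 6 percent of heritability" means, consider a simple additive model, a standard quantitative-genetics approximation and not anything the article derives, in which a biallelic variant with minor allele frequency p and standardized per-allele effect beta explains roughly 2p(1-p)·beta² of phenotypic variance. The frequencies and effect sizes below are hypothetical, not the actual diabetes variants:

```python
# Toy illustration of "variance explained" under an additive model.
# For a biallelic variant with minor allele frequency p and standardized
# per-allele effect beta, variance explained ~= 2 * p * (1 - p) * beta**2.
# All numbers are hypothetical; these are NOT the real type 2 diabetes variants.

hypothetical_variants = [
    # (minor allele frequency, per-allele effect on a standardized phenotype)
    (0.30, 0.06),
    (0.12, 0.08),
    (0.25, 0.05),
] * 6  # pretend we found 18 such small-effect common variants

def variance_explained(p, beta):
    return 2 * p * (1 - p) * beta**2

total = sum(variance_explained(p, b) for p, b in hypothetical_variants)
print(f"Phenotypic variance explained by all 18 variants: {total:.3f}")

# If the trait's heritability were, say, 0.40, these variants would capture
# only a small share of it -- the rest is the "missing heritability."
print(f"Share of an assumed 0.40 heritability: {total / 0.40:.1%}")
```

With these made-up numbers the 18 variants jointly explain only a few percent of an assumed heritability, which is the shape of the puzzle the article describes: statistically real associations that leave most of the heritable risk unexplained.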
It turns out that not all diseases are, as Tay-Sachs and cystic fibrosis are, the consequence of a few wrong base pairs. While the reports in 2000 said the human genome would be the key to finding cures for cancers, Alzheimer's, diabetes, dementia, and a slew of other presumably heritable diseases, scientists have thus far identified genetic markers for only a few diseases, let alone developed cures. Even the recent strides made with BRCA1 and BRCA2 analysis do not pinpoint the cause of breast or ovarian cancer in all patients. And some people who carry the markers never develop breast or ovarian cancer, which means that other factors, genetic, environmental, or both, are at play in the same disease. This calls into question the standard assumptions regarding genetics and disease.
From the article (emphasis added):

A FEW BRAVE VOICES ARE suggesting that the rabbit’s hole of human biology may go still deeper than a focus on DNA sequences and proteins can reveal. Traditional genetics, they say, may not capture the molecular complexity of genes and their role in disease…Put simply, the very definition of a gene–not to mention a medically significant gene–is now vexed by multiple layers of complexity. What was once assumed to be a straightforward, one-way, point-to-point relation between genes and traits has now become the “genotype-phenotype problem,” where knowing the protein-coding sequence of DNA tells only part of how a trait comes to be.

The article then cites examples of how one gene may depend on certain variants within its proximity, and suggests that common diseases may be due to an ensemble effect of multiple gene variants. This is the idea behind the study of epigenetic factors, and it has been of interest to many within the intelligent design community. From a design perspective, a highly complex organism, such as a human being, cannot be understood by reducing it to its constituent parts (i.e., its chemistry). Given the presupposition that a complex organism is engineered, the parts themselves must be taken within the context of the whole. So while it may be news that traditional genetics does not account for the complexity of interactions being discovered in the genome, this has been an ongoing area of research for scientists at Biologic Institute.
A reductionist, neo-Darwinian view of the human genome does not seem to have explanatory power regarding disease. This is a clear case where some scientists' dedication to the presupposition that similar DNA sequences mean whole-organism similarity, the same presupposition many have used to argue for evolutionary relatedness among species and against man being anything more than another animal, has, 138 million dollars later, led to more dead ends than cures.

Evolution News

Tags

Junk DNA, Science, Unlocking the Mystery of Life