Wednesday, October 13, 2010

Out with the old, in with the new

Cancer has become this generation’s crusade, and in any crusade, an army needs weapons to fight with. For years, that weapon has been biological experimentation and observation, and it has worked quite effectively. However, with an influx of powerful new technologies, traditional methods are being challenged by newer and possibly more efficient ones, including wide-scale data harvesting performed by computers. In Robert Weinberg’s article, “Point: Hypothesis First,” he argues that while technology-based data collection may seem like the best approach, “From a cancer researcher’s perspective, the successes of hypothesis-driven science are clear and undeniable.” I believe, however, that instead of relying on traditional experimentation methods for cancer research, as Weinberg argues, the scientific community should take full advantage of the technology available to generate large amounts of data and more efficiently combat diseases that kill hundreds of thousands of Americans each year.

In his article, Weinberg claims that while this new technology seems like an obvious route to pursue, it shouldn’t replace smaller, hypothesis-driven projects. He states that hypothesis-based experimentation has worked for decades and continues to yield the most beneficial results in the fight against disease. He attacks large-scale data efforts, saying they are costly and time consuming yet only occasionally result in beneficial discoveries. He states, “These massive data-generating projects have yet to yield a clear consensus about how many somatic mutations are required to create a human tumour, and have given us few major breakthroughs in our understanding of how individual tumors develop.” Weinberg’s overall point is that while data harvesting may seem like an attractive option, it is ultimately risky: to undertake data-harvesting efforts, smaller experiments must be displaced, and the truth is that we don’t yet know exactly how effective data harvesting is.

Biology is a science that can be traced all the way back to ancient times. The ancient Greeks, Egyptians, and Sumerians all studied the basics of life, questioning what exactly makes things “work.” Originally, biology was a purely descriptive science, and one of the first people to make a truly significant contribution to it was Alcmaeon in the 5th century BCE, among the first to practice dissection. Alcmaeon made strides in discovering how humans function internally through observation and description. He observed that when a person suffers a blow to the head, they have trouble thinking and performing, leading him to conclude that thought resides in the brain. In his dissections, he also discovered the connections between the eyes and the optic nerves, and between the ears and the Eustachian tubes, leading to the brain. These findings contradicted the view, later defended by Aristotle, that intelligence was seated in the heart.

During the 20th century, scientists began to break complex biological systems down into individual areas of study, and biology became less of a descriptive science, moving toward hypotheses and experimentation. This evolution made it clear that observation alone was not enough without the support of experimentation, and the field of biology was permanently changed. Biology as an experimental science has led to thousands of contributions to society, especially in the fight against disease and other threats to humanity. But fields that evolve rarely do so just once. The most recent advances in biological study have led to a new form of research built on data and powerful computers, and I believe these data-driven methods offer greater hope for combating cancer than more traditional ones.

Todd Golub’s article “Counterpoint: Data First” directly refutes Weinberg’s argument that traditional experimentation is more effective. Golub notes, “Despite decades of research, cancer-related death and suffering remains a massive public-health problem, with half a million deaths each year in the United States alone.” Golub cites several examples of how effective data harvesting can be, including the story of one of the most famous genome-inspired drugs, imatinib, or “Glivec.” This is the primary drug used to treat chronic myeloid leukemia, replacing risky bone marrow transplants. The discovery of the drug began in the 1960s, when researchers noticed genetic abnormalities under the microscope. Traditional, hypothesis-driven experimentation was used for decades; it wasn’t until the 1990s, with the introduction of powerful data-harvesting computers able to sequence entire genomes, that the drug could be developed. This is a case where traditional methods could only do so much, yet newer technological advances enabled researchers to make unprecedented strides. Advances in cancer research like this provide hope for the millions of people who have cancer or are at risk of it. Sequencing genomes is quick, efficient, and can ultimately be even less expensive than traditional methods.

Perhaps the most famous example of the potential of data harvesting is the Human Genome Project (HGP). The HGP officially began in 1990; its planning stages began in the 1980s, and its conceptual roots can be traced back even farther, to the beginning of the 20th century. A draft of the human genome was published in 2001, the project was declared complete in 2003, and its results have opened the door to many advances in biological research. According to the U.S. Department of Energy Office of Science (USDEOS), the project will benefit a wide range of fields, including molecular medicine, energy and environmental applications, risk assessment, anthropology, forensics, agriculture, and more.

The field most influenced by the information gained from the HGP is certainly molecular medicine; according to the USDEOS, this should mean “improved diagnoses of disease, earlier detection of genetic predispositions, drug design, and pharmacogenomics or ‘custom drugs’.” Without the HGP, we would surely be farther behind in many of these areas than we are now. These potential products of data harvesting are far too important to be ignored simply because the method of obtaining them carries some risk. Sometimes risks must be taken to make significant advances, and the human genome is a perfect example.

It is clear that hypothesis-driven experimentation is our primary tool for biological research, at least for the time being. However, with recent advances in large-scale data harvesting, it is also apparent that using computers to generate massive amounts of data will become vastly more effective and efficient than traditional methods.

http://www.nature.com/nature/journal/v464/n7289/full/464678a.html
http://www.nature.com/nature/journal/v464/n7289/full/464679a.html
http://www.ornl.gov/sci/techresources/Human_Genome/home.shtml
http://www.ornl.gov/sci/techresources/Human_Genome/project/benefits.shtml
http://www.historyworld.net/wrldhis/PlainTextHistories.asp?historyid=ac22
http://www.genome.gov/12011239
