When I am asked what kind of research my lab conducts, I try not to immediately talk about the penis bone. I begin by saying I study genetics or evolutionary biology. If pressed, I specify that my project revolves around bone development. In the few cases where someone asks for details, I resignedly inform them that my specific work centers around the development and evolution of the baculum, the mammalian penis bone.
The reactions to this response all begin the same way: “the what?” accompanied by a quizzical expression that is remarkably consistent among people. Confusion is usually followed either by a loss of interest and polite transition to a more normal topic, or a series of unsophisticated jokes at my expense. The latter reaction is always from my close friends and family. Once the laughter has died down, someone inevitably asks “Ok, but what do you want to do with that?”
Over the years, this question has become increasingly concerning to me, not because I don’t like answering it, but because of its ubiquity and what it represents. I love my research. I think it tackles a question that no one has been able to answer, namely the function of one of the most unusual bones in the mammalian skeleton, using the most advanced computational and molecular techniques available in modern science. One of our long-term goals is to generate a mouse without a baculum, which is exciting for me simply because it would be fun to see how the lack of a penis bone changes a mouse’s behavior and reproductive abilities. This rationale is enough to justify the intellectual effort that goes into designing these experiments, not to mention the physical effort it takes to carry them out. However, most modern scientific investigation is motivated by other factors. Science has become increasingly focused on applications; the questions deemed worth asking are those whose answers promise a substantial change in society. While this trend has resulted in many innovations and inventions, much has been lost along the way. The apparent effectiveness of this system will prove to be short-lived. Unless basic research is allowed to flourish once more, the forward march of science will stagnate, and society along with it.
The National Science Foundation (NSF) defines basic research as activity undertaken to “acquire new knowledge of the underlying foundations of phenomena and observable facts, without any particular application or use in view.” By contrast, applied research is defined as “original investigation […] directed primarily towards a specific, practical aim or objective.”1 While concise, these definitions represent two extremes of the research spectrum. Most modern scientific research does not fit neatly into either category but includes elements of both. Furthermore, even the purest “basic” research is often used by subsequent scientists to advance their “applied” research. In this way, the different varieties of research complement one another: basic research elucidates the detailed mechanisms behind the natural world, and applied research uses this understanding to advance society. However, recent trends in modern science seem to reveal a preference for applied research.
Consider citations in academic journals. One way to measure the impact of a scientific discovery is to count the number of times the article describing it has been cited. Let us narrow our focus to my own field of study, biology. In 1962, the British developmental biologist John Gurdon demonstrated that adult cells maintain the potential for pluripotency, the capacity to differentiate into every tissue type in an organism. His basic-research paper has been cited over 1,000 times, indicating its importance and impact.2 Nearly a half-century later, stem cell biologist Shinya Yamanaka of Kyoto University built on Gurdon’s work to artificially induce pluripotency in adult cells. He was able to exploit the potential for pluripotency that Gurdon originally identified, returning adult cells to a state closely mirroring that of embryonic stem cells.3 Yamanaka’s discovery is closer to the applied-research end of the spectrum because of its direct implications for the development of stem cell therapies. His 2006 paper has been cited over 21,000 times. From this comparison, it would seem that Yamanaka’s discovery was the more important one; his work appears to have influenced significantly more research than Gurdon’s. And yet Yamanaka’s experiment would have been impossible to conceive of without Gurdon’s initial one. The major clinical application of Gurdon’s insight did require Yamanaka’s work, but without Gurdon’s discovery Yamanaka would have had nothing to work with; the two are equally important. The Nobel Committee recognized the value of this relationship in 2012 when it awarded the Nobel Prize in Physiology or Medicine to both Gurdon and Yamanaka.4
Although illustrative, this comparison is incomplete without considering one more landmark paper. In 1951, Oliver Lowry, a biochemist at Washington University in St. Louis, completed an experiment that comes closest to the NSF’s definition of applied research. Lowry’s research had nothing to do with cell biology or medicine at all; rather, he and his team devised an improved method for measuring the protein content of a solution using the Folin phenol reagent.5 Unlike the research conducted by Gurdon or Yamanaka, their work was not motivated by a desire to obtain “new knowledge or understanding” but rather to create a tool that they believed would be useful to biochemists like themselves. Lowry’s paper has been cited a staggering 200,000 times.
When a scientific paper is published, the journal it appears in is often used as an indicator of how significant the research is. When they believe they have found something ground-breaking, researchers send their manuscripts to the most important, well-known journals, such as Nature, Science, and Cell. The editorial boards of these journals are staffed by important scientists and distinguished professors who decide whether or not the manuscript is right for their journal. The process is highly selective and entails strict review, because publication in a high-impact journal guarantees that a paper will be seen by the greatest number of people, increasing its impact tremendously. According to a series of recent case studies by Isobel Ronai of Columbia University, “90% of the articles on applied research [are] published in high-profile journals, compared to 20% of articles on basic biological research”.6 Apparently, it is easier to publish in a high-profile journal if the research is applied rather than basic. This is a result of the system through which journals like Nature become “high-profile”.
In the world of academic publishing, one metric is used above all others when considering the prestige of a journal. This number, called the “impact factor”, is essentially a quantification of how important the research published in a journal is to the scientific community at large. It is calculated by taking the citations that a journal’s articles from the previous two years receive in a given year and dividing that figure by the number of citable items the journal published over those same two years. In other words, it is a ratio that reveals how many times the average article in a given publication is cited.7 A high impact factor is a source of pride for a journal, often prominently displayed on its website, so editorial boards are naturally incentivized to do what they can to increase their impact factors while maintaining their academic rigor. Since applied research is cited considerably more frequently than basic research, as Ronai has shown, it follows that journals should preferentially publish applied research to maximize their impact factors and prestige. As a result, researchers who focus on applied research can more easily accumulate high-impact publications, which in turn facilitates the advancement of their academic careers and the procurement of research funding. Indeed, Ronai’s comparison of the number of patents and Nobel Prizes obtained by basic versus applied researchers suggests that the “basic phase […] is systematically less recognized than the applied phase.” Unfortunately, this system prioritizes the apparent impact of the research, often at the expense of the strict criticism that is the purpose of the peer-review process.8 This has led to several cases where fraudulent research has been published under the auspices of a prestigious journal, leading to widespread consequences.
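Written out with hypothetical numbers (purely for illustration, not figures from any actual journal), the calculation is simply: impact factor (2020) = (citations received in 2020 by articles published in 2018–2019) ÷ (citable articles published in 2018–2019) = 6,000 ÷ 1,500 = 4.0.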
The most infamous example of this is the tragedy of the modern anti-vaccination movement. Vaccine hesitancy, while not in itself a new phenomenon, has undergone a dramatic upsurge in the developed world. As a result of dropping vaccination rates, there have been numerous outbreaks of previously eliminated diseases, most notably measles.9 While the number of deaths from these outbreaks remains low, their increasing frequency and the possibility of other diseases resurging are alarming. Modern vaccine hesitancy is the direct result of a paper published in 1998 by the now-discredited physician Andrew Wakefield. The study supported a now-debunked claim that a common childhood vaccine could cause autism. Wakefield had been receiving private funding from anti-vaccination groups and employed fraudulent, unethical research methods that completely invalidated his conclusions. Replication studies from other research groups directly contradicted his findings, and the journal eventually retracted Wakefield’s paper, recognizing it as “utterly false”.10 Unfortunately, the damage was done. The journal this study was published in was The Lancet, the pinnacle of publication in academic medicine. The Lancet’s prestige helped propagate the falsehoods of Wakefield’s paper, giving new fuel to the anti-vaccination movement and igniting it once more in the western world.
In retrospect, the unethical and problematic experimental procedures in Wakefield’s study should have been caught during the peer-review process. That is the purpose of peer review: to ensure that only credible experiments and reliable results receive academic endorsement. And yet the paper was published without incident in arguably the most important medical journal in the world. The editors at The Lancet were likely motivated to publish the study because of its significant implications; publishing famous studies like this one is part of how journals like The Lancet maintain their powerful branding. In this case, however, that desire overwhelmed their commitment to peer review and academic integrity. As a result, the modern world is once again plagued by diseases that haven’t been seen in appreciable numbers since before the advent of widespread vaccination.
The bias in academic publishing towards applied research is mirrored by the process through which scientists receive funding for research in the United States. Modern scientific research is extremely expensive and is usually funded by grants that range from several thousand to several million dollars for two- to three-year projects. This money pays for raw materials, the use of research facilities, and staff salaries. The two main sources of funding for scientific research are the federal government and private industry. According to a report prepared by the Congressional Research Service, in 2015 the federal government provided roughly the same amount of funding for basic and applied research: $36.9 and $34.5 billion, respectively. Private businesses, however, provided $22.7 billion for basic research, compared to $51.7 billion for applied research.11 In other words, applied researchers in 2015 received roughly equivalent amounts of federal funding but more than twice as much funding from private sources. This discrepancy grants a serious advantage to applied researchers, who are able to pursue more ambitious, expensive projects with fewer financial limitations. Furthermore, receiving a major grant can significantly improve a researcher’s career prospects, especially in academia. The combination of high-impact publication bias and support from funding sources creates a system where applied researchers operate at a distinct advantage and applied research can flourish, unfortunately at the expense of basic research.
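Put as simple ratios using the report’s own figures, the imbalance is clear: federal funding for applied versus basic research was $34.5 billion ÷ $36.9 billion ≈ 0.9, essentially even, while private funding was $51.7 billion ÷ $22.7 billion ≈ 2.3, more than double.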
This observation is not particularly controversial. Most scientists recognize that applied research is seen as preferable, simply because it results in a definite application rather than new information that is often significant only to a few academics. This is especially true in medical science, where the emphasis is always on the translational potential of research: the ability of a discovery at the bench to be translated into a therapy at the bedside. University medical centers across the country devote resources to this ideal, building structures with names like “Center for Translational Research” or “Institute for Clinical and Translational Science.” The directorships of these institutes are prestigious, sought-after positions that attract scientists who are leaders in their fields. They in turn recruit bright young faculty and provide them with enough funding to get started until they can secure major grants of their own. In recent years, such an emphasis has resulted in numerous important discoveries that have changed the lives of countless patients. It seems that if this system continues, society can only benefit. However, historical examples predict a different outcome.
In the middle of the 20th century, the American research establishment seemed devoted to purely applied research to an even greater degree than it is now. The reason was World War II, which created a demand for new scientific discoveries to aid the war effort. In his sweeping history of cancer, Siddhartha Mukherjee of Columbia University describes how this wartime emphasis on application carried over into another war that began after the conclusion of World War II: the War on Cancer. Mukherjee writes of the early crusades against cancer, led by Harvard pathologist Sidney Farber and philanthropist Mary Lasker, who successfully campaigned to combat the disease not by studying it in detail but through imprecise barrages of various drugs, in an unsophisticated effort to discover what was effective.12 They reasoned that if they could find chemotherapies that were therapeutic, they didn’t need to understand why those drugs worked at all. It was enough that they did.
Although the NSF definition of applied research had not yet been written, Farber’s strategy is a clear example of it. He and his colleagues sought to combat the disease without understanding it, excited by the prospect of discovering therapies for cancer at a time before oncology was even a recognized medical discipline. Furthermore, Mukherjee shows that these early oncologists were aware that they were bypassing basic research; they recognized that they were operating without a solid scientific foundation, but decided to proceed anyway. What resulted was a style of cancer treatment that ultimately harmed patients as often as it helped them. The drugs had horrible side effects and survival rates were universally low. In the words of William Moloney, one of the pioneering oncologists of the time, “If we didn’t kill the tumor, we killed the patient.”13
It took another generation of oncologists to realize how short-sighted and ineffective this method of treatment was. In the meantime, a generation of cancer patients suffered unnecessarily until oncology shifted towards a more modern approach, one that uses basic research to identify cancer mechanisms and genes that can be targeted by specific drugs.14
The first generation of oncology treatments exemplifies the need for strong basic research in science. Without basic scientific research, applied research lacks a strong foundation. Scientists can forge ahead searching for applications in the same cavalier way demonstrated by the early oncologists. But eventually, they will reach a point where the scientific foundation behind their applications is inadequate, too sparse to support any further applications. At this point, they will be forced to return to basic research, and applied research will stagnate until enough basic research is conducted again. The recent trends in clinical trial failures may be an early indicator that this is already happening.
New medical treatments must pass through an extensive screening process overseen by the Food and Drug Administration (FDA) before they can reach patients in a clinic. These clinical trials are incredibly expensive, costing roughly $19 million per trial, according to a 2018 study by researchers at the Johns Hopkins Bloomberg School of Public Health.15 In the realm of drug development, less than 10% of the products that qualify for clinical trials will make it to market; the remaining 90% still incur significant costs before they fail.16 In a 2016 analysis of the factors that drive success in clinical trials, the Biotechnology Innovation Organization found that drug development programs that used selection biomarkers, genetic factors that sometimes allow researchers to predict how effective a drug will be in a specific patient, experienced significantly higher success rates at every phase of the process. However, only 5% of programs used this method.17 This is because the identification of biomarkers for a disease relies on a great deal of basic research. As the funding data above show, private businesses, like the pharmaceutical companies that develop new drugs, are hesitant to spend money on basic research. They would rather invest greater amounts of money in applied research, hoping to find the drug that works without completely understanding the biology behind it.
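A rough back-of-the-envelope calculation makes the stakes concrete. If we assume, purely for illustration, that every candidate entering trials incurs roughly the $19 million per-trial cost cited above and that only about one in ten succeeds, then the trial spending behind a single approved drug is on the order of $19 million ÷ 0.10 ≈ $190 million, before counting any of the research and development costs incurred outside the trials themselves.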
Rather than reject this system as unfair, basic scientists have found a way to adapt. Biologists who work in basic science have begun regularly emphasizing the clinical implications of their research. Most papers, not to mention grant applications, include sections on the translatability of any research. This leads to some particularly creative connections between basic research and possible applications. For example, most of my proposals for undergraduate fellowships include a portion on how discovering the origins of the penis bone could have important applications for osteoporosis treatments. While this line of reasoning has potential (osteoporosis can be caused by sex hormone deficiencies and sex hormones are central to baculum development), it is not a motivating factor in my lab’s research. And yet, every proposal I have ever written included this reasoning because it made the proposal more likely to be approved for funding. I believe this kind of rhetoric will continue to prevail in basic science proposals as a way to attract the attention of an establishment which seems committed to preferentially supporting applied research.
What this research environment implies is that basic research is somehow worth less than applied research. Even at the undergraduate level, we are being taught that manufacturing relationships between a basic research proposal and some application is necessary to grant value to the proposal. Basic research seems to be worth something only when it apes applied research. This scientific value system will create a generation of scientists fundamentally unmotivated by the desire to generate new discoveries for their own sake. Research of this variety, divorced from passion and devoid of creativity, will transform science into something impersonal, industrial, and unexciting.

It has been three months since I started seeing my girlfriend. Her questions about what exactly I do in my lab are becoming harder to evade. Like me, she is working towards her biology degree and hopes to attend medical school sometime soon. Unlike mine, her research focuses on the factors that drive angiogenesis in bones. The clinical applications are self-evident; the grants and papers almost write themselves. I’m sure The New England Journal of Medicine and The Lancet are fighting for the right to publish her lab’s work. One of these days, I will work up the courage to explain to her that I work on the mouse penis bone, what that means, and why it is important. I will tell her why I think basic scientific research matters, why my research does not need to have a direct application in medicine or industry to be interesting and worthwhile. Whether or not she is convinced, she will believe that I believe it is important. And she will know that, despite the apparent emphasis on applied research, there are still labs pursuing science in the name of science, seeking knowledge for its own sake.
1 Definitions of Research and Development: An Annotated Compilation of Official Sources, p. 2
2 Gurdon, 1962.
3 Takahashi and Yamanaka, 2006.
4 Colman, 2013.
5 Lowry et al., 1951.
6 Ronai and Griffiths, 2019.
7 Garfield, 1994.
8 Ronai and Griffiths, 2019.
9 Hussain et al., 2018.
10 Hussain et al., 2018.
11 Sargent, 2018.
12 Mukherjee, p. 121.
13 Mukherjee, p. 143.
14 Mukherjee, p. 217.
15 Moore et al., 2018.
16 Hay et al., 2014.
17 Thomas et al., 2016, p. 18.
References
Colman, A. (2013). Profile of John Gurdon and Shinya Yamanaka, 2012 Nobel Laureates in Medicine or Physiology. Proceedings of the National Academy of Sciences, 110(15), 5740–5741. doi: 10.1073/pnas.1221823110
Definitions of Research and Development: An Annotated Compilation of Official Sources. National Science Foundation, March 2018. Retrieved from https://www.nsf.gov/statistics/randdef/rd-definitions.pdf
Garfield, E. (1994). The Thomson Reuters Impact Factor. Thomson Reuters. Retrieved from https://clarivate.com/webofsciencegroup/essays/impact-factor/
Gurdon, J. B. (1962). The Developmental Capacity of Nuclei taken from Intestinal Epithelium Cells of Feeding Tadpoles. Development, 10(4), 622–640. Retrieved from https://dev.biologists.org/content/10/4/622.article-info
Hay, M., Thomas, D. W., Craighead, J. L., Economides, C., & Rosenthal, J. (2014). Clinical development success rates for investigational drugs. Nature Biotechnology, 32(1), 40–51. doi: 10.1038/nbt.2786
Hussain, A., Ali, S., Ahmed, M., & Hussain, S. (2018). The Anti-vaccination Movement: A Regression in Modern Medicine. Cureus, 10(7). doi: 10.7759/cureus.2919. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6122668/
Lowry, O. H., Rosebrough, N. J., Farr, A. L., & Randall, R. J. (1951). Protein Measurement With the Folin Phenol Reagent. Journal of Biological Chemistry, 193, 265–275. Retrieved from http://www.jbc.org/content/193/1/265.long
Moore, T. J., Zhang, H., Anderson, G., & Alexander, G. C. (2018). Estimated Costs of Pivotal Trials for Novel Therapeutic Agents Approved by the US Food and Drug Administration, 2015-2016. JAMA Internal Medicine, 178(11), 1451. doi: 10.1001/jamainternmed.2018.3931
Ronai, I., & Griffiths, P. E. (2019). The Case for Basic Biological Research. Trends in Molecular Medicine, 25(2), 65–69. doi: 10.1016/j.molmed.2018.12.003
Takahashi, K., & Yamanaka, S. (2006). Induction of Pluripotent Stem Cells from Mouse Embryonic and Adult Fibroblast Cultures by Defined Factors. Cell, 126(4), 663–676. doi: 10.1016/j.cell.2006.07.024
Thomas, D. E., Burns, J., Audette, J., Carroll, A., Dow-Hygelund, C., & Hay, M. (2016). Clinical Development Success Rates 2006-2015, 1–26. Biotechnology Innovation Organization. Retrieved from https://www.bio.org/sites/default/files/Clinical Development Success Rates 2006-2015 – BIO, Biomedtracker, Amplion 2016.pdf
Sargent, J. F. (2018). U.S. Research and Development Funding and Performance: Fact Sheet. Congressional Research Service. Retrieved from https://fas.org/sgp/crs/misc/R44307.pdf