Taking Risks with Translational Research


Science Translational Medicine  24 Mar 2010:
Vol. 2, Issue 24, pp. 24cm10
DOI: 10.1126/scitranslmed.3000667


The recent bankruptcy filing by deCODE, a company with an exceptional pedigree in associating genetic variance with disease onset, highlights the commercial risks of translational research. Indeed, deCODE’s approach was similar to that adopted by academic researchers who seek to connect genetics and disease. We argue here that neither a purely corporate nor a purely academic model is entirely appropriate for such research. Instead, we suggest that the private sector undertake the high-throughput elements of translational research, while the public sector and governments assume the role of providing long-term funding to develop gifted scientists with the confidence to attempt to use genetic data as a stepping stone to a truly mechanistic understanding of complex disease.


The launch of Science Translational Medicine provides an excellent opportunity to discuss the impact of the translational badge on science funding and career structures. Accumulating knowledge and applying it to advance our understanding of disease is a natural consequence of scientific research. The current impetus to define some research areas as translational has developed predominantly from high-throughput genomic technologies and their application principally to genomics and medical genetics. This has accelerated our ability to rapidly define, at single-base resolution, the genetic backgrounds against which diseases arise. These technologies are now being used to define the epigenetic landscape of genomes and genetic response elements for transcription factor binding, as well as for cross-species comparisons (1).

This unprecedented capacity to accumulate large amounts of data has exposed academic science to new pressures that are more familiar to the industrial world. Academic research budgets are limited, and the clinical and scientific infrastructure needed to undertake this data-intensive research is expensive. However, because the approaches used guarantee the generation of large quantities of data, and the short-term goals are to provide correlations between these data and clinical outcomes, this sort of investment offers near-guaranteed returns. For example, genome-wide association studies (GWASs) have provided linkages between loci and the risk of developing complex diseases, such as prostate and breast cancer, and have created a platform to propose population screening for genetic backgrounds to target clinical interventions and prioritize clinical budgets. But in many cases, the odds ratios for high-risk genetic variants have proven to be small, and the loci themselves are not in palpable coding regions of the genome. Thus, the herculean effort required to identify and functionally characterize the true disease drivers within these large data sets of genetic variations means that bridging the data with intricately targeted therapies remains an expensive and time-consuming proposition.


Despite the hurdles associated with the integration of genomics research and clinical medicine, the rapid pace of technology development ensures that increasingly detailed descriptions of genomic complexity can be developed, although not necessarily coherently interpreted. However, even if the vast amounts of data do not eventually spur the discovery of new drugs or yield detailed mechanistic insights into a disease process, there is still the possibility of developing genetic tests that will allow patient populations to be stratified on the basis of clinical risk.

The realization that the ability to sequence genomes could provide rapid genetic linkage to disease led to the founding of a number of medical genomics companies at the turn of the millennium, of which deCODE is perhaps the most famous example (2). On a wave of euphoria that followed the sequencing of the human genome, it was possible to secure private funds to establish companies that promised new disease diagnostic tests and therapies based on genetic screens (2, 3). deCODE developed an extensive biobank—a collection of biological materials and associated data—in Iceland and published pivotal papers in Nature Genetics in which new associations were made between genetic loci and a range of diseases. The capital then dried up, and debts mounted, resulting in deCODE filing for bankruptcy (Fig. 1) (4). The company’s human genetics work, however, continues through its Iceland-based subsidiary Islensk Erfdagreining, which owns neither the samples nor the data (5).

Fig. 1. The need for cross-sector collaboration to bridge the translational divide.

In this cartoon, the figures on each side of the canyon represent the public and private sectors, or academia and industry, and the tightropes that join the two sides represent genomics technologies and data. Without effective dialogue and risk sharing between the two sectors, progress in translating data sets into clinical applications will be limited, to the detriment of patients.


Clearly, there were many factors behind the demise of deCODE, but one opinion is that investors became frustrated at the long-term commitment required to convert these descriptive studies into effective drugs and meaningful diagnostic tests. In the interim, governments and research agencies throughout the world have begun to view this sort of research as offering guaranteed returns while being conceptually intelligible to the public (6, 7). Consequently, academic research budgets are increasingly deployed to acquire next-generation sequencing facilities, support biobanking, and invest in the computing and bioinformatics infrastructure necessary to collect and interpret the data (8). Many scientists, too, have embraced this genetic sequence–based approach, because it offers guaranteed returns in the form of data, publications, and thus grant money.

But the major challenges of developing such projects are new ones for some scientists. They are not so much the challenges of developing new methods, devising smarter experiments, and iteratively refining research projects to arrive at the right question and an answer to a clearly defined problem. The key bottlenecks in this next-generation sequencing world are ones of organization, infrastructure, and resourcing, which means that a researcher requires skills of persuasion and political savvy. How are funders to be convinced to purchase the expensive equipment required to generate the data? How are hospitals and clinicians to be resourced to collect and archive clinical samples while preserving patient confidentiality? Thus, the dominant forces that dictate whether this sort of research moves forward are clinical and political leaders, and success or failure often depends more on high-level political and networking skills than has traditionally been the case in academic research.


More generally, society has embraced the new technology, and this enthusiasm perhaps reflects a culture of instant gratification. In a scientific context, this state of mind may translate into a desire to generate large data sets quickly, while at the same time authoring translational papers that conclude by stating that the real impact of such work will arise from more complex functional studies to be undertaken by others at a later stage. Society wants solutions quickly, and there is impatience politically, scientifically, and socially to generate a sense of progress while we endeavor to translate genetic knowledge into disease mechanisms and treatments.

This state of affairs, however, raises an interesting quandary, because the career structure for scientists has changed little over the past decades. Progression to research independence is based predominantly on first and last authorships on papers and the ability to secure independent grant funding. But if these are the criteria in a landscape that has become highly dependent on networks and infrastructure and highly susceptible to social and political pressures, then the risk is that our next generation of scientists will be selected and promoted, not because of their vision and courage to test new and risky ideas, but rather because of their ability to work well with the most influential people in the most highly promoted institutions. Creative individuals who attempt high-risk conceptual leaps or tackle research themes that seem opaque now but may reap dividends in the future might be lost from the research community. While the field is still at the stage of assembling large genomic data sets, this fear or prediction perhaps seems unimportant, because the work will progress regardless. The only short-term consequence will be that some talented scientists may not develop independent careers because their political instincts are insufficiently refined.

The real danger, however, lies in the medium to long term. Just as with deCODE, a stage will be reached at which investors, in this case public funding bodies, will expect more than simply another descriptive/correlative paper in a high-impact journal and a large data set. At some stage, the public will expect scientific research to have fostered a group of highly committed and capable researchers who are willing to take the risk of working on the slower, long-term challenges of translating this descriptive science into a truly mechanistic understanding of diseases and more effective diagnostic tests and therapies. Risk-taking clearly requires political, financial, and scientific will as well as public understanding. There is no reason to believe that the public cannot distinguish between committed scientists with vision and politically savvy scientists with limited vision or, indeed, accept that both have a place and value in the system. However, researchers accept that many biomedical questions are complex, and the ability to clearly define the scientific hurdles to addressing tough questions, as well as to market progress or commercial value, will serve the field well in periods in which political and financial goodwill becomes scarce.


With the existing career structures and the political imperatives that are at work, we need to ask ourselves now: How do we ensure the development of creative visionary scientists? Who is funding them? Who is encouraging them? How do we identify them? This will clearly depend somewhat on the wealth and political priorities of individual countries. Chances are that these scientists are currently fighting for a diminishing pool of funding to study in detail, for example, chemotactic responses in slime mold or the cell biology of secretory pathways in yeast. In other words, the sort of research that you cannot easily label as “translational” in the current climate but which helps to develop the commitment and skills necessary to fully explore the functional significance of genes in model organisms or systems. The challenge of functional analysis is of course amplified in disease genetics, when we consider that most risk loci associated with complex diseases are to be found in gene deserts. In a politicized and risk-averse world, it will be necessary to defend research funding for scientists willing to take the risk of focusing on a single small fraction of the genomic data pool and to attempt to explore the phenotypic impact of genetic risk at a molecular level through biochemistry or cell biology.

Some funders have recognized the need to support bravery and vision. The aim of the European Research Council Starting Grants is to support young independent researchers as they bid to tackle risky but scientifically important questions (9). The UK’s Wellcome Trust is about to radically alter its funding model to provide long-term (5- to 7-year) support along similar lines. In the United States, the Howard Hughes Medical Institute’s funding is aimed at a similar goal. The Swedish Foundation for Strategic Research is launching a scheme to support the next generation of research leaders, with the aim of fostering interdisciplinary skills and network development.

These are just some examples of funding schemes with promising aspirations, and they are to be applauded in principle. Whether these funders will achieve their goals is something that will only become apparent with time. If a high proportion of researchers funded through these routes do not achieve every goal defined in their research proposals, but have the freedom to explore intriguing research tangents or only partially progress toward an answer to their original question, then it will be clear that the schemes have succeeded in encouraging researchers to tackle complex and ambitious problems. Unpredictable outcomes are the essence of “blue-skies” research, and this is most readily fostered by public funding through these sorts of schemes in an academic environment. However, if these schemes further support infrastructure-dependent research that yields large data sets by exploiting genomic technologies and clinical sample collections, they will simply be adding to the existing wave and an opportunity will be lost. If publicly funded research can provide an effective functional and mechanistic outlet for the vast quantities of genomics data that have been and are being generated, then it will have built a strong basis for public/private partnerships.


Infrastructure-dependent high-throughput research can be sustained by private finance in an industrial setting, as evidenced by deCODE and others, and has a future, provided that government/public funding nurtures researchers with the long-term support needed to put these data into a mechanistic context. Industry was and still is the main exponent of high-throughput drug screens and assay automation, although academic research institutes are beginning to engage in this on a small scale. Given the high levels of public debt in many developed countries as a result of the recent global financial crisis, it is questionable whether some countries will have the will to support fundamental mechanistic research, and the career tracks for scientists to undertake such research, through long-term government or foundation spending commitments. Yet this model has worked in previous decades and has yielded, for example, vital insights into yeast cell cycle control, which have been translated successfully to the study of tumor biology. It therefore has a place in science that is worth maintaining.

The alternative model is an increased duplication of technology and modus operandi in which academic institutions apply industrial technology-driven approaches on a sub-industrial scale and in an uncoordinated manner. The aim of such research may increasingly mirror that of industry, too, arriving at commercializable end points that politicians believe the public will understand and a revenue stream that is at the least self-renewing. This debate has recently come to the fore in the United Kingdom in part because of the impact of the credit crunch on public finances. Ironically, however, academia may adopt an industrial ethos just as industry becomes more open to the idea of partnership with academia, as it begins to recognize that a different philosophy that embraces risk and complexity can provide new translational avenues if lines of communication and partnerships are fostered. Examples include the European Union–sponsored Innovative Medicines Initiative (10), memoranda of understanding between pharmaceutical companies and academic research organizations, and the recent announcement by GlaxoSmithKline (GSK) that it is making 13,500 antimalarial agents available to the academic research community. This new policy signals a recognition by GSK that industry can accelerate the wider research effort of academia by providing access to reagents and other research starting points that academic institutions could not easily generate themselves. Arguably, the large-scale medical genetic data sets from genomic sequencing and GWASs fall into the same category, being of principal interest as starting points for functional studies and most usefully disseminated through databases and repositories rather than journal articles. This latter point could be further encouraged through the policies of funders and editorial boards. Just as an open-access approach has accelerated the development of software, so the same principle will accelerate the pace of translational research.

Such openness will also ensure that academia does not feel coerced into morphing into a mini-industry but is confident of its place in a complementary research world. This, of course, also depends on minimizing the impact of political debate where it endeavors to impose industrial models on academia solely to tackle an industrial and public debt burden, rather than with the aim of developing the next generation of research leaders and having a lasting impact on health and disease. There is no point in everyone trying to occupy the same limited ground in a race for quick returns. In addition, journals such as Science Translational Medicine must continue to provide encouragement and publishing opportunities for focused research as well as genomic screens. There are no easy solutions to these issues. But given the current economic circumstances in many countries, it is now an opportune time for these issues to be discussed with the aim of achieving a balance that preserves curiosity-driven research and enhances the effectiveness of industrial and academic partnerships.


  • Citation: I. G. Mills and R. B. Sykes, Taking risks with translational research. Sci. Transl. Med. 2, 24cm10 (2010).

References and Notes

  1. The opinions expressed in this commentary article represent personal viewpoints and are not intended to reflect the views or policy of any funding body or industrial enterprise.
