Risks in research: why you have to take them

Apr 25 2012 · Education & Careers

"Risky" is a double-edge sword in science that can cut both ways when it comes to funding agencies. In some contexts, such as the R21 mechanism at NIH and some seed programs, risky with a high up-side is a good thing. In more typical mechanisms, risky can be StockCritique (TM) for "we want to see proof before we give you money". Sometimes that criticism is warranted, other times it's just a bar-raiser for having most of the project done before you submit for funding.

Both NIH and NSF have certainly been criticized for shying away from risky projects, and both have had trouble responding to those claims. Panels and Study Sections can be conservative with limited funds, and unless the POs actively encourage the consideration of potentially transformative (in NSF-speak) work, a prove-it-or-lose-it mentality can take over.

But when was the last time you walked out of a really good talk and thought "that was some super cool data that came from a totally ordinary idea!"? Sure, it can happen, but...

When I started my lab I basically started three projects: one that was a slam dunk in terms of working, one that was a reach but I knew we could do, and a related one that needed a LOT of groundwork to be laid but would be worth it.

That first project we were able to get funded relatively easily (one resubmit), but the other two have taken some time. Part of the reason is that those two have had to be developed from scratch, and that process takes longer (and more start-up $) than you think it will. In retrospect, it may have been wiser to have one more slam dunk project in the hopper earlier on before working on these, but hindsight is always 20/20.

In any case, with project number two the gamble appears to be paying off. We had a lab meeting today where my grad student presented some compelling data that are demonstrating some of the things we had hoped to see. We have the data and a well-thought-out analysis plan that have borne some very exciting findings already, with more undoubtedly behind them. The project is evolving from something that was risky into something that will be a slam dunk for funding* once we wade through the remaining data. Nearly four years since starting the lab, with many adjustments along the way, we're seeing the project come together the way we hoped it would.

Yes, this could have blown up in our faces and we had several plan Bs ready to get something out of the data if it did, but if things keep going the way they are** this will be well worth the loss of sleep and gray hairs cultivated over the data and finances of the project.

Four years may seem like an eternity when trying to get the lab rolling (and it has), but seeing the fruit of that labor, the incredible collaborations that have come out of it, and the development of the students who have made it happen is all pretty sweet.

*Inasmuch as anything is....

**cue lab disaster that smites me for hubris

4 responses so far

  • gerty-z says:

    woot! yay PLS lab.

  • Darwin says:

    How do you think the risk-taking applies to the graduate students involved? Is it better to just work on a safe project or take a chance with less developed ideas?

  • proflikesubstance says:

    I would never put a grad student on something that is all or nothing. If a postdoc wants to come in and include a project like that in their stable, fine. However, I try hard to plan our projects so that there is high upside but so that there will be something for the effort in the end. That's not too hard with what we do.

  • All experimental research has risk if it is truly original and not merely confirmatory. The decision to carry out a specific experiment or line of enquiry is dictated by the rewards versus the costs. The costs include time, money, and the lost opportunity to pursue more productive directions. Safer research may be initially less expensive, but it usually does not yield much further insight.

    Granting agencies generally support low-risk, hypothesis-driven research, because this is often believed to lead to better-designed experiments that produce clearer answers to the questions posed. However, I doubt that this is generally true. Even today, our knowledge of the true complexity of biological systems is extremely rudimentary and limited, and most of the hypotheses that are advanced are very simplistic. The formulation of more powerful hypotheses is best driven by access to large amounts of data as well as by human creativity in linking diverse observations.

    Most scientific breakthroughs arise from the development of better tools, which are largely products of innovation, and their application in new directions. The study of systems biology with equipment and reagents that facilitate "omics" research has yielded immense amounts of data. Analysis of these data has been assisted by advances in computing, but unfortunately most biomedical scientists still focus their attention on just a few proteins in their research. Such thinking is reinforced by how most research is selected for funding by granting agencies.

    Unfortunately, grant support for "omics" research, when it does happen, usually tends to be restricted to mega-projects that are highly onerous to apply for and even worse to administer. Such mega-projects are extremely expensive, usually lack follow-up and longer-term objectives, and are thus poorly translated into practical applications such as the development of new diagnostics and therapeutics. Because of the large sums of money involved, these projects often receive a much higher level of management and accountability aimed at ensuring that specific goal-oriented milestones are met. Similar to what transpires with industry-based research, interesting leads are often ignored and either discarded or buried.

    Scientific discovery would be more fruitful if the powerful new tools of "omics" research were made more easily and widely available to greater numbers of researchers. Rather than reserving these resources for a select few, government-funded core facilities could be established to subsidize this kind of research so that a wider range of investigators could undertake it on a contract basis. After a brief hold period, the resulting data sets should be deposited in open-access databases that are more user-friendly to query. More resources should also be devoted to the development of synthetic intelligence systems (better known as artificial intelligence) to assist with specific queries. Finally, granting agencies should support a larger number of researchers, with less funding each if necessary, to get more human brainpower focused on the biomedical problems that confront our societies.
